Implementation of Web Processing Services (WPS) over IPSL Earth System Grid Federation (ESGF) node
NASA Astrophysics Data System (ADS)
Kadygrov, Nikolay; Denvil, Sebastien; Carenton, Nicolas; Levavasseur, Guillaume; Hempelmann, Nils; Ehbrecht, Carsten
2016-04-01
The Earth System Grid Federation (ESGF) aims to provide the international climate community with access to climate data. ESGF is a system of distributed and federated nodes that dynamically interact with each other. ESGF users can search and download climate data, geographically distributed around the world, through a common web interface and a standardized API. With the continuous development of climate models and the beginning of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), the amount of data available from ESGF will increase continuously over the next 5 years. IPSL holds replicas of the output of various global and regional climate models, along with observations and reanalysis data (CMIP5, CORDEX, obs4MIPs, etc.), which are available on the IPSL ESGF node. To let scientists analyze the models without downloading vast amounts of data, Web Processing Services (WPS) were installed on the IPSL compute node. This work is part of the CONVERGENCE project, funded by the French National Research Agency (ANR). We use PyWPS, an implementation of the Web Processing Service standard of the Open Geospatial Consortium (OGC), within the framework of the birdhouse software. Processes can be run remotely by users through a web-based WPS client or a command-line tool. All calculations are performed on the server side, close to the data. If the required models/observations are not available at IPSL, they are downloaded from the ESGF network and cached by the WPS process using the synda tool. The outputs of the WPS processes are available for download as plots, tar archives or NetCDF files.
We present the architecture of the WPS at IPSL, along with processes for evaluating model performance, on-site diagnostics and post-analysis processing of model output, e.g.:
- regridding/interpolation/aggregation
- ocgis (OpenClimateGIS) based polygon subsetting of the data
- average seasonal cycle, multimodel mean, multimodel mean bias
- calculation of climate indices with the icclim library (CERFACS)
- atmospheric modes of variability
In order to evaluate the performance of any new model once it becomes available in ESGF, we implement a WPS with several model diagnostics and performance metrics calculated using ESMValTool (Eyring et al., GMDD 2015). As a further step, we are developing new WPS processes and core functions to be implemented at the IPSL ESGF compute node, following the needs of the scientific community.
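As a rough illustration of how such server-side processes are invoked, a WPS 1.0.0 process is commonly called with an HTTP Execute request in key-value-pair (KVP) form. The sketch below builds such a request URL with the Python standard library; the endpoint, process identifier and input names are hypothetical placeholders, not the actual IPSL service.

```python
from urllib.parse import urlencode

def build_wps_execute_url(endpoint, identifier, inputs):
    """Build a WPS 1.0.0 Execute request URL in KVP form.

    `inputs` maps input name -> value; WPS KVP encodes them as a single
    DataInputs parameter of semicolon-separated name=value pairs
    (percent-encoded here by urlencode, which WPS servers accept).
    """
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "DataInputs": data_inputs,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical climate-index process and inputs, for illustration only.
url = build_wps_execute_url(
    "http://example.org/wps",
    "climate_index_su",
    {"dataset": "tasmax_EUR-11.nc", "index": "SU"},
)
```

A GetCapabilities or DescribeProcess request follows the same pattern with a different `request` parameter and no DataInputs.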
NASA Astrophysics Data System (ADS)
Čepický, Jáchym; Moreira de Sousa, Luís
2016-06-01
The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes, client discovery of processes, and binding to those processes in workflows. Data required by a WPS can be delivered across a network or be available at the server. PyWPS was one of the first server-side implementations of OGC WPS. It is written in the Python programming language and aims to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
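On the encoding point: a common way to make binary raster content text-safe inside an XML Execute request's ComplexData element is base64. A minimal round-trip sketch with the standard library (the raster bytes here are dummy data, not a real GeoTIFF):

```python
import base64

def encode_raster(raster_bytes: bytes) -> str:
    # Base64-encode binary raster content so it can be embedded
    # as text inside an XML Execute request.
    return base64.b64encode(raster_bytes).decode("ascii")

def decode_raster(encoded: str) -> bytes:
    # Server-side decoding back to the original bytes before
    # format validation and processing.
    return base64.b64decode(encoded)

payload = encode_raster(b"GTiff\x00\x01dummy")
```

Base64 inflates the payload by about a third, which is one reason large rasters are better passed by reference (a URL the server fetches) than by value.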
Enriching the Web Processing Service
NASA Astrophysics Data System (ADS)
Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer
2014-05-01
The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it specifies the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A wide range of ready-to-use, fine-grained, fundamental geospatial processes has been developed by the GIS community in the past. However, modern use cases and whole workflow processes demand specifications for lifecycle management and service orchestration. Orchestrating smaller sub-processes is a task towards interoperability; comprehensive documentation using appropriate metadata is also required. Though different approaches have been tested in the past, developing complex WPS applications still requires programming skills, knowledge of the software libraries in use and considerable integration effort. Our toolset RichWPS aims at providing a better overall experience by introducing two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed there for production use. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim is to generate reproducible results; the workflow should be documented by metadata that can be integrated into Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance, and publishing them as common processes.
The server is therefore oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes via a WPS-T interface. To deal with the results of these processing workflows, a server-side extension enables the RichWPS Server and its clients to use WPS presentation directives (WPS-PD), a content-related enhancement of the standardized WPS schema. We identified the essential requirements for the components of our toolset by applying two use cases. The first enables the simplified comparison of modeled and measured data, a common task in hydro-engineering to validate the accuracy of a model. An implementation of the workflow includes reading, harmonizing and comparing two datasets in NetCDF format. 2D water-level data from the German Bight can be chosen, presented and evaluated in a web client with interactive plots. The second use case is motivated by the Marine Strategy Directive (MSD) of the EU, which demands monitoring, action plans and an evaluation of the ecological situation in the marine environment. Information technologies adapted to those of INSPIRE should be used. One of the parameters monitored and evaluated for the MSD is the extent and quality of seagrass fields. With a view towards other evaluation parameters, we decompose the complex process of seagrass evaluation into reusable process steps and implement those packages as configurable WPS.
Sharing environmental models: An Approach using GitHub repositories and Web Processing Services
NASA Astrophysics Data System (ADS)
Stasch, Christoph; Nuest, Daniel; Pross, Benjamin
2016-04-01
The GLUES (Global Assessment of Land Use Dynamics, Greenhouse Gas Emissions and Ecosystem Services) project established a spatial data infrastructure for scientific geospatial data and metadata (http://geoportal-glues.ufz.de), where different regional collaborative projects researching the impacts of climate and socio-economic changes on sustainable land management can share their underlying base scenarios and datasets. One goal of the project is to ease the sharing of computational models between institutions and to make them easily executable in web-based infrastructures. In this work, we present such an approach for sharing computational models, relying on GitHub repositories (http://github.com) and Web Processing Services. First, model providers upload their model implementations to GitHub repositories in order to share them with others. The GitHub platform allows users to submit changes to the model code; the changes can be discussed and reviewed before merging. However, while GitHub allows sharing of and collaboration on model source code, it does not actually allow running these models, which requires effort to transfer the implementation to a model execution framework. We have thus extended an existing implementation of the OGC Web Processing Service standard (http://www.opengeospatial.org/standards/wps), the 52°North Web Processing Service (http://52north.org/wps) platform, to retrieve model implementations from a git (http://git-scm.com) repository and add them to the collection of published geoprocesses. The current implementation is restricted to models implemented as R scripts using WPS4R annotations (Hinz et al.) and to Java algorithms using the 52°North WPS Java API. The models hence become executable through a standardized web API by multiple clients, such as desktop or browser GIS and modelling frameworks.
If the model code is changed on the GitHub platform, the changes are retrieved by the service and the processes are updated accordingly. The admin tool of the 52°North WPS was extended to support automated retrieval and deployment of computational models from GitHub repositories. Once the R code is available in the GitHub repository, the contained process can be deployed and executed by simply defining the GitHub repository URL in the WPS admin tool. We illustrate the usage of the approach by sharing and running a model for land use system archetypes developed by the Helmholtz Centre for Environmental Research (UFZ, see Vaclavik et al.). The original R code was extended and published in the 52°North WPS using both public and non-public datasets (Nüst et al., see also https://github.com/52North/glues-wps). Hosting the analysis in a git repository now allows WPS administrators, client developers, and modelers to easily work together on new versions or completely new web processes using the powerful GitHub collaboration platform. References: Hinz, M. et al. (2013): Spatial Statistics on the Geospatial Web. In: The 16th AGILE International Conference on Geographic Information Science, Short Papers. http://www.agile-online.org/Conference_Paper/CDs/agile_2013/Short_Papers/SP_S3.1_Hinz.pdf Nüst, D. et al. (2015): Open and reproducible global land use classification. In: EGU General Assembly Conference Abstracts, Vol. 17. European Geophysical Union, 2015, p. 9125. http://meetingorganizer.copernicus.org/EGU2015/EGU2015-9125.pdf Vaclavik, T. et al. (2013): Mapping global land system archetypes. Global Environmental Change 23(6): 1637-1647. Online available: October 9, 2013. DOI: 10.1016/j.gloenvcha.2013.09.004
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.
2013-09-01
Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Due to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides efficient, flexible and reliable implementations for handling different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing in SOA platforms to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow of acquiring and processing remotely sensed data, detecting fire and sending notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are presented, using web-based processing of remote sensing imagery from MODIS. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as an engine of composition. The applicability of the proposed architecture is evaluated using a real-world fire event detection and notification use case. A GeoPortal client based on open-source software was developed to manage data, metadata, processes, and authorities. An investigation of the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.
Pragmatic service development and customisation with the CEDA OGC Web Services framework
NASA Astrophysics Data System (ADS)
Pascoe, Stephen; Stephens, Ag; Lowe, Dominic
2010-05-01
The CEDA OGC Web Services framework (COWS) emphasises rapid service development by providing a lightweight layer of OGC web service logic on top of Pylons, a mature web application framework for the Python language. This approach gives developers a flexible web service development environment without compromising access to the full range of web application tools and patterns: Model-View-Controller paradigm, XML templating, Object-Relational-Mapper integration and authentication/authorization. We have found this approach useful for exploring evolving standards and implementing protocol extensions to meet the requirements of operational deployments. This paper outlines how COWS is being used to implement customised WMS, WCS, WFS and WPS services in a variety of web applications from experimental prototypes to load-balanced cluster deployments serving 10-100 simultaneous users. In particular we will cover 1) The use of Climate Science Modeling Language (CSML) in complex-feature aware WMS, WCS and WFS services, 2) Extending WMS to support applications with features specific to earth system science and 3) A cluster-enabled Web Processing Service (WPS) supporting asynchronous data processing. The COWS WPS underpins all backend services in the UK Climate Projections User Interface where users can extract, plot and further process outputs from a multi-dimensional probabilistic climate model dataset. The COWS WPS supports cluster job execution, result caching, execution time estimation and user management. The COWS WMS and WCS components drive the project-specific NCEO and QESDI portals developed by the British Atmospheric Data Centre. These portals use CSML as a backend description format and implement features such as multiple WMS layer dimensions and climatology axes that are beyond the scope of general purpose GIS tools and yet vital for atmospheric science applications.
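Asynchronous WPS execution of the kind the COWS WPS supports generally follows an accept-then-poll pattern: the server immediately returns a status document with a status location, which the client polls until the job reaches a terminal state. A generic sketch of the client-side loop, with a stubbed status function standing in for the real HTTP requests (the status strings follow the WPS 1.0.0 response schema; the URL is a placeholder):

```python
import time

# Stubbed status sequence standing in for polling a real WPS
# statusLocation URL; a real client would issue HTTP GETs and
# parse the ExecuteResponse XML instead.
_STATUSES = iter(["ProcessAccepted", "ProcessStarted", "ProcessSucceeded"])

def fetch_status(status_url: str) -> str:
    return next(_STATUSES)

def wait_for_result(status_url: str, poll_interval: float = 0.01) -> str:
    """Poll until the asynchronous WPS job reaches a terminal state."""
    while True:
        status = fetch_status(status_url)
        if status in ("ProcessSucceeded", "ProcessFailed"):
            return status
        time.sleep(poll_interval)

final = wait_for_result("http://example.org/wps/status/1234")
```

In a production client the poll interval would typically back off, and the server-supplied execution time estimate (as offered by the COWS WPS) can seed the first wait.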
NASA Astrophysics Data System (ADS)
Hempelmann, Nils; Ehbrecht, Carsten; Alvarez-Castro, Carmen; Brockmann, Patrick; Falk, Wolfgang; Hoffmann, Jörg; Kindermann, Stephan; Koziol, Ben; Nangini, Cathy; Radanovics, Sabine; Vautard, Robert; Yiou, Pascal
2018-01-01
Analyses of extreme weather events and their impacts often require big data processing of ensembles of climate model simulations. Researchers generally proceed by downloading the data from the providers and processing the data files "at home" with their own analysis processes. However, the growing amount of available climate model and observation data makes this procedure quite awkward. In addition, data processing knowledge is kept local, instead of being consolidated into a common resource of reusable code. These drawbacks can be mitigated by using a web processing service (WPS). A WPS hosts services such as data analysis processes that are accessible over the web, and can be installed close to the data archives. We developed a WPS named 'flyingpigeon' that communicates over an HTTP network protocol based on standards defined by the Open Geospatial Consortium (OGC), to be used by climatologists and impact modelers as a tool for analyzing large datasets remotely. Here, we present the current processes we developed in flyingpigeon relating to commonly used operations (preprocessing steps, spatial subsets at continent, country or region level, and climate indices) as well as methods for specific climate data analysis (weather regimes, analogues of circulation, segetal flora distribution, and species distribution models). We also developed a novel, browser-based interactive data visualization for circulation analogues, illustrating the flexibility of WPS in designing custom outputs. Bringing the software to the data instead of transferring the data to the code is becoming increasingly necessary, especially with the upcoming massive climate datasets.
Web Service for Positional Quality Assessment: The WPS Tier
NASA Astrophysics Data System (ADS)
Xavier, E. M. A.; Ariza-López, F. J.; Ureña-Cámara, M. A.
2015-08-01
In the field of spatial data, more and more information becomes available every day, but we still have little or very little information about the quality of that spatial data. We consider the automation of spatial data quality assessment to be a true need for the geomatics sector, and that automation is possible by means of web processing services (WPS) and the application of specific assessment procedures. In this paper we propose and develop a WPS tier centered on the automation of positional quality assessment. An experiment using the NSSDA positional accuracy method is presented. The experiment involves the uploading by the client of two datasets (reference and evaluation data). The processing determines homologous pairs of points (by distance) and calculates the positional accuracy value under the NSSDA standard. The process generates a small report that is sent to the client. From our experiment, we draw some conclusions on the advantages and disadvantages of WPSs when applied to the automation of spatial data accuracy assessments.
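Once homologous point pairs are matched, the NSSDA statistic itself is a short computation: horizontal accuracy at the 95% confidence level is 1.7308 × RMSEr, where RMSEr is the radial root-mean-square error and the 1.7308 factor assumes roughly equal error in x and y (the circular case defined by the FGDC standard). A self-contained sketch with made-up coordinates, not the paper's data:

```python
import math

def nssda_horizontal_accuracy(pairs):
    """NSSDA horizontal accuracy at the 95% confidence level.

    `pairs` is a list of ((x_test, y_test), (x_ref, y_ref)) homologous
    point pairs. RMSEr is the radial RMSE over all pairs; the 1.7308
    factor assumes RMSEx ~= RMSEy.
    """
    sq = [(xt - xr) ** 2 + (yt - yr) ** 2
          for (xt, yt), (xr, yr) in pairs]
    rmse_r = math.sqrt(sum(sq) / len(sq))
    return 1.7308 * rmse_r

# Made-up pairs where every test point is offset by (3, 4) -> RMSEr = 5.
pairs = [((103.0, 204.0), (100.0, 200.0)),
         ((53.0, 14.0), (50.0, 10.0))]
accuracy = nssda_horizontal_accuracy(pairs)
```

Wrapped in a WPS process, this function is exactly the kind of small, well-defined computation the paper's automation argument targets.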
Leveraging Open Standard Interfaces in Accessing and Processing NASA Data Model Outputs
NASA Astrophysics Data System (ADS)
Falke, S. R.; Alameh, N. S.; Hoijarvi, K.; de La Beaujardiere, J.; Bambacus, M. J.
2006-12-01
An objective of NASA's Earth Science Division is to develop advanced information technologies for processing, archiving, accessing, visualizing, and communicating Earth Science data. To this end, NASA and other federal agencies have collaborated with the Open Geospatial Consortium (OGC) to research, develop, and test interoperability specifications within projects and testbeds benefiting the government, industry, and the public. This paper summarizes the results of a recent effort under the auspices of the OGC Web Services testbed phase 4 (OWS-4) to explore standardization approaches for accessing and processing the outputs of NASA models of physical phenomena. Within the OWS-4 context, experiments were designed to leverage the emerging OGC Web Processing Service (WPS) and Web Coverage Service (WCS) specifications to access, filter and manipulate the outputs of the NASA Goddard Earth Observing System (GEOS) and Goddard Chemistry Aerosol Radiation and Transport (GOCART) forecast models. In OWS-4, the intent is to give users more control over the subsets of data they can extract from the model results, as well as over the final portrayal of that data. To meet that goal, experiments were designed to test the suitability of OGC's Web Processing Service (WPS) and Web Coverage Service (WCS) for filtering, processing and portraying the model results (including slices by height or by time), and to identify any enhancements to the specifications needed to meet the desired objectives. This paper summarizes the findings of the experiments, highlighting the value of the Web Processing Service in providing standard interfaces for accessing and manipulating model data within spatial and temporal frameworks. The paper also points out key shortcomings of the WPS, especially in comparison with a SOAP/WSDL approach to solving the same problem.
WPS mediation: An approach to process geospatial data on different computing backends
NASA Astrophysics Data System (ADS)
Giuliani, Gregory; Nativi, Stefano; Lehmann, Anthony; Ray, Nicolas
2012-10-01
The OGC Web Processing Service (WPS) specification allows information to be generated by processing distributed geospatial data made available through Spatial Data Infrastructures (SDIs). However, current SDIs have limited analytical capacities, and various problems emerge when trying to use them in data- and computing-intensive domains such as the environmental sciences. These problems are usually not solvable, or only partially solvable, using a single computing resource. Therefore, the Geographic Information (GI) community is trying to benefit from the superior storage and computing capabilities offered by distributed computing (e.g., Grids, Clouds) methods and technologies. Currently, there is no commonly agreed approach to grid-enabling WPS. No implementation allows one to seamlessly execute a geoprocessing calculation, following user requirements, on different computing backends ranging from a stand-alone GIS server up to computer clusters and large Grid infrastructures. Considering this issue, this paper presents a proof of concept that mediates different geospatial and Grid software packages and proposes an extension of the WPS specification through two optional parameters. The applicability of this approach is demonstrated using a Normalized Difference Vegetation Index (NDVI) mediated WPS process, highlighting benefits and issues that need to be further investigated to improve performance.
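The NDVI computation at the core of the demonstration process is itself simple, which is what makes it a good test payload for different backends: the same per-pixel arithmetic, (NIR - Red) / (NIR + Red), runs unchanged whether the backend is a single server or a cluster. A sketch with NumPy on synthetic reflectance values, not the paper's data:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Where NIR + Red is zero the index is undefined; return 0 there
    instead of dividing by zero.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    safe = np.where(denom == 0, 1.0, denom)
    return np.where(denom == 0, 0.0, (nir - red) / safe)

# Synthetic 2x2 reflectance bands.
nir = np.array([[0.5, 0.4], [0.6, 0.0]])
red = np.array([[0.1, 0.4], [0.2, 0.0]])
result = ndvi(nir, red)
```

Distributing such a process is then a matter of tiling the input rasters across workers, which is where the paper's mediation layer comes in.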
Data near processing support for climate data analysis
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils
2016-04-01
Climate data repositories grow exponentially in size. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach: a modular and flexible system supporting the development, deployment and maintenance of OGC-WPS-based web processing services. This approach is organized in an open source GitHub project (called "birdhouse") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations), which rely on common infrastructural components (e.g. installation and deployment recipes, management of analysis code dependencies). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chains of climate data analysis packages, as well as Docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ hosts, on the one hand, a multi-petabyte climate archive which is integrated into the European ENES and worldwide ESGF data infrastructures, and on the other hand an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack-based data cloud services to support data import and data distribution for birdhouse-based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community as well as the climate impact community are highlighted.
We also reflect on aspects of future WPS-based cross-community usage scenarios supporting data reuse and data provenance.
A Web-based Visualization System for Three Dimensional Geological Model using Open GIS
NASA Astrophysics Data System (ADS)
Nemoto, T.; Masumoto, S.; Nonogaki, S.
2017-12-01
A three-dimensional geological model is important information in various fields such as environmental assessment, urban planning, resource development, waste management and disaster mitigation. In this study, we have developed a web-based visualization system for 3D geological models using free and open source software. The system has been successfully implemented by integrating the web mapping engine MapServer and the geographic information system GRASS. MapServer maps horizontal cross sections of the 3D geological model and a topographic map. GRASS provides the core components for management, analysis and image processing of the geological model. Online access to GRASS functions is enabled using PyWPS, an implementation of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard. The system has two main functions. The two-dimensional visualization function allows users to generate horizontal and vertical cross sections of the 3D geological model. These images are delivered via the WMS (Web Map Service) and WPS OGC standards. Horizontal cross sections are overlaid on the topographic map. A vertical cross section is generated by clicking a start point and an end point on the map. The three-dimensional visualization function allows users to visualize geological boundary surfaces and a panel diagram from various angles by mouse operation. WebGL is utilized for 3D visualization; it is a web technology that brings hardware-accelerated 3D graphics to the browser without installing additional software. The geological boundary surfaces can be downloaded to incorporate the geologic structure into CAD designs and models for various simulations. This study was supported by JSPS KAKENHI Grant Number JP16K00158.
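The vertical cross-section function has to sample the geological model along the user-drawn line between the two clicked points. The sketch below shows the generic idea of sampling a gridded layer along such a profile with nearest-neighbour lookup on a toy grid; it is an illustration of the concept, not the system's actual GRASS-based implementation:

```python
def profile_samples(grid, start, end, n):
    """Sample `grid` (a list of rows) at `n` evenly spaced points on the
    straight line from `start` to `end` (both (row, col) tuples),
    using nearest-neighbour lookup."""
    (r0, c0), (r1, c1) = start, end
    samples = []
    for i in range(n):
        t = i / (n - 1)                    # 0.0 .. 1.0 along the profile
        r = round(r0 + t * (r1 - r0))      # nearest grid row
        c = round(c0 + t * (c1 - c0))      # nearest grid column
        samples.append(grid[r][c])
    return samples

# Toy 3x3 "elevation" grid; profile from top-left to bottom-right.
grid = [[1, 2, 3],
        [4, 5, 6],
        [7, 8, 9]]
values = profile_samples(grid, (0, 0), (2, 2), 3)
```

Repeating this for each geological layer, and stacking the sampled boundary depths, yields the vertical cross-section image the WPS returns.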
NASA Astrophysics Data System (ADS)
Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian
2016-04-01
The Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaboration barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted, aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as the Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service onto another computer still encounter problems (e.g., they cannot access information on the model's deployment dependencies). This study presents a strategy for encapsulating geo-analysis models that reduces the problems encountered when sharing models between model providers and model users, and supports these tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied.
Based on the model description information, the proposed strategy also includes methods for encapsulating the model-execution program as model services and for describing model-service deployment information. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
River Basin Standards Interoperability Pilot
NASA Astrophysics Data System (ADS)
Pesquer, Lluís; Masó, Joan; Stasch, Christoph
2016-04-01
There are many water information resources and tools in Europe applicable to river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them use the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps:
- Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice).
- Modelling of floods using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input.
- Evaluation of the applicability of Sensor Notification Services in water emergencies.
- Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS) and with visualization utilities (WMS).
The architecture tests the combination of gauge data in a WPS that is triggered by a meteorological alert. The data is translated into the OGC WaterML 2.0 time series data format and ingested into an SOS 2.0 service. SOS data is visualized in an SOS client that is able to handle time series. The meteorological forecast data, together with WaterML 2.0 time series and terrain data, is input to a flood modelling algorithm (under the supervision of an operator manipulating the WPS user interface). The WPS is able to produce flooding datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and can be downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that will be monitored by an emergency control response service. Acronyms: AS: Alert Service; ES: Event Service; ICT: Information and Communication Technology; NS: Notification Service; OGC: Open Geospatial Consortium; RIBASE: River Basin Standards Interoperability Pilot; SOS: Sensor Observation Service; WaterML: Water Markup Language; WCS: Web Coverage Service; WMS: Web Map Service; WPS: Web Processing Service
NASA Astrophysics Data System (ADS)
Bandibas, J. C.; Takarada, S.
2013-12-01
Timely identification of areas affected by natural disasters is very important for successful rescue and effective emergency relief efforts. This research focuses on the development of a cost-effective and efficient system for identifying areas affected by natural disasters, and on the efficient distribution of the information. The developed system is composed of three modules: the Web Processing Service (WPS), the Web Map Service (WMS) and the user interface provided by J-iView (fig. 1). WPS is an online system that provides computation, storage and data access services. In this study, the WPS module provides online access to the software implementing the developed frequency-based change detection algorithm for the identification of areas affected by natural disasters. It also sends requests to WMS servers to get the remotely sensed data to be used in the computation. WMS is a standard protocol that provides a simple HTTP interface for requesting geo-registered map images from one or more geospatial databases. In this research, the WMS component provides remote access to the satellite images which are used as inputs for land cover change detection. The user interface in this system is provided by J-iView, an online mapping system developed at the Geological Survey of Japan (GSJ). The three modules are seamlessly integrated into a single package using J-iView, which can rapidly generate a map of disaster areas that is instantaneously viewable online. The developed system was tested using ASTER images covering the areas damaged by the March 11, 2011 tsunami in northeastern Japan. The system efficiently generated a map showing areas devastated by the tsunami. Based on the initial results of the study, the developed system proved to be a useful tool for emergency workers to quickly identify areas affected by natural disasters.
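The WMS request described above can be sketched as a GetMap URL. The endpoint and layer name below are made up for illustration; note that WMS 1.3.0 with EPSG:4326 uses latitude-first axis order in BBOX.

```python
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width, height):
    """Build a WMS 1.3.0 GetMap request for a geo-registered image,
    of the kind used here to pull satellite scenes into change detection."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # WMS 1.3.0 + EPSG:4326: axis order is lat,lon (minlat,minlon,maxlat,maxlon)
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = getmap_url("https://example.org/wms", "aster_vnir",
                 (38.0, 140.5, 39.5, 142.0), 1024, 1024)
print(url)
```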
Design & implementation of distributed spatial computing node based on WPS
NASA Astrophysics Data System (ADS)
Liu, Liping; Li, Guoqing; Xie, Jibo
2014-03-01
Current research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper systematically studies the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and the relevant verification work in this environment is completed.
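The node concept can be sketched as a framework-free process registry that mirrors the WPS operation triad (GetCapabilities, DescribeProcess, Execute). This is a toy illustration, not the paper's implementation; the class and process names are invented.

```python
# Toy sketch of a "spatial computing node": processes are registered with a
# description and invoked by identifier, mirroring the WPS operation triad.
class ComputingNode:
    def __init__(self):
        self._processes = {}

    def register(self, identifier, description, func):
        self._processes[identifier] = (description, func)

    def get_capabilities(self):              # ~ WPS GetCapabilities
        return sorted(self._processes)

    def describe_process(self, identifier):  # ~ WPS DescribeProcess
        return self._processes[identifier][0]

    def execute(self, identifier, **inputs): # ~ WPS Execute
        return self._processes[identifier][1](**inputs)

node = ComputingNode()
node.register("buffer", "Add a buffer distance (stand-in for a real spatial op)",
              lambda value, distance: value + distance)
print(node.get_capabilities())                            # ['buffer']
print(node.execute("buffer", value=10.0, distance=2.5))   # 12.5
```

A real node would back `execute` with PyWPS or a comparable WPS server implementation.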
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services into their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge of their territory.
Application of open source standards and technologies in the http://climate4impact.eu/ portal
NASA Astrophysics Data System (ADS)
Plieger, Maarten; Som de Cerff, Wim; Pagé, Christian; Tatarinova, Natalia
2015-04-01
This presentation will demonstrate how to calculate and visualize the climate index SU (number of summer days) on the climate4impact portal. The following topics will be covered during the demonstration: - Security: Login using OpenID for access to the Earth System Grid Federation (ESGF) data nodes. The ESGF works in conjunction with several external websites and systems. The climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service. Single Sign-On (SSO) is used to make these websites and systems work together. - Discovery: Faceted search based on e.g. variable name, model and institute using the ESGF search services. A catalog browser allows for browsing through CMIP5 and any other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA). - Processing using Web Processing Services (WPS): Transform data, subset, export into other formats, and perform climate index calculations using Web Processing Services implemented with PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 ICCLIM. - Visualization using Web Map Services (WMS): Visualize data from ESGF data nodes using ADAGUC Web Map Services. The aim of climate4impact is to enhance the use of climate research data and the interaction with climate effect/impact communities. The portal is based on 21 impact use cases from 5 different European countries, and is evaluated by a user panel consisting of use case owners. It has been developed within the European projects IS-ENES and IS-ENES2 for more than 5 years, and its development currently continues within IS-ENES2 and CLIPC. As the climate impact community is very broad, the focus is mainly on the scientific impact community. This work has resulted in the ENES portal interface for climate impact communities and can be visited at http://climate4impact.eu/ The current main objectives for climate4impact can be summarized as follows.
The first objective is a web interface which automatically generates a graphical user interface for WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/ICCLIM on data stored in ESGF data nodes. Data is then transmitted from ESGF nodes over secured OPeNDAP and becomes available in a new, per-user, secured OPeNDAP server. The results can then be visualized again using ADAGUC WMS. Dedicated wizards for the processing of climate indices will be developed in close collaboration with users. The second objective is to expose climate4impact services, so as to offer standardized services which can be used by other portals. This has the advantage of adding interoperability between several portals, and of enabling the design of specific portals aimed at different impact communities, whether thematic or national.
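The SU index mentioned above has a simple definition: the number of days in a period whose daily maximum temperature exceeds 25 °C (the ECA&D definition implemented by ICCLIM). On the portal the computation runs through ICCLIM on NetCDF files; the sketch below shows the same counting rule in plain Python for a single cell's time series, with invented temperature values.

```python
# SU (summer days): count of days with daily maximum temperature > 25 degrees C.
def summer_days(tasmax_celsius, threshold=25.0):
    """tasmax_celsius: one grid cell's daily-maximum temperature series."""
    return sum(1 for t in tasmax_celsius if t > threshold)

week = [22.1, 26.4, 27.0, 24.9, 31.2, 25.0, 28.3]  # toy data; 25.0 is not > 25
print(summer_days(week))  # 4
```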
An automated and integrated framework for dust storm detection based on ogc web processing services
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.
2014-11-01
Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climate modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advancing scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detecting and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detecting and tracking component combines three Earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating the dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, are represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 April 2012, was used to test the applicability of the proposed WPS framework.
Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and a scientific workflow approach together. The experimental results show that this automated and integrated framework can be used to give advance near-real-time warning of dust storms, for both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
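The orchestration idea, models chained so that each step feeds the next, can be sketched without BPEL as a sequential pipeline over a shared context. The step bodies below are stubs with invented values, not the real SBDART/WRF/HYSPLIT interfaces.

```python
# Conceptual sketch of the service chain: each model is wrapped as a step that
# enriches an accumulating context dict, echoing what the BPEL engine does
# with the underlying WPS calls.
def sbdart_step(ctx):
    ctx["aot"] = [0.8, 1.2, 0.5]          # placeholder aerosol optical depths
    return ctx

def wrf_step(ctx):
    ctx["wind_ms"] = 14.0                  # placeholder surface wind speed
    return ctx

def hysplit_step(ctx):
    # Flag a dust event when thick aerosol coincides with strong wind
    # (an invented decision rule, purely for the sketch).
    ctx["dust_event"] = max(ctx["aot"]) > 1.0 and ctx["wind_ms"] > 10.0
    return ctx

def run_chain(ctx, steps):
    for step in steps:
        ctx = step(ctx)
    return ctx

result = run_chain({}, [sbdart_step, wrf_step, hysplit_step])
print(result["dust_event"])  # True for these placeholder values
```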
NASA Astrophysics Data System (ADS)
Mihajlovski, A.; Spinuso, A.; Plieger, M.; Som de Cerff, W.
2016-12-01
Modern climate analysis platforms provide generic and standardized ways of accessing data and processing services, typically supported by a wide range of OGC formats and interfaces. However, the problem of instrumentally tracing the lineage of the transformations occurring on a dataset and its provenance remains an open challenge. It requires standards-driven and interoperable solutions to facilitate understanding and the sharing of self-describing data products, fostering collaboration among peers. The CLIPC portal provided us with a real use case where the need for instrumented provenance management is fundamental. CLIPC provides a single point of access for scientific information on climate change. The data about the physical environment used to inform climate change policy and adaptation measures come from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses. This is made possible through the Copernicus Earth Observation Programme for Europe. With a backbone combining WPS and OPeNDAP services, CLIPC has two themes: 1. Harmonized access to climate datasets derived from models, observations and re-analyses 2. A climate impact tool kit to evaluate, rank and aggregate indicators. The climate impact tool kit is realised with the orchestration of a number of WPS that ingest, normalize and combine NetCDF files. The WPS allowing this specific computation are hosted by the climate4impact portal, which is a more generic climate data access and processing service. In this context, guaranteeing the validation and reproducibility of results is a clearly stated requirement to improve the quality of the results obtained by the combined analysis. The two core contributions made are the enabling of a provenance wrapper around WPS services and the enabling of provenance tracing within the NetCDF format, adopting and extending the W3C PROV model.
To disseminate indicator data and create transformed data products, a standardized provenance, metadata and processing infrastructure is being researched for CLIPC. These efforts will lead towards the provision of tools for further web service processing development and optimisation, opening up possibilities to scale and to administer abstract user- and data-driven workflows.
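A provenance record of the kind the wrapper produces can be sketched as a PROV-JSON fragment built with the standard library. The structure follows the W3C PROV-JSON serialization (entities, activities, `used`, `wasGeneratedBy`); the identifiers and `prov:type` values are invented for the sketch, not CLIPC's actual vocabulary.

```python
import json

# Hand-rolled PROV-JSON fragment recording one WPS execution.
def wps_provenance(process_id, input_uri, output_uri):
    return {
        "entity": {
            input_uri: {"prov:type": "clipc:NetCDFInput"},      # invented type
            output_uri: {"prov:type": "clipc:NetCDFIndicator"}, # invented type
        },
        "activity": {
            process_id: {"prov:type": "wps:Execute"},
        },
        "used": {
            "_:u1": {"prov:activity": process_id, "prov:entity": input_uri},
        },
        "wasGeneratedBy": {
            "_:g1": {"prov:activity": process_id, "prov:entity": output_uri},
        },
    }

doc = wps_provenance("wps:normalize-indicator",
                     "data:tasmax_raw.nc", "data:tasmax_norm.nc")
print(json.dumps(doc, indent=2)[:60])  # serializes cleanly as PROV-JSON
```

Embedding such a record in a NetCDF global attribute is one way to carry lineage inside the file itself, as the abstract describes.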
Proposal for a Web Encoding Service (WES) for Spatial Data Transaction
NASA Astrophysics Data System (ADS)
Siew, C. B.; Peters, S.; Rahman, A. A.
2015-10-01
Web services utilization in Spatial Data Infrastructures (SDI) is well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of the City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. possible integration with the Web Processing Service (WPS) or the Web 3D Service (W3DS). The integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.
Towards Semantic Web Services on Large, Multi-Dimensional Coverages
NASA Astrophysics Data System (ADS)
Baumann, P.
2009-04-01
Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open Geospatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known to be advantageous in databases: declarativeness (describe results rather than algorithms), safety in evaluation (no request can keep a server busy indefinitely), and optimizability (enabling the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not allow for semantic interoperability: a function is identified only by its name and parameters, while the semantics is encoded in the (only human-readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning à la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages.
It is abstract in that it does not anticipate any particular protocol. One such protocol is given by the OGC Web Coverage Service (WCS) Processing Extension standard, which ties WCPS into WCS. Another protocol, which makes WCPS an OGC Web Processing Service (WPS) profile, is under preparation. Thereby, WCPS bridges WCS and WPS. The conceptual model of WCPS relies on the coverage model of WCS, which in turn is based on ISO 19123. WCS currently addresses raster-type coverages, where a coverage is seen as a function mapping points from a spatio-temporal extent (its domain) into values of some cell type (its range). A retrievable coverage has an identifier associated with it, as well as the CRSs supported and, for each range field (aka band or channel), the applicable interpolation methods. The WCPS language offers access to one or several such coverages via a functional, side-effect-free language. The following example, which derives the NDVI (Normalized Difference Vegetation Index) from given coverages C1, C2, and C3 within the regions identified by the binary mask R, illustrates the language concept: for c in ( C1, C2, C3 ), r in ( R ) return encode( (char) (c.nir - c.red) / (c.nir + c.red), "HDF-EOS" ) The result is a list of three HDF-EOS encoded images containing masked NDVI values. Note that the same request can operate on coverages of any dimensionality. The expressive power of WCPS includes statistics, image, and signal processing, but stops short of recursion in order to maintain safe evaluation. As both the syntax and the semantics of any WCPS expression are well known, the language is Semantic Web ready: clients can construct WCPS requests on the fly, and servers can optimize such requests (this has been investigated extensively with the rasdaman raster database system) and automatically distribute them for processing in a WCPS-enabled computing cloud. The WCPS Reference Implementation is being finalized now that the standard is stable; it will be released as open source once ready.
Among the future tasks is to extend WCPS to general meshes, in synchronization with the WCS standard. In this talk WCPS is presented in the context of OGC standardization. The author is co-chair of OGC's WCS Working Group (WG) and Coverages WG.
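The per-pixel formula evaluated server-side by the WCPS request above is simply the NDVI ratio; a plain-Python rendering of the same arithmetic, with illustrative reflectance values:

```python
# NDVI per pixel: (NIR - red) / (NIR + red), in [-1, 1].
def ndvi(nir, red):
    return (nir - red) / (nir + red)

# Toy reflectances; dense vegetation typically yields NDVI well above 0.3.
print(round(ndvi(0.45, 0.15), 2))  # 0.5
```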
BPELPower—A BPEL execution engine for geospatial web services
NASA Astrophysics Data System (ADS)
Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi
2012-10-01
The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling the Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. The study showed a standards-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.
Experiencing WPS services in several application domains: opportunities and challenges
NASA Astrophysics Data System (ADS)
Lovergine, Francesco Paolo; Tarantino, Cristina; D'Addabbo, Annarita; Adamo, Patrizia; Satalino, Giuseppe; Refice, Alberto; Blonda, Palma; Vicario, Saverio
2016-04-01
The implementation of OGC web services, and specifically of WPS services, has revealed itself as a key factor in encouraging an attitude of openness among scientific investigators in several application domains. It can benefit scientific research in different regards, promoting interoperability and modularity, together with the possibilities opened by web modelling and exploitation of the workflow paradigm. Nevertheless, it is still a challenging activity, and processing services in particular still seem to be at an early stage of maturity. This work is about exploitation activities conducted within the GEO GEOSS AIP-8 call, focusing on several applications, such as biodiversity, flood monitoring and soil moisture computation, with implementations based on the PyWPS framework for WPS 1.0 as available at the time of this work. We present results, lessons learnt and limits found in using those services for distributing demo processing models, along with pros and cons in our experience. References:
Refice, A., Capolongo, D., Pasquariello, G., D'Addabbo, A., Bovenga, F., Nutricato, R., Lovergine, F.P., Pietranera, L. (2014). SAR and InSAR for flood monitoring: examples with COSMO-SkyMed data. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 7(7), 2711-2722.
Mattia, F., Satalino, G., Balenzano, A., Pauwels, V., De Lathauwer, E. (2011). GMES Sentinel-1 soil moisture algorithm development. Final report for the European Space Agency, ESA ESTEC Contract No. 4000101352/10/NL/MP/ef, 30 Nov. 2011.
Tomaselli, V., Dimopoulos, P., Marangi, C., Kallimanis, A.S., Adamo, M., Tarantino, C., Panitsa, M., Terzi, M., Veronico, G., Lovergine, F., Nagendra, H., Lucas, R., Mairota, P., Mucher, C.A., Blonda, P. (2013). Translating land cover/land use classifications to habitat taxonomies for landscape monitoring: a Mediterranean assessment. Landscape Ecology, February 2013, DOI: 10.1007/s10980-013-9863-3.
Adamo, M., Tarantino, C., Tomaselli, V., Kosmidou, V., Petrou, Z., Manakos, I., Lucas, R.M., Mucher, C.A., Veronico, G., Marangi, C., De Pasquale, V., Blonda, P. (2014). Expert knowledge for translating land cover/use maps to General Habitat Categories (GHCs). Landscape Ecology, April 2014, DOI: 10.1007/s10980-014-0028-9.
Allen, B., Kon, M., Bar-Yam, Y. (2009). A new phylogenetic diversity measure generalizing the Shannon index and its application to phyllostomid bats. Am. Nat. 174, 236-243.
Chao, A., Chiu, C.-H., Jost, L. (2010). Phylogenetic diversity measures based on Hill numbers. Philos. Trans. R. Soc. Lond. B Biol. Sci. 365, 3599-3609.
Decentralized Orchestration of Composite Ogc Web Processing Services in the Cloud
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Cao, J.
2016-09-01
Current web-based GIS or RS applications generally rely on a centralized structure, which has inherent drawbacks such as single points of failure, network congestion and data inconsistency. These disadvantages of traditional GISs need to be addressed for new applications on the Internet or Web. Decentralized orchestration offers performance improvements in terms of increased throughput, better scalability and lower response time. This paper investigates build-time and runtime issues related to the decentralized orchestration of composite geospatial processing services based on the OGC WPS standard specification. A case study of dust storm detection demonstrates the proposed method, and the experimental results indicate that it is effective, producing a high-quality solution at a low communication cost for the geospatial processing service composition problem.
A suite of R packages for web-enabled modeling and analysis of surface waters
NASA Astrophysics Data System (ADS)
Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.
2014-12-01
Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.
EPA has initiated a process to revise certain requirements in the WPS. By the end of FY2018, EPA expects to publish a Notice of Proposed Rulemaking to solicit public input on proposed revisions to the WPS requirements for minimum ages and designated representatives.
ERIC Educational Resources Information Center
Rushinek, Avi; Rushinek, Sara
1984-01-01
Describes results of a system rating study in which users responded to WPS (word processing software) questions. Study objectives were data collection and evaluation of variables; statistical quantification of WPS's contribution (along with other variables) to user satisfaction; design of an expert system to evaluate WPS; and database update and…
NASA Astrophysics Data System (ADS)
Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.
2016-12-01
The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as the Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support the Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve.
An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
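Conceptually, a population estimation query like the PES one sums gridded counts whose cell centres fall inside a user-defined polygon. The sketch below illustrates that idea with a ray-casting point-in-polygon test and toy grid values; it is not the real PES API or the GPWv4 data.

```python
# Toy version of a polygon population query: ray-casting inclusion test
# over invented grid-cell centres carrying population counts.
def point_in_polygon(x, y, poly):
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray's level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def estimate_population(cells, poly):
    """cells: iterable of (x, y, count) grid-cell centres with population."""
    return sum(c for x, y, c in cells if point_in_polygon(x, y, poly))

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
cells = [(1, 1, 120), (3, 3, 80), (5, 1, 999)]  # last cell lies outside
print(estimate_population(cells, square))  # 200
```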
NASA Astrophysics Data System (ADS)
Gauvin St-Denis, B.; Landry, T.; Huard, D. B.; Byrns, D.; Chaumont, D.; Foucher, S.
2017-12-01
As the number of scientific studies and policy decisions requiring tailored climate information continues to increase, the demand for support from climate service centers to provide the latest information in the format most helpful for the end user is also on the rise. Ouranos, one such organization based in Montreal, has partnered with the Centre de recherche informatique de Montréal (CRIM) to develop a platform (PAVICS) that will offer the climate data products identified through years of consultation as most useful for users. The platform is built from modular components that target the various requirements of climate data analysis. The data components host and catalog NetCDF data as well as geographical and political delimitations. The analysis components are made available as atomic operations through Web Processing Services (WPS) or as workflows, whereby the operations are chained through a simple JSON structure and executed on a distributed network of computing resources. The visualization components range from Web Map Services (WMS) to a complete frontend for searching the data, launching workflows and interacting with maps of the results. Each component can easily be deployed and executed as an independent service through the use of Docker technology, and a proxy is available to regulate user workspaces and access permissions. PAVICS includes various components from Birdhouse, a collection of WPS initially developed by the German Climate Computing Centre (DKRZ) and the Institut Pierre Simon Laplace (IPSL), and is designed to be highly interoperable with other WPS as well as many Open Geospatial Consortium (OGC) standards. Further connectivity is made with the Earth System Grid Federation (ESGF) nodes, and local results are made searchable using the same API terminology.
Other projects conducted by CRIM that integrate with PAVICS include the OGC Testbed 13 Innovation Program (IP) initiative, which will enhance advanced cloud capabilities and application packaging and deployment processes, as well as enabling Earth Observation (EO) processes relevant to climate. As part of its experimental agenda, working implementations of scalable machine learning on big climate data with Spark and SciSpark were delivered.
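The "operations chained through a simple JSON structure" mentioned above might look like the sketch below. The schema, endpoint URL and process names are invented for illustration; they are not PAVICS's actual workflow format.

```python
import json

# Hypothetical two-step workflow: subset a dataset, then time-average it,
# with the second task referencing the first task's output.
workflow = {
    "name": "subset-then-average",
    "tasks": [
        {"wps": "https://example.org/wps", "process": "subset_bbox",
         "inputs": {"resource": "esgf:tasmax.nc", "bbox": "-80,40,-60,55"}},
        {"wps": "https://example.org/wps", "process": "time_average",
         "inputs": {"resource": "$task0.output"}},  # invented reference syntax
    ],
}

payload = json.dumps(workflow)        # what a client would POST to the engine
restored = json.loads(payload)
print(restored["tasks"][1]["process"])  # time_average
```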
Web processing service for landslide hazard assessment
NASA Astrophysics Data System (ADS)
Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.
2012-04-01
Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both a WPS and an Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and a Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives of the DEM, the effective precipitation, runoff, lithology and land use. All these parameters can be supplied by the client from other WFS services or by uploading and processing the data on the server. The user can choose to have the first and second derivatives of the DEM created automatically on the server or to upload data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI can be derived from various satellite images or downloaded as a product. Uploading such data (time series) is possible using the NetCDF file format. The model is run at a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step.
Each time a landslide has been positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server together with the log file.
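The monthly Bayesian updating described above, where each time step's posterior becomes the next step's prior, can be sketched in a few lines. This is an illustrative toy, not the authors' implementation; the function names and probability values are invented.

```python
# Illustrative sketch of monthly Bayesian updating in a hazard model:
# P(H|E) = P(E|H) * P(H) / P(E), with each posterior carried forward
# as the a priori probability of the next time step.

def bayes_update(prior, likelihood, evidence_prob):
    """Return the posterior P(H|E) from prior P(H), likelihood P(E|H),
    and marginal evidence probability P(E)."""
    return likelihood * prior / evidence_prob

def run_time_steps(prior, observations):
    """Chain updates over monthly steps and keep a log of
    (prior, posterior) pairs; `observations` is a list of
    (likelihood, evidence_prob) pairs, one per time step."""
    log = []
    for likelihood, evidence_prob in observations:
        posterior = bayes_update(prior, likelihood, evidence_prob)
        log.append((prior, posterior))
        prior = posterior  # posterior becomes next month's a priori value
    return log
```

The log of priors and posteriors per time step mirrors the log file the abstract describes for validation against later landslide occurrences.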
Business logic for geoprocessing of distributed geodata
NASA Astrophysics Data System (ADS)
Kiehle, Christian
2006-12-01
This paper describes the development of a business-logic component for the geoprocessing of distributed geodata. The business logic acts as a mediator between the data and the user, therefore playing a central role in any spatial information system. The component is used in service-oriented architectures to foster the reuse of existing geodata inventories. Based on a geoscientific case study of groundwater vulnerability assessment and mapping, the demands for such architectures are identified with special regard to software engineering tasks. Methods are derived from the field of applied Geosciences (Hydrogeology), Geoinformatics, and Software Engineering. In addition to the development of a business logic component, a forthcoming Open Geospatial Consortium (OGC) specification is introduced: the OGC Web Processing Service (WPS) specification. A sample application is introduced to demonstrate the potential of WPS for future information systems. The sample application Geoservice Groundwater Vulnerability is described in detail to provide insight into the business logic component, and demonstrate how information can be generated out of distributed geodata. This has the potential to significantly accelerate the assessment and mapping of groundwater vulnerability. The presented concept is easily transferable to other geoscientific use cases dealing with distributed data inventories. Potential application fields include web-based geoinformation systems operating on distributed data (e.g. environmental planning systems, cadastral information systems, and others).
Design, Implementation and Applications of 3D Web Services in DB4GeO
NASA Astrophysics Data System (ADS)
Breunig, M.; Kuper, P. V.; Dittrich, A.; Wild, P.; Butwilowski, E.; Al-Doori, M.
2013-09-01
The object-oriented database architecture DB4GeO was originally designed to support sub-surface applications in the geo-sciences. This is reflected in DB4GeO's geometric data model as well as in its import and export functions. Initially, these functions were designed for communication with 3D geological modeling and visualization tools such as GOCAD or MeshLab. However, it soon became clear that DB4GeO was suitable for a much wider range of applications. It is therefore natural to move away from a standalone solution and to open access to DB4GeO data via standardized OGC web services. Though REST and OGC services seem incompatible at first sight, the implementation in DB4GeO shows that an OGC-based implementation of web services may reuse parts of the DB4GeO REST implementation. Starting with initial solutions in the history of DB4GeO, this paper introduces the design, adaptation (i.e. model transformation), and first steps in the implementation of OGC Web Feature Service (WFS) and Web Processing Service (WPS) interfaces to DB4GeO data and operations. Among its capabilities, DB4GeO can provide data in different formats such as GML, GOCAD, or DB3D XML through a WFS, and can run operations such as a 3D-to-2D service or mesh simplification (Progressive Meshes) through a WPS. We then demonstrate an Android-based mobile 3D augmented reality viewer for DB4GeO that uses the Web Feature Service to visualize 3D geo-database query results. Finally, we explore future research work considering DB4GeO in the framework of the research group "Computer-Aided Collaborative Subway Track Planning in Multi-Scale 3D City and Building Models".
Barchitta, M; Fragapane, S; Consoli, M T; Pennisi, C; Agodi, A
2012-01-01
The growing needs of people with disabilities require integrating this issue into public health, in order to improve political feasibility and to ensure that disability is not left off any strategic table. The main aim of the "Care for Work" project was to provide training content to help workers and unemployed people adapt their knowledge, skills and competencies to the care services sector, in order to facilitate their insertion into a new source of employment. The partners participating in the project are organizations from 5 European countries. The project has been divided into seven Work Packages (WPs): three transversal WPs and four specific WPs, each addressing specific activities necessary to achieve the final objectives of the project. The "Care for Work" learning environment contains specific information and training on techniques for caring for people with acquired physical disabilities, as text documents and short training films. The project combines e-learning (Web 2.0) and mobile learning, providing a flexible training platform for workers of the care services sector. The "Care for Work" project offers specific training addressed to meet the new needs of workers of the care services sector and/or unemployed people. All the information and results of the project are available on the web page www.careforwork.eu, and the present article is part of the WP "Valorization".
NASA Astrophysics Data System (ADS)
Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a virtual research environment (VRE) general architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available on the node. It also contains geospatial data processing services (WPS) based on a modular computing backend realizing statistical processing functionality, thus providing analysis of large datasets with visualization of results and export to files of standard formats (XML, binary, etc.). Some cartographical web services have been developed in a prototype of the system to provide capabilities to work with raster and vector geospatial data based on OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern web and GIS technologies.
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander
2017-04-01
For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on modern web and GIS technologies. The paper describes a Web GIS for complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of a dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of the economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, realizing the interaction with the computational core backend and the WMS/WFS/WPS cartographical services, as well as implementing an open API for browser-based client software. This part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part: a Web GIS client developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to that of such popular desktop GIS applications as uDig or Quantum GIS. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development.
According to general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. The specialized Web GIS client contains three basic tiers:
• A tier of NetCDF metadata in JSON format
• A middleware tier of JavaScript objects implementing methods to work with the NetCDF metadata, the XML file of the selected calculation configuration (XML task), and the WMS/WFS/WPS cartographical services
• A graphical user interface tier of JavaScript objects realizing the general application business logic
The Web GIS provides launching of computational processing services to support tasks in the area of environmental monitoring, as well as presentation of calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape), and binary (NetCDF) formats. It has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No 16-19-10257.
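The middleware tier above translates JSON dataset metadata into an XML task describing the selected calculation. A minimal sketch of that translation, using only the Python standard library, might look as follows; the element names, variable names and file name are invented for illustration and are not the project's actual schema.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical NetCDF metadata as it might arrive from the JSON tier.
metadata = json.loads('{"variable": "tas", "dataset": "era_interim.nc"}')

# Build an XML task describing the selected calculation; the element
# names ("task", "dataset", "variable", "operation") are illustrative.
task = ET.Element("task")
ET.SubElement(task, "dataset").text = metadata["dataset"]
ET.SubElement(task, "variable").text = metadata["variable"]
ET.SubElement(task, "operation").text = "timmean"  # assumed operation id

xml_task = ET.tostring(task, encoding="unicode")
```

The resulting `xml_task` string is the kind of payload a client could hand to a WPS for server-side processing.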
NASA Astrophysics Data System (ADS)
Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley
2017-04-01
High resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and useable in ways which allow seamless integration with earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use-case of providing a web data service and supporting a derived product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. 
Present work focuses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for efficiently handling LAS/LAZ-based point workflows, and on native HDF5 libraries for handling point data kept in HDF5-based structures (e.g. NetCDF4, SPDlib [4]). Points stored in database tables (e.g. postgres-pointcloud [5]) will be considered as testing continues. Visualising and exploring massive point datasets in a web browser alongside multiple datasets has been demonstrated by the entwine-3D Tiles project [6]. This is a powerful interface which enables users to investigate and select appropriate data, and is also being investigated as a potential front end to a WPS-based point data service. In this work we show preliminary results for a WPS-based point data access system, in preparation for demonstration at FOSS4G 2017, Boston (http://2017.foss4g.org/).
[1] http://nci.org.au/data-collections/nerdip/
[2] http://www.opengeospatial.org/standards/wps
[3] http://www.pdal.io
[4] http://www.spdlib.org/doku.php
[5] https://github.com/pgpointcloud/pointcloud
[6] http://cesium.entwine.io
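PDAL workflows of the kind mentioned above are typically described as JSON pipelines of reader, filter and writer stages. The sketch below assembles such a pipeline as a plain Python structure; the file names and the WKT polygon are placeholders, while the stage name `filters.crop` follows PDAL's documented conventions.

```python
import json

# A minimal PDAL-style pipeline: read a LAZ file, crop the points to a
# bounding polygon, and write LAS output. PDAL infers reader and writer
# stages from the file extensions.
pipeline = {
    "pipeline": [
        "input.laz",                                      # placeholder input
        {
            "type": "filters.crop",
            "polygon": "POLYGON((0 0, 0 1, 1 1, 1 0, 0 0))",  # placeholder
        },
        "output.las",                                     # placeholder output
    ]
}

pipeline_json = json.dumps(pipeline, indent=2)
```

Such a JSON document is what a WPS process wrapping PDAL could construct server-side from the inputs of an Execute request.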
User-driven Cloud Implementation of environmental models and data for all
NASA Astrophysics Data System (ADS)
Gurney, R. J.; Percy, B. J.; Elkhatib, Y.; Blair, G. S.
2014-12-01
Environmental data and models come from disparate sources over a variety of geographical and temporal scales with different resolutions and data standards, often including terabytes of data and model simulations. Unfortunately, these data and models tend to remain solely within the custody of the private and public organisations which create the data, and the scientists who build models and generate results. Although many models and datasets are theoretically available to others, the lack of ease of access tends to keep them out of reach of many. We have developed an intuitive web-based tool that utilises environmental models and datasets located in a cloud to produce results that are appropriate to the user. Storyboards showing the interfaces and visualisations have been created for each of several exemplars. A library of virtual machine images has been prepared to serve these exemplars. Each virtual machine image has been tailored to run computer models appropriate to the end user. Two approaches have been used: first, RESTful web services conforming to the Open Geospatial Consortium (OGC) Web Processing Service (WPS) interface standard, using the Python-based PyWPS; second, a MySQL database interrogated using PHP code. In all cases, the web client sends the server an HTTP GET request to execute the process with a number of parameter values and, once execution terminates, an XML or JSON response is sent back and parsed at the client side to extract the results. All web services are stateless, i.e. application state is not maintained by the server, reducing its operational overheads and simplifying infrastructure management tasks such as load balancing and failure recovery. A hybrid cloud solution has been used, with models and data sited on both private and public clouds.
The storyboards have been transformed into intuitive web interfaces at the client side using HTML, CSS and JavaScript, utilising plug-ins such as jQuery and Flot (for graphics), and Google Maps APIs. We have demonstrated that a cloud infrastructure can be used to assemble a virtual research environment that, coupled with a user-driven development approach, is able to cater to the needs of a wide range of user groups, from domain experts to concerned members of the general public.
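The HTTP GET execution path described above can be illustrated with a few lines of standard-library Python that assemble a WPS-style key-value-pair request. The host, process identifier and inputs below are made up; only the `service`/`version`/`request` parameter names follow the general shape of OGC WPS 1.0.0 key-value-pair requests.

```python
from urllib.parse import urlencode

# Sketch of the kind of HTTP GET request a web client might send to a
# PyWPS endpoint. Endpoint, process identifier and inputs are invented.
base_url = "http://example.org/wps"
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "runoff_model",  # hypothetical process name
    "datainputs": "start_date=2014-01-01;end_date=2014-12-31",
}
request_url = base_url + "?" + urlencode(params)
# The client would then issue the GET and parse the XML/JSON response.
```

In the architecture described, the server would run the process inside the matching virtual machine image and return an XML or JSON document for client-side parsing.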
A BPMN solution for chaining OGC services to quality assure location-based crowdsourced data
NASA Astrophysics Data System (ADS)
Meek, Sam; Jackson, Mike; Leibovici, Didier G.
2016-02-01
The Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard enables access to a centralized repository of processes and services from compliant clients. A crucial part of the standard is the provision to chain disparate processes and services to form a reusable workflow. To date this has been realized by methods such as embedding XML requests, using Business Process Execution Language (BPEL) engines and other external orchestration engines. Although these allow the user to define tasks and data artifacts as web services, they are often considered inflexible and complicated, frequently due to vendor-specific solutions and inaccessible documentation. This paper introduces a new method of flexible service chaining using the standard Business Process Model and Notation (BPMN). A prototype system has been developed upon an existing open source BPMN suite to illustrate the advantages of the approach. The motivation for the software design is qualification of crowdsourced data for use in policy-making. The software is tested as part of a project that seeks to qualify, assure, and add value to crowdsourced data in a biological monitoring use case.
A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.
2015-12-01
Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high-performance environment close to the massive data stores at NASA. The data is accessed in standard formats (NetCDF, HDF, etc.) in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes:
Staging Manager: runs the computation locally on the WPS server or remotely using tools such as Celery or SLURM.
Compute Engine Manager: runs the computation serially or distributed over nodes using a parallelization framework such as Celery or Spark.
Decomposition Manager: manages strategies for distributing the data over nodes.
Data Manager: handles the import of domain data from long-term storage and manages the in-memory and disk-based caching architectures.
Kernel Manager: a kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in Python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions.
CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be invoked using direct web service calls, a Python script or application, or a JavaScript-based web application. Client packages in Python or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits.
It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and examine variability.
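The kernel-manager idea above, in which each kernel is an encapsulated compute task that any engine can dispatch, can be sketched as a simple registry pattern. The registry, decorator, and kernels below are invented for illustration and are not the CDAS API.

```python
# Toy sketch of a kernel registry: named compute kernels that a compute
# engine can look up and apply to a chunk of decomposed data.
KERNELS = {}

def kernel(name):
    """Decorator registering a callable as a named compute kernel."""
    def register(func):
        KERNELS[name] = func
        return func
    return register

@kernel("average")
def average(values):
    """Mean of a chunk of values (stand-in for a CDAT-backed operation)."""
    return sum(values) / len(values)

@kernel("maximum")
def maximum(values):
    """Maximum of a chunk of values."""
    return max(values)

def execute(name, chunk):
    """Dispatch a chunk of data to the requested kernel by name."""
    return KERNELS[name](chunk)
```

In a real deployment the dispatch would happen per data partition under a parallelization framework, with results recombined by the decomposition layer.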
A Metagenomic Survey of Serpentinites and Nearby Soils in Taiwan
NASA Astrophysics Data System (ADS)
Li, K. Y.; Hsu, Y. W.; Chen, Y. W.; Huang, T. Y.; Shih, Y. J.; Chen, J. S.; Hsu, B. M.
2016-12-01
The serpentinite of Taiwan originates from the subduction zone of the Eurasian plate and the Philippine Sea plate. Many small bodies of serpentinite are scattered around the East Rift Valley, which is also one of the major agricultural areas of Taiwan. Since microbial communities play a role both in the weathering process and in soil recovery, uncovering the microbial compositions in serpentinites and surrounding soils may help us understand the roles of microorganisms on serpentinites during the natural weathering process. In this study, microorganisms growing on the surface of serpentinites, in the surrounding soil, and in agricultural soils miles of horizontal distance away from serpentinite were collected. Next-generation sequencing (NGS) was carried out to examine the metagenomes of the uncultured microbial communities in these samples. The metagenomes were further clustered into operational taxonomic units (OTUs) to analyze relative abundance, heatmaps of OTUs, and principal coordinates analysis (PCoA). Our data revealed that each type of geographic material had its own distinct microbial community structure. In serpentinites, the heatmaps based on the phylogenetic pattern showed that the OTU distributions were similar in the phyla Bacteroidetes, Cyanobacteria, Proteobacteria, Verrucomicrobia, and WPS-1/WPS-2. On the other hand, the heatmaps of the phylogenetic pattern of agricultural soils showed that the OTU distributions in the phyla Chloroflexi, Acidobacteria, Actinobacteria, WPS-1/WPS-2, and Proteobacteria were similar. In soil near the serpentinite, some clusters of OTUs in the phyla Bacteroidetes, Cyanobacteria, and WPS-1/WPS-2 have disappeared. Our data provide evidence regarding the dynamic evolution of microbial communities in different geographic materials.
Extending Climate Analytics-as-a-Service to the Earth System Grid Federation
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.
2015-12-01
We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
NASA Astrophysics Data System (ADS)
Yek, Peter Nai Yuh; Keey Liew, Rock; Shahril Osman, Mohammad; Chung Wong, Chee; Lam, Su Shiung
2017-11-01
Waste palm shell (WPS) is a biomass residue largely available from the palm oil industry. An innovative microwave pyrolysis method was developed to produce biochar from WPS while the pyrolysis gas generated as another product is simultaneously used as an activating agent to transform the biochar into waste palm shell activated carbon (WPSAC), thus allowing carbonization and activation to be performed simultaneously in a single-step approach. The pyrolysis method was investigated over a range of process temperatures and feedstock amounts, with emphasis on the yield and composition of the WPSAC obtained. The WPSAC was tested as a dye adsorbent in removing methylene blue. This pyrolysis approach provided a fast heating rate (37.5°/min) and short process time (20 min) in transforming WPS into WPSAC, recording a product yield of 40 wt%. The WPSAC showed a high BET surface area (≥ 1200 m2/g), low ash content (< 5 wt%), and high pore volume (≥ 0.54 cm3/g), recording a high adsorption efficiency of 440 mg of dye/g. The desirable process features (fast heating rate, short process time) and the recovery of WPSAC suggest the exceptional promise of the single-step microwave pyrolysis approach for producing high-grade WPSAC from WPS.
NASA Astrophysics Data System (ADS)
Pierleoni, Arnaldo; Casagrande, Luca; Bellezza, Michele; Casadei, Stefano
2010-05-01
The need for increasingly complex geospatial algorithms dedicated to the management of water resources, the fact that many of them require specific knowledge, and the need for dedicated computing machines have led to the necessity of centralizing and sharing all the server applications and plugins developed. For this purpose, a Web Processing Service (WPS) has been developed that makes available to users a range of geospatial analysis algorithms, geostatistics and remote sensing procedures, and that can be used simply by providing data and input parameters and downloading the results. The core of the system infrastructure is GRASS GIS, which acts as the computational engine, providing more than 350 forms of analysis and the opportunity to create new, ad hoc procedures. The WPS was implemented using the software PyWPS, written in Python, which is easily manageable and configurable. All these instruments are managed by a daemon named "Arcibald", specifically created to order the queue of requests coming from users. It may happen that processes are already ongoing, in which case the system queues new ones, registering each request and running it only when the previous calculations have been completed. However, each geoprocess has an indicator assessing the resources necessary to execute it, allowing geoprocesses that do not require excessive computing time to run in parallel. This assessment also takes into account the size of the input file provided. The WPS standard defines methods for accessing and running geoprocesses regardless of the client used; nevertheless, a dedicated graphical client was developed for the project to access the resources. The client was built as a plugin for QGIS, which provides the most common tools for viewing and consulting geographically referenced data.
The tool was tested using data taken during the bathymetric campaign at the Montedoglio Reservoir on the Tiber River, in order to generate a digital model of the reservoir bed. Starting from a text file containing the coordinates and depth of the points (previously statistically treated to remove inaccuracies), we used the QGIS plugin to connect to the web service and started the cross-validation process in order to obtain the parameters to be used for interpolation. This makes it possible to highlight the morphological variations of reservoir basins due to silting phenomena, and therefore to estimate the actual capacity of the basin for a proper evaluation of the available water resource. Indeed, this is a critical step for the next phase of management. In this case, since the procedure is very long (on the order of days), the system automatically chooses to send the results via email. Moreover, once the invoked procedures end, the system allows the user to choose whether to share the data and results or to remove all traces of the calculation, because in some cases sensitive data and information are used, and sharing them could violate privacy policies. The entire project is built only with open-source software.
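The queueing behaviour attributed to the "Arcibald" daemon above, where lightweight geoprocesses run immediately and heavy ones wait for previous calculations to finish, can be modelled in a few lines. This is a toy sketch: the cost threshold, class and job representation are invented for illustration.

```python
from collections import deque

# Toy model of a request daemon: cheap jobs start at once (possibly in
# parallel), expensive jobs are queued until running work completes.
LIGHT_COST_THRESHOLD = 10  # assumed resource-indicator cutoff

class Scheduler:
    def __init__(self):
        self.queue = deque()   # registered requests awaiting execution
        self.running = []      # jobs currently executing

    def submit(self, job_name, cost):
        """Run cheap jobs immediately; queue expensive ones."""
        if cost <= LIGHT_COST_THRESHOLD:
            self.running.append(job_name)
        else:
            self.queue.append(job_name)

    def finish_all_running(self):
        """When running jobs complete, start the next queued job."""
        self.running.clear()
        if self.queue:
            self.running.append(self.queue.popleft())
```

A real daemon would also weigh the input file size when computing each job's resource indicator, as the abstract notes.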
Development of a Web-Based Visualization Platform for Climate Research Using Google Earth
NASA Technical Reports Server (NTRS)
Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue
2011-01-01
Recently, it has become easier to access climate data from satellites, ground measurements, and models from various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform to acquire distributed and heterogeneous scientific data and to render processed images from a single access point for climate studies. This paper documents the design and implementation of a web-based visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations from a number of data sources, using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various open-source data sharing technologies, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The visualization capability of integrating various measurements into GE dramatically extends the awareness and visibility of scientific results. Using the embedded geographic information in GE, the designed system improves our understanding of the relationships of different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.
PAVICS: A Platform for the Analysis and Visualization of Climate Science
NASA Astrophysics Data System (ADS)
Gauvin St-Denis, B.; Landry, T.; Huard, D. B.; Byrns, D.; Chaumont, D.; Foucher, S.
2016-12-01
Climate service providers are boundary organizations working at the interface of climate science research and users of climate information. Users include academics in other disciplines looking for credible, customized future climate scenarios, government planners, resource managers, asset owners, as well as service utilities. These users are looking for relevant information regarding the impacts of climate change, as well as information to guide decisions regarding adaptation options. As climate change concerns become mainstream, the pressure on climate service providers to deliver tailored, high-quality information in a timely manner increases rapidly. To meet this growing demand, Ouranos, a climate service center located in Montreal, is collaborating with the Centre de recherche informatique de Montreal (CRIM) to develop a climate data analysis web-based platform interacting with RESTful services covering data access and retrieval, geospatial analysis, bias correction, distributed climate indicator computing and results visualization. The project, financed by CANARIE, relies on the experience of the UV-CDAT and ESGF-CWT teams, as well as on the Birdhouse framework developed by the German Climate Research Center (DKRZ) and the French IPSL. Climate data is accessed through OPeNDAP, while computations are carried out through WPS. Regions such as watersheds or user-defined polygons, used as spatial selections for computations, are managed by GeoServer, which also provides WMS, WFS and WPS capabilities. The services are hosted on independent servers communicating over a high-throughput network. Deployment, maintenance and collaboration with other development teams are eased by the use of Docker and OpenStack VMs. Web-based tools are developed with modern web frameworks such as React-Redux, OpenLayers 3, Cesium and Plotly.
Although the main objective of the project is to build a functioning, usable data analysis pipeline within two years, time is also devoted to explore emerging technologies and assess their potential. For instance, sandbox environments will store climate data in HDFS, process it with Apache Spark and allow interaction through Jupyter Notebooks. Data streaming of observational data with OpenGL and Cesium is also considered.
Predicting wood pellet stove ownership and acquisition in Albuquerque, NM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lansford, R.; Skaggs, R.; Owensby, F.
1994-12-31
Wood pellet stove (WPS) ownership and acquisition in Albuquerque, New Mexico was predicted using a model of qualitative choice. Using data obtained from a telephone survey, households were divided into four groups: current WPS owners, non-owners considering ownership, non-owners not considering ownership, and those who had not heard of WPS technology. The variables used to predict a household's category include homeowners' socioeconomic and home-heating characteristics. Results indicate that few WPSs are currently in use in Albuquerque. However, current WPS owners and those considering WPS acquisition tend to have higher incomes, more years of education, larger homes, and use their fireplaces more frequently than average. Clean air regulations in Albuquerque will require changes in home woodburning. The WPS is an efficient and clean device; however, lack of knowledge of WPS technology, satisfaction with current heating systems, and limited awareness of the potential impact of clean air regulations indicate that WPS usage in Albuquerque will remain limited.
Is Word-Problem Solving a Form of Text Comprehension?
Fuchs, Lynn S.; Fuchs, Douglas; Compton, Donald L.; Hamlett, Carol L.; Wang, Amber Y.
2015-01-01
This study’s hypotheses were that (a) word-problem (WP) solving is a form of text comprehension that involves language comprehension processes, working memory, and reasoning, but (b) WP solving differs from other forms of text comprehension by requiring WP-specific language comprehension as well as general language comprehension. At the start of the 2nd grade, children (n = 206; on average, 7 years, 6 months) were assessed on general language comprehension, working memory, nonlinguistic reasoning, processing speed (a control variable), and foundational skill (arithmetic for WPs; word reading for text comprehension). In spring, they were assessed on WP-specific language comprehension, WPs, and text comprehension. Path analytic mediation analysis indicated that effects of general language comprehension on text comprehension were entirely direct, whereas effects of general language comprehension on WPs were partially mediated by WP-specific language. By contrast, effects of working memory and reasoning operated in parallel ways for both outcomes. PMID:25866461
Architecture of the local spatial data infrastructure for regional climate change research
NASA Astrophysics Data System (ADS)
Titov, Alexander; Gordov, Evgeny
2013-04-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, etc.) are actively used in modeling and analysis of climate change at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, studies of climate and environmental change require special software support based on the SDI approach. A dedicated architecture of a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. A geoportal is a key element of any SDI, allowing searches for geoinformation resources (datasets and services) using metadata catalogs, producing geospatial data selections by their parameters (data access functionality), and managing services and applications for cartographical visualization. It should be noted that, for objective reasons such as large dataset volumes, the complexity of the data models used, and syntactic and semantic differences between datasets, the development of environmental geodata access, processing and visualization services turns out to be quite a complex task. Those circumstances were taken into account while developing the architecture of the local spatial data infrastructure as a universal framework providing geodata services. Accordingly, the architecture presented includes: 1. A model for storing big sets of regional georeferenced data that is effective in terms of search, access, retrieval and subsequent statistical processing, allowing in particular the storage of frequently used values (monthly and annual climate change indices, etc.), thus providing different temporal views of the datasets 2. A general architecture of the corresponding software components handling geospatial datasets within the storage model 3.
A metadata catalog describing in detail, using the ISO 19115 and CF-convention standards, the datasets used in climate research, as a basic element of the spatial data infrastructure, together with its publication according to the OGC CSW (Catalogue Service for the Web) specification 4. Computational and mapping web services working with geospatial datasets based on the OWS (OGC Web Services) standards: WMS, WFS, WPS 5. A geoportal as the key element of the thematic regional spatial data infrastructure, also providing a software framework for the development of dedicated web applications. To realize the web mapping services, GeoServer is used, since it provides a native WPS implementation as a separate software module. To provide geospatial metadata services, the GeoNetwork opensource (http://geonetwork-opensource.org) product is planned to be used, because it supports the ISO 19115/ISO 19119/ISO 19139 metadata standards as well as the ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the framework of the local SDI geoportal, the following open source software has been selected: 1. The OpenLayers JavaScript library, providing basic web mapping functionality for a thin client such as a web browser 2. The GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services. The web interface developed will be similar to the interfaces of popular desktop GIS applications such as uDig, Quantum GIS, etc. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.
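All of the OWS interfaces listed above share the same KVP GetCapabilities entry point, which is what a geoportal issues first when wiring up a service; a minimal sketch of building such a request follows (the endpoint URL is a placeholder, not the geoportal's actual address):

```python
from urllib.parse import urlencode

def ows_get_capabilities_url(endpoint, service, version):
    """Build a KVP GetCapabilities request, valid for any OGC Web Service
    (WMS, WFS, WPS, CSW): the three mandatory parameters are the same."""
    params = {"service": service, "request": "GetCapabilities", "version": version}
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint for illustration only -- not a real server
url = ows_get_capabilities_url("http://example.org/geoserver/ows", "WPS", "1.0.0")
print(url)
```

The same helper covers discovery against the metadata catalog (service="CSW") and the mapping services (service="WMS"/"WFS"), which is precisely why the OWS family is convenient to aggregate behind one geoportal.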
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model-sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model-sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and grid computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems in model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
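The publish/describe/execute pattern behind such model sharing can be sketched with a toy in-process registry (this is not the paper's actual WPS implementation; the registry class, the identifier and the example "model" are all invented for illustration):

```python
class ModelRegistry:
    """Toy stand-in for a WPS-style model-sharing service: models are
    published under an identifier and invoked generically, mirroring
    DescribeProcess/Execute semantics without any network layer."""

    def __init__(self):
        self._models = {}

    def publish(self, identifier, func, description):
        """Register a model under an identifier, with a human-readable description."""
        self._models[identifier] = (func, description)

    def describe_process(self, identifier):
        """Return the published description (what DescribeProcess would serve)."""
        return self._models[identifier][1]

    def execute(self, identifier, **inputs):
        """Run the published model on named inputs (what Execute would do)."""
        func, _ = self._models[identifier]
        return func(**inputs)

# Publish a trivial, hypothetical wetland water-balance "model"
registry = ModelRegistry()
registry.publish(
    "pothole-water-balance",
    lambda precip, evap: precip - evap,
    "Net water input (precipitation minus evaporation), mm",
)
print(registry.execute("pothole-water-balance", precip=120.0, evap=95.0))  # 25.0
```

In the real system the registry lives behind a WPS endpoint, so any standards-compliant client can list, describe and execute the shared models without knowing their internals.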
A web service for service composition to aid geospatial modelers
NASA Astrophysics Data System (ADS)
Bigagli, L.; Santoro, M.; Roncella, R.; Mazzetti, P.
2012-04-01
The identification of appropriate mechanisms for process reuse, chaining and composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. In the Earth and Space Sciences, such a facility could primarily enable integrated and interoperable modeling, for which several approaches have been proposed and developed over the last years. In fact, GEOSS is specifically tasked with the development of the so-called "Model Web". At increasing levels of abstraction and generalization, the initial stove-pipe software tools have evolved into community-wide modeling frameworks and Component-Based Architecture solutions and, more recently, have started to embrace Service-Oriented Architecture technologies, such as the OGC WPS specification and the WS-* stack of W3C standards for service composition. However, so far, the level of abstraction seems too low for implementing the Model Web vision, and far too complex technological aspects must still be addressed by both providers and users, resulting in limited usability and, eventually, difficult uptake. In line with the recent ICT trend of resource virtualization, it has been suggested that users in need of a particular processing capability, required by a given modeling workflow, may benefit from outsourcing the composition activities to an external first-class service, according to the Composition as a Service (CaaS) approach. A CaaS system provides the necessary interoperability service framework for the adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of executable workflows. This work introduces the architecture of a CaaS system, as a distributed information system for creating, validating, editing, storing, publishing, and executing geospatial workflows.
This way, users are freed from the need for a composition infrastructure and relieved of the technicalities of workflow definitions (type matching, identification of external service endpoints, binding issues, etc.), and can focus on their intended application. Moreover, a user may submit an incomplete workflow definition and leverage CaaS recommendations (which may derive from an aggregated knowledge base of user feedback, underpinned by Web 2.0 technologies) to execute it. This is of particular interest for multidisciplinary scientific contexts, where different communities may benefit from each other's knowledge through model chaining. Indeed, the CaaS approach is presented as an attempt to combine the recent advances in service-oriented computing with collaborative research principles, and social network information in general. Arguably, it may be considered a fundamental capability of the Model Web. The CaaS concept is being investigated in several application scenarios identified in the FP7 UncertWeb and EuroGEOSS projects. Key aspects of the described CaaS solution are: it provides a standard WPS interface for invoking Business Processes and allows on-the-fly recursive composition of Business Processes into other Composite Processes; it is designed according to the extended SOA (broker-based) and System-of-Systems approach, to support the reuse and integration of existing resources, in compliance with the GEOSS Model Web architecture. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 248488.
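The type matching a CaaS system performs before publishing a workflow can be illustrated with a toy composer (the step schema, step names and data are invented for illustration; a real CaaS validates against service metadata, not Python dicts):

```python
def compose(*steps):
    """Validate and chain processing steps. Each step declares the type it
    consumes and the type it produces; composition fails fast on a mismatch,
    the way a CaaS validator rejects an ill-typed workflow."""
    for upstream, downstream in zip(steps, steps[1:]):
        if upstream["produces"] != downstream["consumes"]:
            raise TypeError(
                f"{upstream['name']} produces {upstream['produces']}, "
                f"but {downstream['name']} consumes {downstream['consumes']}"
            )
    def workflow(value):
        # Run the validated chain end to end
        for step in steps:
            value = step["run"](value)
        return value
    return workflow

# Hypothetical steps: subset a list of grid values, then average the subset
subset = {"name": "subset", "consumes": "grid", "produces": "grid",
          "run": lambda g: g[:2]}
mean = {"name": "mean", "consumes": "grid", "produces": "scalar",
        "run": lambda g: sum(g) / len(g)}

pipeline = compose(subset, mean)
print(pipeline([10.0, 20.0, 30.0]))  # 15.0
```

Reversing the two steps raises a TypeError before anything executes, which is the point: composition errors surface at publication time, not at run time inside someone else's service.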
Nemmar, Abderrahim; Al-Salam, Suhail; Beegam, Sumaya; Yuvaraju, Priya; Oulhaj, Abderrahim; Ali, Badreldin H
2017-01-01
It has been shown, both experimentally and clinically, that water-pipe smoke (WPS) exposure adversely affects the cardiovascular system (CVS) through the generation of oxidative stress and inflammation. Betaine, a naturally occurring compound in common foods, has antioxidant and anti-inflammatory actions. However, its potential to mitigate the adverse effects of WPS on the CVS has never been reported before. This is the subject of this study in mice. Mice were exposed daily for 30 min to either normal air (control) or to WPS for two consecutive weeks. Betaine was administered daily by gavage at a dose of 10 mg/kg, 1 h before either air or WPS exposure. Betaine mitigated the in vivo prothrombotic effect of WPS in pial arterioles and venules. Moreover, it reversed the WPS-induced decrease in circulating platelets. Likewise, betaine alleviated platelet aggregation in vitro, and the shortening of activated partial thromboplastin time and prothrombin time induced by WPS. Betaine reduced the WPS-induced increase of plasminogen activator inhibitor-1 and fibrinogen concentrations in plasma. Betaine also diminished the WPS-induced increase of plasma concentrations of interleukin 6 and tumor necrosis factor α, and attenuated the increase of lipid peroxidation and superoxide dismutase. Immunohistochemical analysis of the heart revealed an increase in the expression of inducible nitric oxide synthase and cytochrome C by cardiomyocytes of the WPS-exposed mice. These effects were averted by betaine. Our findings suggest that betaine treatment significantly mitigated WPS-induced hypercoagulability and inflammation, as well as systemic and cardiac oxidative stress. © 2017 The Author(s). Published by S. Karger AG, Basel.
Early pulmonary events of nose-only water pipe (shisha) smoking exposure in mice
Nemmar, Abderrahim; Hemeiri, Ahmed Al; Hammadi, Naser Al; Yuvaraju, Priya; Beegam, Sumaya; Yasin, Javed; Elwasila, Mohamed; Ali, Badreldin H; Adeghate, Ernest
2015-01-01
Water pipe smoking (WPS) is increasing in popularity and prevalence worldwide. Convincing data suggest that the toxicants in WPS are similar to those in cigarette smoke. However, the underlying pathophysiologic mechanisms related to the early pulmonary events of WPS exposure are not understood. Here, we evaluated the early pulmonary events of nose-only exposure to mainstream WPS generated by commercially available honey-flavored “moasel” tobacco. BALB/c mice were exposed to WPS 30 min/day for 5 days. Control mice were exposed using the same protocol to atmospheric air only. We measured airway resistance using the forced oscillation technique, and pulmonary inflammation was evaluated histopathologically and by biochemical analysis of bronchoalveolar lavage (BAL) fluid and lung tissue. Lung oxidative stress was evaluated biochemically by measuring the levels of reactive oxygen species (ROS), lipid peroxidation (LPO), reduced glutathione (GSH), catalase, and superoxide dismutase (SOD). Mice exposed to WPS showed a significant increase in the number of neutrophils (P < 0.05) and lymphocytes (P < 0.001). Moreover, total protein (P < 0.05), lactate dehydrogenase (P < 0.005), and endothelin (P < 0.05) levels were augmented in bronchoalveolar lavage fluid. Tumor necrosis factor α (P < 0.005) and interleukin 6 (P < 0.05) concentrations were significantly increased in lung following the exposure to WPS. Both ROS (P < 0.05) and LPO (P < 0.005) in lung tissue were significantly increased, whereas the levels and activity of antioxidants including GSH (P < 0.0001), catalase (P < 0.005), and SOD (P < 0.0001) were significantly decreased after WPS exposure, indicating the occurrence of oxidative stress. In contrast, airway resistance was not increased by WPS exposure.
We conclude that subacute, nose-only exposure to WPS causes lung inflammation and oxidative stress without affecting pulmonary function, suggesting that inflammation and oxidative stress are early markers of WPS exposure that precede airway dysfunction. Our data provide information on the initial steps involved in the respiratory effects of WPS, which constitute the underlying causal chain of reactions leading to the long-term effects of WPS. PMID:25780090
Processing, Cataloguing and Distribution of Uas Images in Near Real Time
NASA Astrophysics Data System (ADS)
Runkel, I.
2013-08-01
Why is there such hype around UAS? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture remains valid up to the end of the processing chain, all intermediate steps, such as data processing and data dissemination to the customer, need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution. This is the focus of the presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data-processing computer, or it is stored on a mobile device and hooked up to the data-processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata is converted into an ISO-conformant format, and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, respectively the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready to use for GIS applications, image processing or direct interpretation via web applications - wherever you want. The whole processing chain is built in a generic manner. It can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single ortho images or as an image mosaic. Furthermore, image data from various cameras can be fused. By using WPS (Web Processing Services), image enhancement and image analysis workflows, such as change detection layers, can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server. The image analyst has no data and no software on his local computer. This workflow has proven to be fast, stable and accurate.
It is designed to support time-critical applications for security demands - the images can be checked and interpreted in near real time. For sensitive areas it offers the possibility to inform remote decision makers or interpretation experts in order to provide them with situational awareness, wherever they are. For monitoring and inspection tasks it speeds up the process of data capture and data interpretation. The fully automated workflow of data pre-processing, data georeferencing, data cataloguing and data dissemination in near real time was developed based on the Intergraph products ERDAS IMAGINE, ERDAS APOLLO and GEOSYSTEMS METAmorph!IT. It is offered as an adaptable solution by GEOSYSTEMS GmbH.
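One of the WPS-hosted image analysis workflows mentioned above, a change detection layer, reduces at its simplest to a per-pixel comparison of two co-registered images; a minimal sketch on synthetic arrays (the threshold and data are illustrative, not the product's algorithm):

```python
import numpy as np

def change_detection(before, after, threshold=0.2):
    """Flag pixels whose value changed by more than a threshold between two
    co-registered images. Inputs are arrays of normalized reflectance; the
    result is a boolean change mask a WPS could serve back to the analyst."""
    return np.abs(after - before) > threshold

# Two tiny synthetic "acquisitions" of the same 2x2 scene
before = np.array([[0.10, 0.50], [0.30, 0.90]])
after  = np.array([[0.15, 0.10], [0.35, 0.40]])
mask = change_detection(before, after)
print(int(mask.sum()))  # 2 changed pixels
```

Run server-side on the raster data management host, as described above, only the small mask needs to travel to the analyst's browser rather than the full imagery.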
Modelling noise propagation using Grid Resources. Progress within GDI-Grid
NASA Astrophysics Data System (ADS)
Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut
2010-05-01
GDI-Grid (English: SDI-Grid) is a research project funded by the German Federal Ministry of Education and Research (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and at identifying the potential of utilizing the superior storage capacities and computational power of grid infrastructures for geospatial applications, while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces for web mapping (WMS), feature access (Web Feature Service), coverage access (Web Coverage Service) and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of the diverging standards defined by the standardization bodies for Grid computing and spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing and a noise propagation simulation. The latter scenario is addressed by Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, which is adapting its LimA software to utilize grid resources. Noise mapping of, e.g., traffic noise in urban agglomerations and along major trunk roads is a recurring demand of the EU Noise Directive. The input data required comprises the road network and traffic, terrain, buildings and noise protection screens, as well as the population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on the local geometry. For each of the segments, the propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation.
This immensely intensive calculation needs to be performed for a major part of the European landscape. A Linux version of the commercial LimA software for noise mapping analysis has been implemented on a test cluster within the German D-Grid computer network. Results and performance indicators will be presented. The presentation is an extension of last year's presentation "Spatial Data Infrastructures and Grid Computing: the GDI-Grid project", which described the gridification concept developed in the GDI-Grid project and provided an overview of the conceptual gaps between Grid Computing and Spatial Data Infrastructures. Results from the GDI-Grid project are incorporated in the OGC-OGF (Open Grid Forum) collaboration efforts as well as in the OGC WPS 2.0 standards working group developing the next major version of the WPS specification.
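At the heart of each segment/receiver calculation is level attenuation with distance; a free-field sketch using the standard point-source spherical-spreading relation (ignoring the ground, screening and diffraction terms that LimA actually models) shows why a 2000 m search radius is a sensible cutoff:

```python
import math

def free_field_level(lw_db, distance_m):
    """Sound pressure level at a receiver from a point source in the free field:
    Lp = Lw - 20*log10(r) - 11  (spherical spreading only; no ground effect,
    screening or diffraction terms)."""
    return lw_db - 20.0 * math.log10(distance_m) - 11.0

# A hypothetical 100 dB source heard at 10 m and at the 2000 m search radius
print(round(free_field_level(100.0, 10.0), 1))    # 69.0
print(round(free_field_level(100.0, 2000.0), 1))  # 23.0
```

Each doubling of distance costs about 6 dB, so distant segments contribute little; the expensive part of the real computation is the per-obstacle diffraction analysis layered on top of this geometric decay, repeated for every receiver on the 10 m grid.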
DISTANT EARLY WARNING SYSTEM for Tsunamis - A wide-area and multi-hazard approach
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Lendholt, Matthias; Wächter, Joachim
2010-05-01
The DEWS (Distant Early Warning System) [1] project, funded under the 6th Framework Programme of the European Union, has the objective to create a new generation of interoperable early warning systems based on an open sensor platform. This platform integrates OGC [2] SWE [3] compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements in the case of tsunami early warning. Based on the upstream information flow DEWS focuses on the improvement of downstream capacities of warning centres especially by improving information logistics for effective and targeted warning message aggregation for a multilingual environment. Multiple telecommunication channels will be used for the dissemination of warning messages. Wherever possible, existing standards have been integrated. The Command and Control User Interface (CCUI), a rich client application based on Eclipse RCP (Rich Client Platform) [4] and the open source GIS uDig [5], integrates various OGC services. Using WMS (Web Map Service) [6] and WFS (Web Feature Service) [7] spatial data are utilized to depict the situation picture and to integrate a simulation system via WPS (Web Processing Service) [8] to identify affected areas. Warning messages are compiled and transmitted in the OASIS [9] CAP (Common Alerting Protocol) [10] standard together with addressing information defined via EDXL-DE (Emergency Data Exchange Language - Distribution Element) [11]. Internal interfaces are realized with SOAP [12] web services. Based on results of GITEWS [13] - in particular the GITEWS Tsunami Service Bus [14] - the DEWS approach provides an implementation for tsunami early warning systems but other geological paradigms are going to follow, e.g. volcanic eruptions or landslides. Therefore in future also multi-hazard functionality is conceivable. 
The specific software architecture of DEWS makes it possible to dock varying sensors to the system and to extend the CCUI with hazard-specific functionality. The presentation covers the DEWS project, the system architecture and the CCUI, in conjunction with details of the information logistics. The DEWS Wide Area Centre, connecting national centres to allow international communication and warning exchange, is also presented. REFERENCES: [1] DEWS, www.dews-online.org [2] OGC, www.opengeospatial.org [3] SWE, www.opengeospatial.org/projects/groups/sensorweb [4] Eclipse RCP, www.eclipse.org/home/categories/rcp.php [5] uDig, udig.refractions.net [6] WMS, www.opengeospatial.org/standards/wms [7] WFS, www.opengeospatial.org/standards/wfs [8] WPS, www.opengeospatial.org/standards/wps [9] OASIS, www.oasis-open.org [10] CAP, www.oasis-open.org/specs/#capv1.1 [11] EDXL-DE, www.oasis-open.org/specs/#edxlde-v1.0 [12] SOAP, www.w3.org/TR/soap [13] GITEWS (German Indonesian Tsunami Early Warning System) is a project of the German Federal Government to aid the reconstruction of the tsunami-prone Indian Ocean region, www.gitews.org [14] The Tsunami Service Bus is the GITEWS sensor system integration platform offering standardised services for the detection and monitoring of tsunamis
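A skeletal CAP alert of the kind exchanged here can be assembled with a few lines of XML handling (only a subset of CAP 1.1 elements is shown, and all values are illustrative; a real DEWS message carries full info blocks plus EDXL-DE addressing):

```python
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.1"

def minimal_cap_alert(identifier, sender, sent, event):
    """Assemble a skeletal CAP 1.1 alert document as a string."""
    ET.register_namespace("", CAP_NS)
    alert = ET.Element(f"{{{CAP_NS}}}alert")
    # A handful of CAP's required header elements, with fixed example values
    for tag, text in [("identifier", identifier), ("sender", sender),
                      ("sent", sent), ("status", "Actual"),
                      ("msgType", "Alert"), ("scope", "Public")]:
        ET.SubElement(alert, f"{{{CAP_NS}}}{tag}").text = text
    info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
    ET.SubElement(info, f"{{{CAP_NS}}}event").text = event
    return ET.tostring(alert, encoding="unicode")

# Hypothetical warning-centre message; sender and timestamp are placeholders
xml = minimal_cap_alert("TEST-001", "wac@example.org",
                        "2010-05-01T12:00:00+00:00", "Tsunami")
print("Tsunami" in xml)  # True
```

Because CAP is channel-agnostic, the same document can be pushed over the multiple dissemination channels the abstract mentions, with EDXL-DE wrapping supplying the addressing.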
Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web
NASA Astrophysics Data System (ADS)
Huang, Hong; Gong, Jianya
2008-12-01
GML can only achieve geospatial interoperation at the syntactic level. However, in most cases it is necessary to resolve differences in spatial cognition first, so ontologies were introduced to describe geospatial information and services. But it is obviously difficult and inappropriate to let users find, match and compose services themselves, especially when complicated business logic is involved. Currently, with the gradual introduction of Semantic Web technologies (e.g., OWL, SWRL), the focus of the interoperation of geospatial information has shifted from the syntactic level to the semantic and even to the automatic, intelligent level. In this context, the Geospatial Semantic Web (GSM) can be put forward as an augmentation of the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of the GSM, we first attempt to construct mechanisms for the modeling and formal representation of geospatial knowledge, which are also the two most foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatic: we argue that geospatial context is a formal model of the discriminating environmental characteristics of geospatial knowledge, and that the derivation, understanding and use of geospatial knowledge are situated in a geospatial context. Therefore, first, we put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed, and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Third, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.
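The rule-based inference such an extended WPS is meant to possess can be illustrated with a toy forward-chainer over subject-predicate-object triples (the facts and the transitivity rule below are invented examples in the spirit of SWRL-style rules, not the paper's formalism):

```python
def forward_chain(facts, rules):
    """Forward-chain to a fixpoint over (subject, predicate, object) triples.
    Each rule (p, q, r) reads: if (s, p, o) and (o, q, o2) hold, infer (s, r, o2)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for (s, p, o) in list(facts):
            for (o2, q, o3) in list(facts):
                if o != o2:
                    continue  # the two triples must join on a shared node
                for (pa, pb, pc) in rules:
                    if p == pa and q == pb and (s, pc, o3) not in facts:
                        facts.add((s, pc, o3))
                        changed = True
    return facts

# Hypothetical geospatial knowledge: 'within' is transitive
facts = {("Wuhan", "within", "Hubei"), ("Hubei", "within", "China")}
rules = [("within", "within", "within")]
inferred = forward_chain(facts, rules)
print(("Wuhan", "within", "China") in inferred)  # True
```

A WPS process with this kind of capability could answer spatial containment queries that are never stated explicitly in the source data, which is the practical payoff of moving from syntactic (GML) to semantic interoperation.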
The IS-ENES climate4impact portal: bridging the CMIP5 and CORDEX data to impact users
NASA Astrophysics Data System (ADS)
Som de Cerff, Wim; Plieger, Maarten; Page, Christian; Tatarinova, Natalia; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Sjökvist, Elin; Vega Saldarriaga, Manuel; Santiago Cofiño Gonzalez, Antonio
2015-04-01
The aim of climate4impact (climate4impact.eu) is to enhance the use of climate research data and the interaction with the climate effect/impact communities. The portal is based on 17 impact use cases from 5 different European countries, and is evaluated by a user panel consisting of the use case owners. It has been developed within the IS-ENES European project and is currently operated and further developed in the IS-ENES2 project. As the climate impact community is very broad, the focus is mainly on the scientific impact community. Climate4impact is connected to the Earth System Grid Federation (ESGF) nodes containing global climate model (GCM) data from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) and regional climate model (RCM) data from the Coordinated Regional Climate Downscaling Experiment (CORDEX). This global network of climate model data centers offers services for data description, discovery and download. The climate4impact portal connects to these services using OpenID, and offers a user interface for searching, visualizing and downloading global climate model data and more. A challenging task is to describe the available model data and how they can be used. The portal informs users about possible caveats when using climate model data. All impact use cases are described in the documentation section, using highlighted keywords pointing to detailed information in the glossary. Climate4impact currently has two main objectives. The first is to work on a web interface that automatically generates a graphical user interface for WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/icclim on data stored in ESGF data nodes. Data is then transmitted from the ESGF nodes over secured OpenDAP and becomes available in a new, per-user, secured OpenDAP server. The results can then be visualized again using ADAGUC WMS.
Dedicated wizards for processing of climate indices will be developed in close collaboration with users. The second one is to expose climate4impact services, so as to offer standardized services which can be used by other portals (like the future Copernicus platform, developed in the EU FP7 CLIPC project). This has the advantage of adding interoperability between several portals, as well as enabling the design of specific portals aimed at different impact communities, either thematic or national. In the presentation the following subjects will be detailed: - Lessons learned developing climate4impact.eu - Download: Directly from ESGF nodes and other THREDDS catalogs - Connection with the downscaling portal of the University of Cantabria - Experiences on the question and answer site via Askbot - Visualization: Visualize data from ESGF data nodes using ADAGUC Web Map Services. - Processing: Transform data, subset, export into other formats, and perform climate indices calculations using Web Processing Services implemented by PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 icclim. - Security: Login using OpenID for access to the ESGF data nodes. The ESGF works in conjunction with several external websites and systems. The climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service. Single Sign-On (SSO) is used to make these websites and systems work together. - Discovery: Faceted search based on e.g. variable name, model and institute using the ESGF search services. A catalog browser allows for browsing through CMIP5 and any other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA).
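The processing chain described above (a WPS computing climate indices with icclim on data stored in ESGF nodes) can be sketched as a WPS 1.0.0 Execute request in KVP (GET) form. This is a minimal illustration only: the endpoint URL, the process identifier and the input names below are hypothetical placeholders, not climate4impact's actual ones.

```python
from urllib.parse import urlencode

def build_wps_execute_url(base_url, identifier, data_inputs):
    """Build a WPS 1.0.0 Execute request in KVP (GET) form.

    DataInputs are encoded as semicolon-separated key=value pairs,
    following the OGC WPS 1.0.0 KVP encoding.
    """
    datainputs = ";".join(f"{k}={v}" for k, v in data_inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "DataInputs": datainputs,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint, process id and inputs; "SU" ("summer days")
# is one of the icclim climate indices.
url = build_wps_execute_url(
    "https://climate4impact.example/wps",
    "icclim.SU",
    {"resource": "tasmax_day_EUR-11.nc", "index": "SU"},
)
```

A web client would issue this URL, poll the returned status document, and then fetch the result, e.g. for visualization through ADAGUC WMS.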
NASA Astrophysics Data System (ADS)
Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.
2009-04-01
The development of technologies for enhancing web applications with dynamic data access also marked the starting point for geospatial web applications. By means of these technologies, web applications gained the capability of presenting geographical representations of geo-information. The introduction of state-of-the-art technologies known as Web Services enables web applications to interoperate, i.e. to process requests from each other over a network. Within the oceanographic community in particular, modern geographical information systems based on geospatial web services are now being developed, or will be developed in the near future, with the capability of managing the information fully through web-based geographical interfaces. The exploitation of the HNODC database through a web-based application enhanced with web services built from open source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), a national public oceanographic data provider and a member of the international network of oceanographic data centers (IOC/IODE), holds a very large volume of data and related information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed storing over 300,000 station records of physical, chemical and biological oceanographic information. A modern web application that allows end users worldwide to explore and navigate HNODC data through an interface capable of presenting geographical representations of the geo-information is now a reality.
The application is built from state-of-the-art software components and tools such as: • geospatial and non-spatial web services mechanisms; • geospatial open source tools for the creation of dynamic geographical representations; • communication protocols (messaging mechanisms) in all layers, such as XML and GML, together with the SOAP protocol via Apache Axis. At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the major advantage of interoperability between web services systems. Roughly, the architecture can be described as follows: • At the back end, the open source PostgreSQL DBMS serves as the data storage mechanism, with more than one database schema because the geospatial and non-geospatial data are kept separate. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS), querying and navigating geospatial and metadata information via the Web Feature Service (WFS), and, in the near future, transacting and processing new or existing geospatial data via the Web Processing Service (WPS). • Mapbender, a geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms. Mapbender comes with an embedded data model capable of managing interfaces for displaying, navigating and querying OGC-compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat serve as the web service middle layers. • Apache Axis, with its embedded implementation of the SOAP protocol ("Simple Object Access Protocol"), acts as the non-spatial web services mechanism; these modules of the platform are still under development but will be completed in the near future. • Finally, a new web user interface for the end user is based on an enhanced and customized version of the Mapbender GUI, a powerful web services client.
For HNODC, the interoperability of web services is the major advantage of the developed platform, since it will be able to act in both directions: • as a data-products provider for external SOA platforms, • and as a consumer of data products from external SOA platforms, for developing new applications or enhancing existing ones. A prime example of data management integration and dissemination through such technologies is the European Union research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating large and diverse data sets and to enhance the currently existing infrastructures with web services. Furthermore, once the Web Processing Service (WPS) technology is mature enough and applicable for development, the derived data products will be able to offer any kind of GIS functionality to consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
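The WFS querying and navigation described above returns GML feature collections that a client must parse. A minimal sketch of that step follows, using a hand-written GML fragment standing in for a WFS GetFeature response; the feature type, namespace and station name here are invented for illustration and do not reflect HNODC's actual schema.

```python
import xml.etree.ElementTree as ET

# A tiny, hand-written stand-in for a WFS GetFeature response.
# Real HNODC feature types, namespaces and attributes will differ.
GML = """<wfs:FeatureCollection xmlns:wfs="http://www.opengis.net/wfs"
    xmlns:gml="http://www.opengis.net/gml" xmlns:hnodc="http://example.org/hnodc">
  <gml:featureMember>
    <hnodc:station>
      <hnodc:name>Saronikos-1</hnodc:name>
      <gml:pos>37.90 23.60</gml:pos>
    </hnodc:station>
  </gml:featureMember>
</wfs:FeatureCollection>"""

ns = {"gml": "http://www.opengis.net/gml", "hnodc": "http://example.org/hnodc"}
root = ET.fromstring(GML)

# Extract (name, lat, lon) tuples from each feature member.
stations = []
for member in root.findall(".//gml:featureMember", ns):
    name = member.find(".//hnodc:name", ns).text
    lat, lon = map(float, member.find(".//gml:pos", ns).text.split())
    stations.append((name, lat, lon))

print(stations)  # [('Saronikos-1', 37.9, 23.6)]
```

In the deployed portal this parsing is handled by the Mapbender client rather than custom code; the sketch only shows what travels over the wire.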
Yuvaraju, Priya; Beegam, Sumaya; Ali, Badreldin H.
2018-01-01
Water pipe smoking is a tobacco smoking method commonly used in Eastern countries and is gaining popularity in Europe and North America, in particular among adolescents and young adults. Several clinical and experimental studies have reported that exposure to water pipe smoke (WPS) induces lung inflammation and impairment of pulmonary function. However, the mechanisms of such effects are not understood, and data on the possible palliative effect of exercise training are lacking. The present study evaluated the effects of regular aerobic exercise training (treadmill: 5 days/week, 40 min/day) on subchronic exposure to WPS (30 minutes/day, 5 days/week for 2 months). C57BL/6 mice were exposed to air or WPS with or without exercise training. Airway resistance measured using the forced oscillation technique was significantly and dose-dependently increased in the WPS-exposed group when compared with the air-exposed one. Exercise training significantly prevented the effect of WPS on airway resistance. Histologically, the lungs of WPS-exposed mice had focal moderate interstitial inflammatory cell infiltration consisting of neutrophil polymorphs, plasma cells, and lymphocytes. There was a mild increase in intra-alveolar macrophages and focal damage to alveolar septae in some foci. Exercise training significantly alleviated these effects and also decreased the WPS-induced increase of tumor necrosis factor α and interleukin 6 concentrations and attenuated the increase of 8-isoprostane in lung homogenates. Likewise, the lung DNA damage induced by WPS was significantly inhibited by exercise training. Moreover, exercise training inhibited nuclear factor kappa-B (NF-κB) expression induced by WPS and increased that of nuclear factor erythroid 2-related factor 2 (Nrf2).
Our findings suggest that exercise training significantly mitigated WPS-induced increase in airway resistance, inflammation, oxidative stress, and DNA damage via mechanisms that include inhibiting NF-κB and activating Nrf2 signalling pathways. PMID:29692875
Radwan, Ghada; Hecht, Stephen S; Carmella, Steven G; Loffredo, Christopher A
2013-01-01
The causal relationship between tobacco smoking and a variety of cancers is attributable to the carcinogens that smokers inhale, including tobacco-specific nitrosamines (TSNAs). We aimed to assess the exposure to TSNAs in waterpipe smokers (WPS), cigarette smokers (CS), and nonsmoking females exposed to tobacco smoke. We measured 2 metabolites, 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (NNAL) and its glucuronides (NNAL-Gluc), in the urine of males who were either current CS or WPS, and their wives exposed to either cigarette or waterpipe smoke in a sample of 46 subjects from rural Egypt. Of the 24 current male smokers, 54.2% were exclusive CS and 45.8% were exclusive WPS. Among wives, 59.1% reported exposure to cigarette smoke and 40.9% to waterpipe smoke. The geometric mean of urinary NNAL was 0.19 ± 0.60 pmol/ml urine (range 0.005-2.58) in the total sample. Significantly higher levels of NNAL were observed among male smokers of either cigarettes or waterpipe (0.89 ± 0.53 pmol/ml, range 0.78-2.58 in CS and 0.21-1.71 in WPS) compared with nonsmoking wives (0.04 ± 0.18 pmol/ml, range 0.01-0.60 in CS wives, 0.05-0.23 in WPS wives, p = .000). Among males, CS had significantly higher levels of NNAL compared with WPS (1.22 vs. 0.62; p = .007). However, no significant difference was detected in NNAL levels between wives exposed to cigarette smoke or waterpipe smoke. Cigarette smokers' levels of NNAL were higher than WPS levels in males. Exposure to tobacco smoke was evident in wives of both CS and WPS. Among WPS, NNAL tended to increase with increasing numbers of hagars smoked/day.
NASA Astrophysics Data System (ADS)
Ninsawat, Sarawut; Yamamoto, Hirokazu; Kamei, Akihide; Nakamura, Ryosuke; Tsuchida, Satoshi; Maeda, Takahisa
2010-05-01
With the availability of network-enabled sensing devices, the volume of information being collected by networked sensors has increased dramatically in recent years. Over 100 physical, chemical and biological properties can be sensed using in-situ or remote sensing technology. A collection of these sensor nodes forms a sensor network, which is easily deployable to provide a high degree of visibility into real-world physical processes as events unfold. A sensor observation network allows gathering of diverse types of data at greater spatial and temporal resolution through the use of wired or wireless network infrastructure; thus real-time or near-real-time data from the sensor observation network allow researchers and decision-makers to respond speedily to events. However, in the case of environmental monitoring, the capability to acquire in-situ data periodically is not sufficient on its own; the management and proper utilization of the data also need careful consideration. This requires the implementation of database and IT solutions that are robust, scalable and able to interoperate between different and distributed stakeholders, to provide lucid, timely and accurate updates to researchers, planners and citizens. The GEO (Global Earth Observation) Grid primarily aims to provide an e-Science infrastructure for the earth science community. The GEO Grid is designed to integrate various kinds of Earth observation data using grid technology, which was developed for sharing data, storage, and the computational power of high-performance computing, and is accessible as a set of services. A comprehensive web-based system for integrating field sensor data and satellite imagery, based on various open standards of OGC (Open Geospatial Consortium) specifications, has been developed.
Web Processing Service (WPS), which is most likely the future direction of Web-GIS, performs the computation of spatial data from distributed data sources and returns the outcome in a standard format. The interoperability and Service Oriented Architecture (SOA) of web services allow the integration of sensor network measurements available from a Sensor Observation Service (SOS) and satellite remote sensing data from a Web Map Service (WMS) as distributed data sources for WPS. Various applications have been developed to demonstrate the efficacy of integrating heterogeneous data sources. One example is the validation of the MODIS aerosol products (MOD08_D3, the Level-3 MODIS Atmosphere Daily Global Product) by ground-based measurements using the sunphotometer (skyradiometer, Prede POM-02) installed at Phenological Eyes Network (PEN) sites in Japan. Furthermore, a web-based framework system for studying the relationship between the Vegetation Index calculated from MODIS surface reflectance (MOD09GA, the Surface Reflectance Daily L2G Global 1km and 500m Product) and Gross Primary Production (GPP) field measurements at flux tower sites in Thailand and Japan has also been developed. The success of both applications will contribute to maximizing data utilization and improving the accuracy of information by validating MODIS satellite products against accurate, high-temporal-resolution field measurements.
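The Vegetation Index calculation mentioned above follows the standard NDVI formula, (NIR - Red) / (NIR + Red), applied to surface reflectance. A minimal sketch, with made-up reflectance values rather than actual MOD09GA samples:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index from surface reflectance.

    Standard NDVI formula; a WPS would apply this per pixel to the
    red and near-infrared bands of a coverage.
    """
    return (nir - red) / (nir + red)

# Dense vegetation absorbs red light and reflects near-infrared strongly,
# so NDVI approaches 1; sparse cover yields much lower values.
# The reflectance values are illustrative only.
print(round(ndvi(0.05, 0.45), 2))  # 0.8
print(round(ndvi(0.20, 0.30), 2))  # 0.2
```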
NASA Astrophysics Data System (ADS)
Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.
2015-12-01
As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projections is crucial not only to scientists but to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API, its capabilities, provide implementation details, and discuss future developments.
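The simple server-side operations the API exposes (maximum, minimum, average, anomalies) can be sketched in pure Python. This is a conceptual illustration only: an actual ESGF compute node applies such reductions in parallel over NetCDF data, not over a small Python list, and the monthly temperatures below are made up.

```python
def climatology_mean(series):
    """Long-term mean of a time series, the reference for anomalies."""
    return sum(series) / len(series)

def anomalies(series):
    """Deviation of each value from the long-term mean.

    By construction the anomalies sum to (numerically) zero.
    """
    mean = climatology_mean(series)
    return [x - mean for x in series]

# Made-up monthly near-surface temperatures (K), for illustration only.
temps = [288.1, 288.4, 288.9, 289.3, 289.0, 288.7]

print(max(temps), min(temps))                    # the "maximum"/"minimum" operations
print([round(a, 2) for a in anomalies(temps)])   # the "anomalies" operation
```

The point of the API is that only these small derived results cross the network; the gigabytes of model output stay at the data node.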
NASA Astrophysics Data System (ADS)
Clements, Oliver; Walker, Peter
2014-05-01
The cost of working with extremely large data sets is an increasingly important issue within the Earth Observation community. From global coverage data at any resolution to small coverage data at extremely high resolution, the community has always produced big data. This will only increase as new sensors are deployed and their data made available. Over time standard workflows have emerged. These have been facilitated by the production and adoption of standard technologies. Groups such as the International Organisation for Standardisation (ISO) and the Open Geospatial Consortium (OGC) have been a driving force in this area for many years. The production of standard protocols and interfaces such as OPeNDAP, Web Coverage Service (WCS), Web Processing Service (WPS) and newer emerging standards such as the Web Coverage Processing Service (WCPS) has helped to galvanise these workflows. As an example of a traditional workflow, assume a researcher wants to assess the temporal trend in chlorophyll concentration. This would involve a discovery phase, an acquisition phase, a processing phase and finally a derived product or analysis phase. Each element of this workflow has an associated temporal and monetary cost. Firstly, the researcher would require a high bandwidth connection or the acquisition phase would take too long. Secondly, the researcher must have their own expensive equipment for use in the processing phase. Both of these elements cost money and time. This can make the whole process prohibitive to scientists from the developing world or "citizen scientists" who do not have the necessary processing infrastructure. The use of emerging technologies can help improve both the monetary and time costs associated with these existing workflows. By utilising a WPS that is hosted at the same location as the data, a user is able to apply processing to the data without needing their own processing infrastructure.
This however limits the user to predefined processes that are made available by the data provider. The emerging OGC WCPS standard combined with big data analytics engines may provide a mechanism to improve this situation. The technology allows users to create their own queries using an SQL-like query language and apply them over available large data archives, once again at the data provider's end. This not only removes the processing cost whilst still allowing user-defined processes, but also reduces the bandwidth required, as only the final analysis or derived product needs to be downloaded. The new technologies have matured to a stage where their use should be justified by a quantitative assessment rather than simply by the fact that they are new developments. We will present a study of the time and cost requirements for a selection of existing workflows and then show how new and emerging standards and technologies can help both to reduce the cost to the user by shifting processing to the data, and to reduce the bandwidth required for analysing large datasets, making analysis of big-data archives possible for a greater and more diverse audience.
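The SQL-like query language referred to above is WCPS's ProcessCoverages expression syntax, which lets the user define the computation rather than pick from a fixed process list. A minimal sketch of such a request follows; the server URL, coverage name and time-axis name are hypothetical, and real deployments may differ in how the query is submitted.

```python
from urllib.parse import urlencode

# A WCPS ProcessCoverages expression: average a (hypothetical) chlorophyll
# coverage over one month and return the single value as CSV. Only this
# tiny result, not the archive, crosses the network.
query = (
    'for c in (chl_a_concentration) '
    'return encode(avg(c[ansi("2013-06-01":"2013-06-30")]), "csv")'
)

# One common submission style is a KVP GET request against a WCS endpoint.
request_url = "https://example-wcps-server/ows?" + urlencode({
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": query,
})
```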
Learning in Workplace Simulations in Vocational Education: A Student Perspective
ERIC Educational Resources Information Center
Jossberger, Helen; Brand-Gruwel, Saskia; van de Wiel, Margje W. J.; Boshuizen, Henny
2018-01-01
In vocational education, workplace simulations (WPS) have been implemented to ensure a better connection between the educational setting and the labour market. Moreover, WPS are supposed to motivate students and promote self-directed learning. So far, however, not much is known about the way students experience these WPS. The aim of the present…
Predicting Development of Mathematical Word Problem Solving Across the Intermediate Grades
Tolar, Tammy D.; Fuchs, Lynn; Cirino, Paul T.; Fuchs, Douglas; Hamlett, Carol L.; Fletcher, Jack M.
2012-01-01
This study addressed predictors of the development of word problem solving (WPS) across the intermediate grades. At beginning of 3rd grade, 4 cohorts of students (N = 261) were measured on computation, language, nonverbal reasoning skills, and attentive behavior and were assessed 4 times from beginning of 3rd through end of 5th grade on 2 measures of WPS at low and high levels of complexity. Language skills were related to initial performance at both levels of complexity and did not predict growth at either level. Computational skills had an effect on initial performance in low- but not high-complexity problems and did not predict growth at either level of complexity. Attentive behavior did not predict initial performance but did predict growth in low-complexity, whereas it predicted initial performance but not growth for high-complexity problems. Nonverbal reasoning predicted initial performance and growth for low-complexity WPS, but only growth for high-complexity WPS. This evidence suggests that although mathematical structure is fixed, different cognitive resources may act as limiting factors in WPS development when the WPS context is varied. PMID:23325985
Female ornamentation and territorial conflicts in collared flycatchers ( Ficedula albicollis)
NASA Astrophysics Data System (ADS)
Hegyi, Gergely; Garamszegi, László Zsolt; Eens, Marcel; Török, János
2008-10-01
Female ornaments in species with conventional sex roles often indicate individual quality, but the evolutionary forces maintaining them are less clear. Sexual competition for breeding opportunities may represent an important role for female signals, especially in polygynous species, but there is little experimental evidence for this. The wing patch size (WPS) of female collared flycatchers indicates age and body condition and predicts social mating patterns. We challenged nest-building females with decoy females of varying WPS and found that the aggressive response of residents increased with decoy WPS, suggesting a role for this female ornament in territorial competition. Our results explain why female WPS predicts territorial distances when mated to a polygynous male and indicate that the role of WPS in female competitive interactions is similar to that in males of the same population.
The growing epidemic of water pipe smoking: health effects and future needs.
Bou Fakhreddine, Hisham M; Kanj, Amjad N; Kanj, Nadim A
2014-09-01
Water pipe smoking (WPS), an old method of tobacco smoking, is re-gaining widespread popularity all over the world and among various populations. Smoking machine studies have shown that the water pipe (WP) mainstream smoke (MSS) contains a wide array of chemical substances, many of which are highly toxic and carcinogenic for humans. The concentrations of some substances exceed those present in the MSS of cigarettes. Despite being of low grade, current evidence indicates that WPS is associated with different adverse health effects, not only on the respiratory system but also on the cardiovascular, hematological, and reproductive systems, including pregnancy outcomes. In addition, an association between WPS and malignancies, such as lung, oral and nasopharyngeal cancer, has been suggested in different studies and systematic reviews. Despite its long-standing history, WPS research still harbors a lot of deficiencies. The magnitude of toxicant and carcinogen exposures, the effects on human health, as well as the addiction and dependence potentials associated with WPS need to be studied in well-designed prospective trials. Unfortunately, many of the tobacco control and clean indoor policies have exempted water pipes. Worldwide awareness among the public, smokers, and policymakers about the potential health effects of WPS is urgently required. Furthermore, stringent policies and laws that control and ban WPS in public places, similar to those applied to cigarette smoking, need to be implemented. Copyright © 2014 Elsevier Ltd. All rights reserved.
Reproductive toxicity to male mice of nose-only exposure to water-pipe smoke.
Ali, Badreldin H; Adham, Sirin A; Al Balushi, Khalid A; Shalaby, Asem; Waly, Mostafa I; Manoj, Priyadarsin; Beegam, Sumaya; Yuvaraju, Priya; Nemmar, Abderrahim
2015-01-01
Water-pipe smoking (WPS) is popular in the Middle East and is starting to gain popularity in several Western countries as well. It is widely and erroneously perceived to be less harmful than other forms of tobacco use. The reproductive adverse effects of cigarette smoking have been studied before with conflicting results, but data on the possible adverse reproductive effects of WPS are lacking. Here, we assessed the effects of nose-only exposure to mainstream WPS generated by commercially available honey-flavored "moasel" tobacco in mice. The duration of the session was 30 min/day for one month. Control mice were exposed to air. Twenty-four h after the last exposure, mice were killed and the testes and plasma removed for analysis. In testicular homogenates total protein, alkaline phosphatase activity, several indices of oxidative damage and Vascular Endothelial Growth Factor Receptor 2 (VEGFR2) were quantified. The plasma concentrations of leptin, testosterone, estrogen and luteinizing hormone (LH) were also measured. Histological analysis of testes and lungs was also conducted. WPS caused statistically significant decreases in the plasma concentrations of leptin, testosterone, and LH, and in the concentrations of total protein and the antioxidant indices measured. A statistically non-significant decrease in VEGFR2 protein in the WPS-exposed mice compared to the control mice was also found. The body and testicular weights of mice exposed to WPS, as well as their testicular alkaline phosphatase activity and light microscopic histology, and plasma estrogen concentration were all not significantly affected by WPS. Further studies on the functional implications of these findings in mice exposed to WPS for longer durations are warranted.
Health effects associated with waterpipe smoking
El-Zaatari, Ziad M; Chami, Hassan A; Zaatari, Ghazi S
2015-01-01
Objective It is widely held that waterpipe smoking (WPS) is not associated with health hazards. However, several studies have documented the uptake of several toxicants and carcinogens during WPS that is strongly associated with harmful health effects. This paper reviews the literature on the health effects of WPS. Data sources Three databases (PubMed, MEDLINE and EMBASE) were searched until August 2014 for the acute and long-term health effects of WPS using the terms 'waterpipe' and its synonyms (hookah, shisha, goza, narghileh, arghileh and hubble-bubble) in various spellings. Study selection We included original clinical studies, case reports and systematic reviews and focused on clinical human studies. ∼10% of the identified studies met the selection criteria. Data extraction Data were abstracted by all three authors and summarised into tables. Abstracted data included study type, results and methodological limitations and were analysed jointly by all three authors. Data synthesis WPS acutely leads to increased heart rate, blood pressure, impaired pulmonary function and carbon monoxide intoxication. Chronic bronchitis, emphysema and coronary artery disease are serious complications of long-term use. Lung, gastric and oesophageal cancer are associated with WPS as well as periodontal disease, obstetrical complications, osteoporosis and mental health problems. Conclusions Contrary to the widely held misconception, WPS is associated with a variety of adverse short-term and long-term health effects that should reinforce the need for stronger regulation. In addition, this review highlights the limitations of the published work, which is mostly cross-sectional or retrospective. Prospective studies should be undertaken to assess the full spectrum of health effects of WPS, particularly in view of its growing popularity and attractiveness to youth. PMID:25661414
Learning and recall of Worker Protection Standard (WPS) training in vineyard workers.
Anger, W Kent; Patterson, Lindsey; Fuchs, Martha; Will, Liliana L; Rohlman, Diane S
2009-01-01
Worker Protection Standard (WPS) training is one of the U.S. Environmental Protection Agency's (EPA) primary methods for preventing pesticide exposure in agricultural workers. Retention of the knowledge from the training may occasionally be tested by state Occupational Safety and Health Administrations (state OSHAs) during a site visit, but anecdotal evidence suggests that there is no consistent testing of knowledge after WPS training. EPA's retraining requirements are at 5-year intervals, meaning the knowledge must be retained for that long. Vineyard workers completed a test of their baseline WPS knowledge, computer-based training on WPS, a post-test immediately after training and a re-test 5 months later. Pre-test performance suggested that there was a relatively high level of baseline knowledge of WPS information on two-answer multiple choice tests (74% to 75%) prior to training. Training increased the knowledge to 85% on the post-test with the same questions, a significant increase (p < .001, 1-tailed) and a large effect size (d) of .90. Re-test performance (78%) at 5 months revealed a return towards but not back to the pre-test levels. Better test performance was significantly correlated with higher education and to a lesser extent with younger ages. Whether this level of knowledge is sufficient to protect agricultural workers remains an open question, although an increase in the proportion of people in a work group who know the critical WPS information may be the most important impact of training.
WholePathwayScope: a comprehensive pathway-based analysis tool for high-throughput data
Yi, Ming; Horton, Jay D; Cohen, Jonathan C; Hobbs, Helen H; Stephens, Robert M
2006-01-01
Background Analysis of High Throughput (HTP) Data such as microarray and proteomics data has provided a powerful methodology to study patterns of gene regulation at genome scale. A major unresolved problem in the post-genomic era is to assemble the large amounts of data generated into a meaningful biological context. We have developed a comprehensive software tool, WholePathwayScope (WPS), for deriving biological insights from analysis of HTP data. Result WPS extracts gene lists with shared biological themes through color cue templates. WPS statistically evaluates global functional category enrichment of gene lists and pathway-level pattern enrichment of data. WPS incorporates well-known biological pathways from KEGG (Kyoto Encyclopedia of Genes and Genomes) and Biocarta, GO (Gene Ontology) terms as well as user-defined pathways or relevant gene clusters or groups, and explores gene-term relationships within the derived gene-term association networks (GTANs). WPS simultaneously compares multiple datasets within biological contexts either as pathways or as association networks. WPS also integrates Genetic Association Database and Partial MedGene Database for disease-association information. We have used this program to analyze and compare microarray and proteomics datasets derived from a variety of biological systems. Application examples demonstrated the capacity of WPS to significantly facilitate the analysis of HTP data for integrative discovery. Conclusion This tool represents a pathway-based platform for discovery integration to maximize analysis power. The tool is freely available at . PMID:16423281
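The functional-category enrichment that WPS evaluates is conventionally modelled with a hypergeometric (one-sided Fisher) test: how surprising is it that a gene list of a given size overlaps a category this often? The sketch below illustrates that standard model with toy numbers; WPS's exact statistical procedure may differ in detail.

```python
from math import comb

def hypergeom_enrichment_p(total_genes, category_genes, list_genes, overlap):
    """P(X >= overlap) under the hypergeometric null.

    This is the probability that a random gene list of size `list_genes`,
    drawn from `total_genes`, hits a category of `category_genes` at least
    `overlap` times -- the standard model behind category-enrichment tools.
    """
    p = 0.0
    upper = min(category_genes, list_genes)
    for k in range(overlap, upper + 1):
        p += (comb(category_genes, k)
              * comb(total_genes - category_genes, list_genes - k)
              / comb(total_genes, list_genes))
    return p

# Toy numbers: a genome of 10,000 genes, a 100-gene pathway, and a
# 50-gene hit list with 5 hits (expected overlap is only 0.5).
p = hypergeom_enrichment_p(10000, 100, 50, 5)
```

A small p here would flag the pathway as enriched in the hit list, which is the kind of signal WPS surfaces across KEGG, Biocarta and GO categories.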
Kimura, Yoshiyuki; Sumiyoshi, Maho; Kobayashi, Toshiya
2014-01-01
Whey proteins or peptides exhibit various actions, including an antioxidant action, an anticancer action, and a protective action against childhood asthma and atopic syndrome. The effects of orally administered whey peptides (WPs) on chronic ultraviolet B (UVB) radiation-induced cutaneous changes, including changes in cutaneous thickness, elasticity, wrinkle formation, etc., have not been examined. In this study, we examined the preventive effects of WPs on cutaneous aging induced by chronic UVB irradiation in melanin-possessing male hairless mice (HRM). UVB (36-180 mJ/cm²) was irradiated to the dorsal area for 17 wk in HRM, and measurements of cutaneous thickness and elasticity in UVB-irradiated mice were performed every week. WPs (200 and 400 mg/kg, twice daily) were administered orally for 17 wk. WPs inhibited the increase in cutaneous thickness, wrinkle formation, and melanin granules and the reduction in cutaneous elasticity associated with photoaging. Furthermore, it has been reported that UVB irradiation-induced skin aging is closely associated with the increase in expression of matrix metalloproteinase (MMP), vascular endothelial growth factor (VEGF), Ki-67-, and 8-hydroxy-2'-deoxyguanosine (8-OHdG)-positive cells. WPs also prevented increases in the expression of MMP-2 and pro-MMP-9, VEGF, and Ki-67- and 8-OHdG-positive cells induced by chronic UVB irradiation. It was found that WPs prevent type IV collagen degradation, angiogenesis, proliferation, and DNA damage caused by UVB irradiation. Overall, these results demonstrate the considerable benefit of WPs for protection against solar UV-irradiated skin aging as a supplemental nutrient.
The climate4impact portal: bridging the CMIP5 and CORDEX data infrastructure to impact users
NASA Astrophysics Data System (ADS)
Plieger, Maarten; Som de Cerff, Wim; Pagé, Christian; Tatarinova, Natalia; Cofiño, Antonio; Vega Saldarriaga, Manuel; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Sjökvist, Elin
2015-04-01
The aim of climate4impact is to enhance the use of climate research data and the interaction with climate effect/impact communities. The portal is based on 21 impact use cases from 5 different European countries, and is evaluated by a user panel consisting of use case owners. It has been developed within the European projects IS-ENES and IS-ENES2 for more than 5 years, and its development currently continues within IS-ENES2 and CLIPC. As the climate impact community is very broad, the focus is mainly on the scientific impact community. This work has resulted in the ENES portal interface for climate impact communities, which can be visited at www.climate4impact.eu. The climate4impact portal is connected to the Earth System Grid Federation (ESGF) nodes containing global climate model (GCM) data from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) and regional climate model (RCM) data from the Coordinated Regional Climate Downscaling Experiment (CORDEX). This global network of climate model data centers offers services for data description, discovery and download. The climate4impact portal connects to these services using OpenID, and offers a user interface for searching, visualizing and downloading global climate model data. A challenging task was to describe the available model data and how it can be used. The portal tries to inform users about possible caveats when using climate model data. All impact use cases are described in the documentation section, using highlighted keywords pointing to detailed information in the glossary. During the project, the content management system Drupal was used to enable partners to contribute to the documentation section. In this presentation the architecture and the following items will be detailed: - Visualization: Visualize data from ESGF data nodes using ADAGUC Web Map Services. 
- Processing: Transform data, subset, export into other formats, and perform climate indices calculations using Web Processing Services implemented with PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 icclim.
- Security: Login using OpenID for access to the ESGF data nodes. The ESGF works in conjunction with several external websites and systems. The climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service. Single sign-on (SSO) is used to make these websites and systems work together.
- Discovery: Faceted search based on e.g. variable name, model and institute, using the ESGF search services. A catalog browser allows browsing through CMIP5 and other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA).
- Download: Directly from ESGF nodes and other THREDDS catalogs. This architecture will also be used for the future Copernicus platform, developed in the EU FP7 CLIPC project.
- Connection with the downscaling portal of the University of Cantabria.
- Experiences with the question-and-answer site via Askbot.
Current work on climate4impact can be summarized in two main objectives. The first is a web interface that automatically generates a graphical user interface from WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/icclim on data stored in ESGF data nodes. Data is then transmitted from ESGF nodes over secured OpenDAP and becomes available in a new, per-user, secured OpenDAP server. The results can then be visualized again using ADAGUC WMS. Dedicated wizards for processing of climate indices will be developed in close collaboration with users. The second is to expose climate4impact services, so as to offer standardized services which can be used by other portals. 
This has the advantage of adding interoperability between several portals, as well as enabling the design of specific portals aimed at different impact communities, whether thematic or national.
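The record does not give the exact request syntax for the portal's WPS endpoints; for orientation, a WPS 1.0.0 Execute request is commonly issued as a key-value-pair GET URL. The endpoint, process identifier, and input names below are placeholders, not actual climate4impact values:

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint and process identifier (icclim's SU, "summer days",
# is used here only as an illustrative name).
base = "https://example.org/wps"
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "icclim.SU",
    # DataInputs uses key=value pairs separated by ';' in WPS 1.0.0 KVP encoding.
    "datainputs": "netcdf_url=https://esgf.example/tasmax.nc;period=1981/2010",
}
url = base + "?" + urlencode(params)
```

A client then polls the status URL returned by the server until the (possibly long-running) process finishes.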
Pouplin, Samuel; Roche, Nicolas; Vaugier, Isabelle; Jacob, Antoine; Figere, Marjorie; Pottier, Sandra; Antoine, Jean-Yves; Bensmail, Djamel
2016-02-01
Objective: To determine whether the number of words displayed in the word prediction software (WPS) list affects text input speed (TIS) in people with cervical spinal cord injury (SCI), and whether any influence is dependent on the level of the lesion. Design: A cross-sectional trial. Setting: A rehabilitation center. Participants: Persons with cervical SCI (N=45). Lesion level was high (C4 and C5, American Spinal Injury Association [ASIA] grade A or B) for 15 participants (high-lesion group) and low (between C6 and C8, ASIA grade A or B) for 30 participants (low-lesion group). TIS was evaluated during four 10-minute copying tasks: (1) without WPS (Without); (2) with a display of 3 predicted words (3Words); (3) with a display of 6 predicted words (6Words); and (4) with a display of 8 predicted words (8Words). During the 4 copying tasks, TIS was measured objectively (characters per minute, number of errors) and subjectively through subject report (fatigue, perception of speed, cognitive load, satisfaction). For participants with low-cervical SCI, TIS without WPS was faster than with WPS, regardless of the number of words displayed (P<.001). For participants with high-cervical SCI, the use of WPS did not influence TIS (P=.99). There was no influence of the number of words displayed in a word prediction list on TIS; however, perception of TIS differed according to lesion level. For persons with low-cervical SCI, a small number of words should be displayed, or WPS should not be used at all. For persons with high-cervical SCI, a larger number of words displayed increases the comfort of use of WPS. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Weglicki, Linda S; Templin, Thomas; Hammad, Adnan; Jamil, Hikmet; Abou-Mediene, Sharifa; Farroukh, Mona; Rice, Virginia Hill
2007-01-01
To determine tobacco use rates (cigarette, water pipe smoking [WPS], or narghile) in Arab American (ArA) compared to non-Arab youth. A convenience sample of 2,782 14- to 18-year-old high school students from a Midwest community completed a 21-item tobacco use history survey. Seventy-one percent of the participants were ArA. Grades 9 through 12 were equally represented. Results included 'ever tried cigarettes [narghile]' (20%, 39%); 'smoked cigarettes [narghile] in the past 30 days' (7%, 22%); and 'regular smoking [narghile]' (3%, 15%) for ArA and non-Arab youths, respectively. Each was significantly related to grade and ethnicity. WPS rates for ArA and non-Arab youths were 38% and 21%; 17% and 11%; and 7% and 5% for 'ever used,' 'used in the past 30 days,' and 'regular use,' respectively. Grade, ethnicity, and sex were significantly related to WPS. Cigarette smoking rates for non-Arab youth were lower than current national youth smoking rates but significantly higher than for ArA youth; rates for ArA youth were much lower than current nationally reported data. Rates of WPS for US youth, regardless of race or ethnicity, are not known. Findings from this study indicate that both ArA and non-Arab youth are experimenting with and using WPS regularly. These results underscore the importance of assessing novel forms of tobacco use, particularly WPS, a growing phenomenon among US youth.
Worker Protection Standard Relabeling Process for Retailers and Wholesalers
This is Attachment 1 for Pesticide Registration Notice 95-5, Labeling Revisions Required By The Worker Protection Standard (WPS) for Sale or Distribution of Certain Agricultural Pesticides after October 23, 1995.
Agricultural Worker Protection Standard (WPS)
EPA's Agricultural Worker Protection Standard (WPS) is aimed at reducing the risk of pesticide poisoning and injury among agricultural workers and pesticide handlers. It places specific requirements on employers of such workers.
The Climate Data Analytic Services (CDAS) Framework.
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2016-12-01
Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high-performance environment close to the massive data stores at NASA. The data is accessed in standard formats (NetCDF, HDF, etc.) in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. Client packages in Python, Scala, or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.
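The custom array framework CDAS uses for "huge datasets within limited memory spaces" is not described in detail here; the underlying idea of a streaming reduction over chunks can be sketched as follows (a pure-Python stand-in, not CDAS code):

```python
def chunked_mean(chunks):
    """Accumulate a running (count, sum) so that only one chunk of the
    dataset needs to be resident in memory at any time."""
    count, total = 0, 0.0
    for chunk in chunks:
        count += len(chunk)
        total += sum(chunk)
    return total / count

# A generator yields chunks lazily, standing in for slices of a huge NetCDF variable.
data = ([float(i) for i in range(start, start + 4)] for start in range(0, 12, 4))
mean = chunked_mean(data)
```

The same pattern generalizes to any associative reduction (min, max, variance via sufficient statistics), which is also what makes it parallelizable under Spark.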
Warm prestress effects in fracture-margin assessment of PWR-RPVs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shum, D.K.M.
This paper examines various issues that would impact the incorporation of warm prestress (WPS) effects in the fracture-margin assessment of reactor pressure vessels (RPVs). By way of an example problem, possible beneficial effects of including type-I WPS in the assessment of an RPV subjected to a small-break loss-of-coolant accident are described. In addition, the need to consider possible loss-of-constraint effects when interpreting available small-specimen WPS-enhanced fracture toughness data is demonstrated through two- and three-dimensional local crack-tip field analyses of a compact tension specimen. Finally, a hybrid correlative-predictive model of WPS based on J-Q theory and the Ritchie-Knott-Rice model is applied to a small-scale yielding boundary layer formulation to investigate near-crack-tip fields under varying degrees of loading and unloading.
Flexible server-side processing of climate archives
NASA Astrophysics Data System (ADS)
Juckes, Martin; Stephens, Ag; Damasio da Costa, Eduardo
2014-05-01
The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
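The abstract does not specify how CEDA WPS decides between near-real-time execution and queuing; a minimal sketch of such a load-balancing policy, with a hypothetical cost threshold and job names, might look like:

```python
def dispatch(jobs, realtime_limit_s=10.0):
    """Split submitted jobs into a near-real-time pool and a queue,
    based on an estimated cost per job (threshold is illustrative)."""
    realtime, queued = [], []
    for name, est_seconds in jobs:
        (realtime if est_seconds <= realtime_limit_s else queued).append(name)
    return realtime, queued

realtime, queued = dispatch([
    ("subset-one-field", 2.0),       # cheap: run immediately
    ("multi-model-regrid", 600.0),   # expensive: queue, report status on dashboard
])
```

Queued jobs would then feed the status dashboard the abstract describes, while the realtime pool returns results in the request/response cycle.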
Scaling and clustering effects of extreme precipitation distributions
NASA Astrophysics Data System (ADS)
Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Li, Jianfeng
2012-08-01
Summary: One of the impacts of climate change and human activities on the hydrological cycle is a change in the precipitation structure. Closely related to the precipitation structure are two characteristics: the volume (m) of wet periods (WPs) and the time interval between WPs, or waiting time (t). Using daily precipitation data for the period 1960-2005 from 590 rain gauge stations in China, these two characteristics are analyzed in terms of the scaling and clustering of precipitation episodes. Our findings indicate that m and t follow similar probability distribution curves, implying that precipitation processes are controlled by similar underlying thermodynamics. Analysis of conditional probability distributions shows a significant dependence of m and t on their previous values of similar volumes, and the dependence tends to be stronger when m is larger or t is longer. This indicates that a higher probability can be expected that high-intensity precipitation is followed by episodes of similar intensity, and that a long waiting time between WPs is followed by a waiting time of similar duration. These results indicate the clustering of extreme precipitation episodes: severe droughts or floods are apt to occur in groups.
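The two characteristics analyzed, wet-period volume (m) and waiting time (t), can be extracted from a daily series with a single scan; the wet-day threshold below is an assumption for illustration, not the study's value:

```python
def wet_periods(daily_precip, wet_threshold=0.1):
    """Return (volumes, waiting_times): the total depth of each wet period (m)
    and the dry-day gaps between consecutive wet periods (t)."""
    volumes, waits = [], []
    current, dry_run, seen_wet = 0.0, 0, False
    for p in daily_precip:
        if p >= wet_threshold:
            if current == 0.0 and seen_wet:
                waits.append(dry_run)   # gap that just ended
            current += p
            dry_run = 0
            seen_wet = True
        else:
            if current > 0.0:
                volumes.append(current) # wet period just ended
                current = 0.0
            dry_run += 1
    if current > 0.0:
        volumes.append(current)
    return volumes, waits

vols, waits = wet_periods([5, 0, 0, 3, 2, 0, 1])
```

The empirical distributions of `vols` and `waits` over a long record are what the scaling and conditional-probability analyses in the study operate on.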
Developing an Online Framework for Publication of Uncertainty Information in Hydrological Modeling
NASA Astrophysics Data System (ADS)
Etienne, E.; Piasecki, M.
2012-12-01
Inaccuracies in data collection and parameter estimation, and imperfections in model structure, imply uncertain predictions from hydrological models. Finding a way to communicate the uncertainty information in a model output is important for decision-making. This work aims to publish uncertainty information (computed by a project partner at Penn State) associated with hydrological predictions on catchments. To this end we have developed a DB schema (derived from the CUAHSI ODM design) focused on storing uncertainty information and its associated metadata. The technologies used to build the system are: OGC's Sensor Observation Service (SOS) for publication, the UncertML markup language (also developed by the OGC) to describe uncertainty information, and the Interoperability and Automated Mapping (INTAMAP) Web Processing Service (WPS), which handles part of the statistics computations. We have developed a service (based on Drupal) to provide users with the capability to exploit the full functionality of the system. Users are able to request and visualize uncertainty data, and also to publish their own data in the system.
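As an illustration of how uncertainty information might be encoded for publication, here is a sketch of a Gaussian prediction serialized as an UncertML-style XML element (the element and namespace names follow UncertML 2.0 conventions, but treat them as assumptions rather than a verbatim schema reference):

```python
import xml.etree.ElementTree as ET

# Illustrative namespace; a real deployment would validate against the schema.
UN = "http://www.uncertml.org/2.0"

g = ET.Element(f"{{{UN}}}NormalDistribution")
ET.SubElement(g, f"{{{UN}}}mean").text = "12.4"      # e.g. predicted discharge
ET.SubElement(g, f"{{{UN}}}variance").text = "3.6"   # its predictive variance
xml_bytes = ET.tostring(g)
```

A fragment like this would be embedded in an SOS observation result so that clients can render the full predictive distribution rather than a bare point value.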
Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M
2009-06-29
One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. 
This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
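The gene-term association network (GTAN) underlying both WPS and PPEP is, at its core, a bipartite graph between genes and pathway/term gene sets; a minimal construction (gene and pathway names are toy examples):

```python
def build_gtan(gene_lists, pathways):
    """Bipartite gene-term association network: an edge (gene, term) whenever a
    gene from any input list belongs to a pathway/term gene set."""
    genes = set().union(*gene_lists)
    return sorted((g, term)
                  for term, members in pathways.items()
                  for g in genes if g in members)

edges = build_gtan(
    [{"TP53", "BRCA1"}, {"TP53", "EGFR"}],          # e.g. two experiments' hit lists
    {"DNA repair": {"TP53", "BRCA1"},               # toy pathway gene sets
     "ErbB signaling": {"EGFR"}},
)
```

Comparing which edges are contributed by which input list is the pathway-level pattern comparison the pipeline describes, here reduced to its simplest form.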
Moldy Cheese: Is It Unsafe to Eat?
HELI-DEM portal for geo-processing services
NASA Astrophysics Data System (ADS)
Cannata, Massimiliano; Antonovic, Milan; Molinari, Monia
2014-05-01
HELI-DEM (Helvetia-Italy Digital Elevation Model) is a project developed in the framework of the Italy/Switzerland Operational Programme for Trans-frontier Cooperation 2007-2013, whose major aim is to create a unified digital terrain model that includes the alpine and sub-alpine areas between Italy and Switzerland. The partners of the project are: Lombardy Region, Piedmont Region, Polytechnic of Milan, Polytechnic of Turin and Fondazione Politecnico from Italy, and the Institute of Earth Sciences (SUPSI) from Switzerland. The digital terrain model has been produced by integrating and validating the different elevation data available for the areas of interest, characterized by different reference frames, resolutions and accuracies: the DHM at 25 m resolution from Swisstopo, the DTM at 20 m resolution from Lombardy Region, the DTM at 5 m resolution from Piedmont Region, and the LiDAR PST-A DTM at about 1 m resolution, which covers the main river bed areas and is produced by the Italian Ministry of the Environment. Further results of the project are: the generation of a unified Italian-Swiss geoid with an accuracy of a few centimeters (Gilardoni et al. 2012); the establishment of a GNSS permanent network, prototype of a transnational positioning service; and the development of a geo-portal, entirely based on open source technologies and open standards, which provides the cross-border DTM and offers some analysis and processing capabilities through the Internet. With this talk, the authors present the main steps of the project with a focus on the HELI-DEM geo-portal development carried out by the Institute of Earth Sciences, which is the access point to the DTM produced by the project. The portal, accessible at http://geoservice.ist.supsi.ch/helidem, is a demonstration of open source technologies combined to provide access to geospatial functionality for a wide, non-GIS-expert public. 
In fact, the system is entirely developed using only open standards and Free and Open Source Software (FOSS), both on the server side (services) and on the client side (interface). In addition to self-developed code, the system relies mainly on the software GRASS 7 [1], ZOO-Project [2], GeoServer [3] and OpenLayers [4], and on the standards WMS [5], WCS [6] and WPS [7]. At the time of writing, the portal offers features such as profiling, contour extraction, watershed delineation and analysis, derivatives calculation, data extraction, and coordinate conversion, but it is evolving, and it is planned to extend it with a series of environmental models that the IST developed in the past, such as dam-break simulation, landslide run-out estimation, and floods due to landslide impact in artificial basins. [1] Neteler M., Mitasova H., Open Source GIS: A GRASS GIS Approach. 3rd Ed., 406 pp., Springer, New York, 2008. [2] Fenoy G., Bozon N., Raghavan V., ZOO Project: The Open WPS Platform. Proceedings of the 1st International Workshop on Pervasive Web Mapping, Geoprocessing and Services (WebMGS), Como, http://www.isprs.org/proceedings/XXXVIII/4-W13/ID_32.pdf, 26-27 August 2010. [3] Giannecchini S., Aime A., GeoServer, il server open source per la gestione interoperabile dei dati geospaziali. Atti 15a Conferenza Nazionale ASITA, Reggia di Colorno, 15-18 November 2011. [4] Perez A.S., OpenLayers Cookbook. Packt Publishing, 2012. ISBN 1849517843. [5] OGC, OpenGIS Web Map Server Implementation Specification, http://www.opengeospatial.org/standards/wms, 2006. [6] OGC, OGC WCS 2.0 Interface Standard - Core, http://portal.opengeospatial.org/files/?artifact_id=41437, 2010. [7] OGC, OpenGIS Web Processing Service, http://portal.opengeospatial.org/files/?artifact_id=24151, 2007.
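Of the portal's analysis features, profiling is the simplest to sketch: sample the DTM along the segment between two grid points. The portal does this server-side via GRASS/ZOO WPS; the nearest-neighbour version below is only the core idea, on a toy grid:

```python
def profile(dem, start, end, samples):
    """Nearest-neighbour elevation profile between two (row, col) points
    on a regular-grid DEM stored as a list of rows."""
    (r0, c0), (r1, c1) = start, end
    out = []
    for i in range(samples):
        f = i / (samples - 1)                # fraction along the segment
        r = round(r0 + f * (r1 - r0))
        c = round(c0 + f * (c1 - c0))
        out.append(dem[r][c])
    return out

dem = [[0, 1, 2],
       [1, 2, 3],
       [2, 3, 4]]
elevations = profile(dem, (0, 0), (2, 2), 3)
```

A production service would use bilinear or bicubic interpolation and geographic coordinates, but the request/response shape of the WPS process is the same.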
NASA Astrophysics Data System (ADS)
Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.
2014-12-01
The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate, and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry, and private-sector individuals. The major objectives of this effort include: 1) processing and reformulation of current data to be consistent with ESRI and OpenGIS tools; 2) development of functions to improve capability and analysis that produce "on-the-fly" data products, extending these past a single location to regional and global scales; 3) updating the current web sites to enable both web-based and mobile application displays, optimized for mobile platforms; 4) interaction with user communities in government and industry to test formats and usage; and 5) development of a series of metrics that allow for monitoring of progressive performance. Significant project results will include the development of Open Geospatial Consortium (OGC)-compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users using GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and to provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency, cross-organization interoperability of SSE data products and services by collaborating with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed users before making them available to the wider public.
Mo, Yu; Zhao, Lei; Wang, Zhonghui; Chen, Chia-Lung; Tan, Giin-Yu Amy; Wang, Jing-Yuan
2014-04-01
Response surface methodology coupled with Box-Behnken design (RSM-BBD) was applied to enhance styrene recovery from waste polystyrene (WPS) through pyrolysis. The relationship between styrene yield and three selected operating parameters (i.e., temperature, heating rate, and carrier gas flow rate) was investigated. A second-order polynomial equation was successfully built to describe the process and predict styrene yield under the study conditions. The factors identified as statistically significant to styrene production were: temperature, with a quadratic effect; heating rate, with a linear effect; carrier gas flow rate, with a quadratic effect; the interaction between temperature and carrier gas flow rate; and the interaction between heating rate and carrier gas flow rate. The optimum conditions for the current system were determined to be a temperature range of 470-505°C, a heating rate of 40°C/min, and a carrier gas flow rate range of 115-140 mL/min. Under such conditions, 64.52% of WPS was recovered as styrene, which was 12% more than the highest reported yield for reactors of similar size. It is concluded that RSM-BBD is an effective approach for yield optimization of styrene recovery from WPS pyrolysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
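The fitted second-order polynomial itself is not reproduced in the abstract; the shape of such a response-surface model, and a grid search for its optimum over the studied ranges, can be sketched with illustrative (NOT the study's fitted) coefficients:

```python
def styrene_yield(temp_c, heat_rate, flow):
    """Hypothetical quadratic response surface mirroring the reported effect
    structure: quadratic in temperature and flow, linear in heating rate.
    Coefficients are invented for illustration only."""
    return (
        -0.0005 * (temp_c - 490) ** 2   # quadratic effect of temperature
        + 0.05 * heat_rate              # linear effect of heating rate
        - 0.0004 * (flow - 127) ** 2    # quadratic effect of carrier gas flow
        + 60.0
    )

# Coarse grid search for the optimum over plausible experimental ranges.
best = max(
    ((t, h, f)
     for t in range(420, 551, 5)
     for h in (10, 20, 30, 40)
     for f in range(50, 201, 5)),
    key=lambda x: styrene_yield(*x),
)
```

A real RSM workflow would fit the coefficients by least squares from the Box-Behnken runs and optimize analytically, but the grid search makes the "find the stationary point of the quadratic surface" step concrete.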
2014-01-01
Introduction: The novel arthritis-specific Work Productivity Survey (WPS) was developed to estimate patient productivity limitations associated with arthritis within and outside the home, which is an unmet need in psoriatic arthritis (PsA). The WPS has been validated in rheumatoid arthritis. This report assesses the discriminant validity, responsiveness and reliability of the WPS in adult-onset PsA. Methods: Psychometric properties were assessed using data from the RAPID-PsA trial (NCT01087788) investigating certolizumab pegol (CZP) efficacy and safety in PsA. WPS was completed at baseline and every 4 weeks until Week 24. Validity was evaluated at baseline via known-groups defined using first and third quartiles of patients’ Disease Activity Score 28 based on C-reactive protein (DAS28(CRP)), Health Assessment Questionnaire-Disability Index (HAQ-DI), Short Form-36 (SF-36) items and PsA Quality of Life (PsAQoL) scores. Responsiveness and reliability were assessed by comparing WPS mean changes at Week 12 in American College of Rheumatology 20% improvement criteria (ACR20) or HAQ-DI Minimal Clinically Important Difference (MCID) 0.3 responders versus non-responders, as well as using standardized response means (SRM). All comparisons were conducted on the observed cases in the Randomized Set, regardless of the randomization group, using a non-parametric bootstrap-t method. Results: Compared with patients with a better health state, patients with a worse health state had on average 2 to 6 times more household work days lost, more days with reduced household productivity, more days missed of family/social/leisure activities, more days with outside help hired and a significantly higher interference of arthritis per month. 
Among employed patients, those with a worse health state had 2 to 4 times more workplace days lost, more days with patient workplace productivity reduced, and a significantly higher interference of arthritis on patient workplace productivity versus patients with a better health state. WPS was also responsive to clinical changes, with responders having significantly larger improvements at Week 12 in WPS scores versus non-responders. The effect sizes for changes in productivity in ACR20 or HAQ-DI MCID responders were moderate (0.5 < SRM < 0.8) or small. Conclusions: These analyses demonstrate the validity, responsiveness and reliability of the WPS as an instrument for the measurement of patient productivity within and outside the home in an adult-onset PsA population. PMID:24996416
Using Virtualization to Integrate Weather, Climate, and Coastal Science Education
NASA Astrophysics Data System (ADS)
Davis, J. R.; Paramygin, V. A.; Figueiredo, R.; Sheng, Y.
2012-12-01
To better understand and communicate the important roles of weather and climate on the coastal environment, a unique publicly available tool is being developed to support research, education, and outreach activities. This tool uses virtualization technologies to facilitate an interactive, hands-on environment in which students, researchers, and the general public can perform their own numerical modeling experiments. While prior efforts have focused solely on the study of the coastal and estuary environments, this effort incorporates the community-supported weather and climate model (WRF-ARW) into the Coastal Science Educational Virtual Appliance (CSEVA), an education tool used to assist in the learning of coastal transport processes; storm surge and inundation; and evacuation modeling. The Weather Research and Forecasting (WRF) Model is a next-generation, community developed and supported, mesoscale numerical weather prediction system designed to be used internationally for research, operations, and teaching. It includes two dynamical solvers (ARW - Advanced Research WRF and NMM - Nonhydrostatic Mesoscale Model) as well as a data assimilation system. WRF-ARW is the ARW dynamics solver combined with other components of the WRF system, developed primarily at NCAR, with community support provided by the Mesoscale and Microscale Meteorology (MMM) division of the National Center for Atmospheric Research (NCAR). Included with WRF is the WRF Preprocessing System (WPS), a set of programs to prepare input for real-data simulations. The CSEVA is based on the Grid Appliance (GA) framework and is built using virtual machine (VM) and virtual networking technologies. Virtualization supports integration of an operating system, libraries (e.g. Fortran, C, Perl, NetCDF, etc. necessary to build WRF), web server, numerical models/grids/inputs, pre-/post-processing tools (e.g. 
WPS / RIP4 or UPS); graphical user interfaces; "Cloud"-computing infrastructure; and other tools into a single ready-to-use package. Thus, the previously onerous task of setting up and compiling these tools is eliminated, and the researcher, educator, or student can focus on using the tools to study the interactions between weather, climate, and the coastal environment. The incorporation of WRF into the CSEVA has been designed to be synergistic with the extensive online tutorials and biannual in-person tutorials hosted by NCAR. Included are working examples of the idealized test simulations provided with WRF (2D sea breeze and squalls, a large eddy simulation, a Held and Suarez simulation, etc.). To demonstrate the integration of weather, climate, and coastal science education, example applications are being developed to demonstrate how the system can be used to couple a coastal and estuarine circulation, transport, and storm surge model with downscaled reanalysis weather and future climate predictions. Documentation, tutorials, and the enhanced CSEVA itself can be found on the web at: http://cseva.coastal.ufl.edu.
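For readers unfamiliar with the WPS component bundled in the appliance, a minimal namelist.wps for a single ARW domain is sketched below; all dates, grid dimensions, projection parameters, and paths are placeholder values, not settings taken from the CSEVA:

```fortran
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2012-06-01_00:00:00',
 end_date   = '2012-06-02_00:00:00',
 interval_seconds = 21600,
/

&geogrid
 e_we = 100,
 e_sn = 100,
 dx = 10000,
 dy = 10000,
 map_proj  = 'lambert',
 ref_lat   =  29.0,
 ref_lon   = -82.5,
 truelat1  =  30.0,
 truelat2  =  60.0,
 stand_lon = -82.5,
 geog_data_path = '/path/to/geog',
/

&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/

&metgrid
 fg_name = 'FILE',
 io_form_metgrid = 2,
/
```

The geogrid, ungrib, and metgrid programs are then run in sequence against this namelist to produce the met_em files consumed by the real-data initialization step.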
Tethys: A Platform for Water Resources Modeling and Decision Support Apps
NASA Astrophysics Data System (ADS)
Nelson, J.; Swain, N. R.
2015-12-01
The interactive nature of web applications or "web apps" makes them an excellent medium for conveying complex scientific concepts to lay audiences and creating decision support tools that harness cutting-edge modeling techniques. However, the technical expertise required to develop web apps represents a barrier for would-be developers. This barrier can be characterized by the following hurdles that developers must overcome: (1) identify, select, and install software that provides the spatial and computational capabilities commonly required for water resources modeling; (2) orchestrate the use of multiple free and open source software (FOSS) projects and navigate their differing application programming interfaces; (3) learn the multi-language programming skills required for modern web development; and (4) develop a web-secure and fully featured web portal to host the app. Tethys Platform has been developed to lower this technical barrier and minimize the initial development investment that prohibits many scientists and engineers from making use of the web app medium. It includes (1) a suite of FOSS that addresses the unique data and computational needs common to water resources web app development, (2) a Python software development kit that streamlines development, and (3) a customizable web portal that is used to deploy the completed web apps. Tethys synthesizes several software projects including PostGIS, 52°North WPS, GeoServer, Google Maps™, OpenLayers, and Highcharts. It has been used to develop a broad array of web apps for water resources modeling and decision support for several projects including CI-WATER, HydroShare, and the National Flood Interoperability Experiment. The presentation will include live demos of some of the apps that have been developed using Tethys to demonstrate its capabilities.
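As an illustration of the kind of request a deployed app might send to a bundled WPS such as 52°North, the sketch below assembles an OGC WPS 1.0.0 Execute request in key-value-pair (GET) form using only the Python standard library. The endpoint URL, process identifier, and input are hypothetical:

```python
from urllib.parse import urlencode

def wps_execute_url(base_url, identifier, inputs):
    """Build a WPS 1.0.0 Execute request in key-value-pair (GET) form.

    `inputs` maps process input identifiers to literal values; WPS KVP
    encodes them as a semicolon-separated DataInputs parameter.
    """
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "DataInputs": data_inputs,
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint and process name:
url = wps_execute_url("http://example.org/wps", "gs:Buffer", {"distance": "10"})
print(url)
```

In practice a client library such as OWSLib wraps this exchange, but the KVP form makes the protocol's shape visible.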
Using Third Party Data to Update a Reference Dataset in a Quality Evaluation Service
NASA Astrophysics Data System (ADS)
Xavier, E. M. A.; Ariza-López, F. J.; Ureña-Cámara, M. A.
2016-06-01
Nowadays it is easy to find many data sources for various regions around the globe. In this 'data overload' scenario there is little, if any, information available about the quality of these data sources. In order to provide this data quality information easily, we previously presented the architecture of a web service, running over a Web Processing Service (WPS), for the automated quality control of spatial datasets. For quality procedures that require an external reference dataset, such as positional accuracy or completeness assessments, the architecture permits the use of a reference dataset. However, this reference dataset is not ageless, since it suffers from the natural temporal degradation inherent to geospatial features. In order to mitigate this problem we propose the Time Degradation & Updating Module, which applies assessed data as a tool to keep the reference database updated. The main idea is to use datasets sent to the quality evaluation service as a source of 'candidate data elements' for updating the reference database. After the evaluation, if some elements of a candidate dataset reach a determined quality level, they can be used as input data to improve the current reference database. In this work we present the first design of the Time Degradation & Updating Module. We believe that the outcomes can be applied in the pursuit of a fully automatic on-line quality evaluation platform.
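The acceptance step of such a module can be sketched as follows. This is a hypothetical miniature, not the paper's implementation: candidate features whose positional error against the current reference falls below an assumed quality threshold are selected for promotion into the reference database.

```python
import math

ACCEPT_RMSE_M = 2.5  # assumed acceptance threshold, in metres (illustrative)

def rmse(candidate_pts, reference_pts):
    """Root-mean-square positional error between matched point pairs."""
    sq = [(cx - rx) ** 2 + (cy - ry) ** 2
          for (cx, cy), (rx, ry) in zip(candidate_pts, reference_pts)]
    return math.sqrt(sum(sq) / len(sq))

def select_updates(candidates, reference):
    """Return the candidate elements that reach the quality level."""
    return [c for c in candidates
            if rmse(c["points"], reference[c["id"]]) <= ACCEPT_RMSE_M]

reference = {"road_1": [(0.0, 0.0), (10.0, 0.0)]}
candidates = [{"id": "road_1", "points": [(0.5, 0.2), (10.3, -0.1)]}]
print(len(select_updates(candidates, reference)))  # 1: the candidate is accepted
```

A real implementation would also need feature matching and a richer quality model (completeness, attribute accuracy), but the gating logic has this shape.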
Keys to career satisfaction: insights from a survey of women pediatric surgeons.
Caniano, Donna A; Sonnino, Roberta E; Paolo, Anthony M
2004-06-01
Declining interest in the field of surgery is attributed to lifestyle issues, more women per class, high debt, and long residency. To maintain surgery as a premier career choice, female students must find surgery to be professionally and personally rewarding. A 35-item questionnaire was mailed to 95 women pediatric surgeons (WPS), assessing multiple professional and personal factors. Responses were entered into a confidential database and analyzed by chi-square or t tests. Seventy-nine percent of surveys were returned; practice was identified as academic (60%) and private (40%). Respondents were grouped by age: A, less than 44 years (41%); B, 45 to 54 years (37%); and C, greater than 55 years (22%). For academic WPS, 81% are on timeline for promotion. Insufficient protected time was a significant obstacle to a successful academic career in groups A and B (P = .001). Clinical load, on-call responsibilities, lack of mentorship, and departmental support were major obstacles in all groups (P = .05). Seventy-three percent of WPS in private practice were satisfied with their role in practice management; poor practice conditions were cited as the most frequent reason for job relocation. Sixty-one percent of WPS are married, and 46% are raising children. WPS had significantly more responsibility for child care and household tasks in comparison with their partners. Eighty-three percent report career satisfaction but desire more time with family and for personal interests. Part-time and flexible work schedules were identified as attractive ways to achieve career-family balance. Eighty-four percent believe that quality-of-life issues are the dominant reason that fewer medical students choose surgical fields. WPS express career satisfaction but share the concerns of their female colleagues in other surgical disciplines. 
Quality of life is viewed as central to career choice for the current generation of medical students; female role models are key to recruiting women into pediatric surgery.
Methane in the South China Sea and the Western Philippine Sea
NASA Astrophysics Data System (ADS)
Tseng, Hsiao-Chun; Chen, Chen-Tung Arthur; Borges, Alberto V.; DelValls, T. Angel; Chang, Yu-Chang
2017-03-01
Approximately 700 water samples from the South China Sea (SCS) and 300 water samples from the western Philippine Sea (wPS) were collected during eight cruises from August 2003 to July 2007 to determine methane (CH4) distributions from the surface to a depth of 4250 m. The surface CH4 concentrations exceeded atmospheric equilibrium, both in the SCS and the wPS, and the concentrations were 4.5±3.6 and 3.0±1.2 nmol L-1, respectively. The sea-to-air fluxes were calculated, and the SCS and the wPS were found to emit CH4 to the atmosphere at 8.6±6.4 μmol m-2 d-1 and 4.9±4.9 μmol m-2 d-1, respectively. In the SCS, CH4 emissions were higher over the continental shelf (11.0±7.4 μmol m-2 d-1) than over the deep ocean (6.1±6.0 μmol m-2 d-1), owing to greater biological productivity and closer coupling with the sediments on the continental shelf. The SCS emitted 30.1×106 mol d-1 CH4 to the atmosphere and exported 1.82×106 mol d-1 CH4 to the wPS. The concentrations of both CH4 and chlorophyll a were high in the 150 m surface layer of the wPS, but were not significantly correlated with each other. CH4 concentrations generally declined with increasing depth below the euphotic zone but remained constant below 1000 m, both in the SCS and the wPS. Some high CH4 concentrations were observed at mid-depths and in bottom waters in the SCS, most likely caused by the release of CH4 from gas hydrates or gas seepage.
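The sea-to-air flux figures above follow from the standard diffusive flux relation F = k(Cw − Ceq). A minimal sketch with illustrative numbers (the transfer velocity k is hypothetical here; in practice it is parameterized from wind speed):

```python
def sea_to_air_flux(c_water_nmol_L, c_equil_nmol_L, k_m_per_day):
    """Diffusive sea-to-air gas flux, F = k * (Cw - Ceq).

    Concentrations are in nmol L-1, which equals umol m-3, so with the
    transfer velocity k in m d-1 the flux comes out in umol m-2 d-1.
    """
    return k_m_per_day * (c_water_nmol_L - c_equil_nmol_L)

# E.g. a surface concentration of 4.5 nmol/L over an equilibrium value of
# 2.0 nmol/L, with an assumed k of 3 m/d:
print(sea_to_air_flux(4.5, 2.0, 3.0))  # 7.5 umol m-2 d-1
```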
The Earth Data Analytic Services (EDAS) Framework
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2017-12-01
Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
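The streaming, limited-memory idea can be caricatured in a few lines of Python: an aggregate is accumulated chunk by chunk so that the full array is never resident in memory. The chunking and the aggregate here are illustrative, not the EDAS API:

```python
def streaming_mean(chunks):
    """Mean of a dataset processed one chunk at a time.

    A toy analogue of a streaming workflow: only running totals are held
    in memory, never the concatenated array, so the dataset may be far
    larger than available RAM.
    """
    total, count = 0.0, 0
    for chunk in chunks:      # each chunk might be one slice of a NetCDF file
        total += sum(chunk)
        count += len(chunk)
    return total / count

chunks = ([1.0, 2.0], [3.0], [4.0, 5.0, 6.0])
print(streaming_mean(chunks))  # 3.5
```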
USDA-ARS?s Scientific Manuscript database
A model to simulate radiative transfer (RT) of sun-induced chlorophyll fluorescence (SIF) of three-dimensional (3-D) canopy, FluorWPS, was proposed and evaluated. The inclusion of fluorescence excitation was implemented with the ‘weight reduction’ and ‘photon spread’ concepts based on Monte Carlo ra...
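The 'weight reduction' idea mentioned above can be sketched generically: rather than terminating a photon at each interaction, its statistical weight is attenuated by the single-scattering albedo while the absorbed fraction is tallied. The albedo and cutoff below are illustrative values, not FluorWPS parameters:

```python
def trace_photon(albedo=0.6, cutoff=1e-3):
    """'Weight reduction' in miniature: instead of absorbing-or-scattering
    a whole photon at each interaction, its statistical weight is
    multiplied by the single-scattering albedo and the absorbed fraction
    is tallied separately.
    """
    weight, absorbed = 1.0, 0.0
    while weight > cutoff:
        absorbed += weight * (1.0 - albedo)  # fraction deposited here
        weight *= albedo                     # photon continues, attenuated
    return absorbed, weight

absorbed, residual = trace_photon()
# Energy bookkeeping: deposited + residual weight stays at the initial 1.0.
print(round(absorbed + residual, 9))
```

The payoff of the technique is variance reduction: every launched photon contributes to the tally instead of being lost to absorption.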
ERIC Educational Resources Information Center
Jossberger, Helen; Brand-Gruwel, Saskia; Boshuizen, Henny; van de Wiel, Margje
2010-01-01
Workplace simulations (WPS), authentic learning environments at school, are increasingly used in vocational education. This article provides a theoretical analysis and synthesis of requirements considering learner skills, characteristics of the learning environment and the role of the teacher that influence good functioning in WPS and foster…
NASA Astrophysics Data System (ADS)
Errami, Youssef; Obbadi, Abdellatif; Sahnoun, Smail; Ouassaid, Mohammed; Maaroufi, Mohamed
2018-05-01
This paper proposes a Direct Torque Control (DTC) method for a Wind Power System (WPS) based on a Permanent Magnet Synchronous Generator (PMSG), using a Backstepping approach. In this work, the generator-side and grid-side converters, with a filter, are used as the interface between the wind turbine and the grid. The Backstepping approach demonstrates great performance in the control of complicated nonlinear systems such as the WPS. The control method therefore combines DTC, to achieve Maximum Power Point Tracking (MPPT), with the Backstepping approach, to sustain the DC-bus voltage and regulate the grid-side power factor. In addition, the control strategy is developed in the sense of the Lyapunov stability theorem for the WPS. Simulation results using MATLAB/Simulink validate the effectiveness of the proposed controllers.
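Below rated wind speed, MPPT for such a turbine is commonly realized by tracking the optimum power curve P_ref = k_opt·ω³, which holds the rotor at its optimal tip-speed ratio. A minimal sketch (k_opt is purely illustrative; it depends on blade radius, air density, and the maximum power coefficient):

```python
def mppt_power_reference(omega_rad_s, k_opt):
    """Optimum-power-curve MPPT reference: P_ref = k_opt * omega^3."""
    return k_opt * omega_rad_s ** 3

# Doubling rotor speed raises the tracked power reference eightfold:
print(mppt_power_reference(2.0, 0.5) / mppt_power_reference(1.0, 0.5))  # 8.0
```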
The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Systems
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.
2017-12-01
The OGC Innovation Program provides a collaborative, agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million-dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in WPS; standard descriptions for containerized applications to discover processes on the cloud, including the use of linked data, a WPS extension for hybrid clouds, and linking to hybrid big data stores; OpenID and OAuth to secure OGC services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including compression and attribute options reusing patterns from WMS, WMTS, and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML, and Common DataBase (CDB); and asynchronous services with advanced push-notification strategies, using a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g., RESTful APIs). The Call for Participation will be issued in December, and responses are due in mid-January 2018.
Smaldone, Giorgio; Marrone, Raffaele; Palma, Giuseppe; Sarnelli, Paolo; Anastasio, Aniello
2017-10-20
The European Food Safety Authority has stated that many traditional marinating and cold-smoking methods are not sufficient to kill A. simplex, and has asked for alternative treatments for killing viable parasites in fishery products to be evaluated. Baccalà is a well-liked traditional product. The aim of the study was to evaluate the effectiveness of the salting process in inactivating nematodes of the genus Anisakis in naturally infected Baccalà fillets. Nineteen fillets, subjected to a dual salting process (brine and dry salting), were analyzed. Visual inspection and chloropeptic digestion were performed. Larval viability was evaluated, and parameters such as NaCl (%), moisture (%), WPS and aw were determined. In 17 samples, 123 parasites were found, with a mean intensity of 7.23±4.78 and a mean abundance of 6.47±5.05. Visual examination revealed 109 parasites; 61.8% of larvae were found in the ventral portions. The results show that a salting process with a salt concentration of 18.6%, aw values of 0.7514 and 24.15% WPS in all parts of the baccalà fillets devitalises Anisakidae larvae within a 15-day period.
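For context, water-phase salt is conventionally computed as WPS (%) = NaCl / (NaCl + moisture) × 100. A minimal sketch; the moisture value below is an assumed figure chosen to land near the reported WPS, not a number taken from the paper:

```python
def water_phase_salt(nacl_pct, moisture_pct):
    """Water-phase salt: WPS (%) = NaCl / (NaCl + moisture) * 100,
    i.e. the salt concentration in the aqueous phase of the product."""
    return 100.0 * nacl_pct / (nacl_pct + moisture_pct)

# With the reported 18.6 % NaCl and an assumed moisture of 58.4 %,
# WPS comes out near the reported 24.15 %:
print(round(water_phase_salt(18.6, 58.4), 2))  # 24.16
```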
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues that require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in mega-cities is to use sensor web technology, with real-time, up-to-date air quality information gathered by spatially distributed sensors, to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is examined for emergency situations and, if necessary, air quality reports are sent to the authorities. 
This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure, presenting an interoperable air quality monitoring system that supports disaster management systems with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system provides capabilities to retrieve SOS observations using WPS in a cascaded service-chaining pattern for monitoring trends in timely sensor observations.
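The AQI step can be sketched with the standard piecewise-linear index formula, I = (I_hi − I_lo)/(C_hi − C_lo)·(C − C_lo) + I_lo. The breakpoints below are US EPA-style 8-hour CO values in ppm; the deployed system may use different national breakpoints:

```python
# (C_lo, C_hi, I_lo, I_hi) breakpoints for 8-hour CO in ppm (US EPA style)
CO_BREAKPOINTS = [
    (0.0,   4.4,   0,  50),
    (4.5,   9.4,  51, 100),
    (9.5,  12.4, 101, 150),
    (12.5, 15.4, 151, 200),
    (15.5, 30.4, 201, 300),
]

def co_aqi(conc_ppm):
    """Piecewise-linear AQI for an 8-hour CO concentration."""
    for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
        if c_lo <= conc_ppm <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (conc_ppm - c_lo) + i_lo)
    raise ValueError("concentration outside the breakpoint table")

print(co_aqi(9.4))  # top of the 'moderate' band -> 100
```

An alerting component would then compare the index against a warning threshold (say, AQI > 100) before e-mailing registered users.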
ERIC Educational Resources Information Center
Leh, Jayne
2011-01-01
Substantial evidence indicates that teacher-delivered schema-based instruction (SBI) facilitates significant increases in mathematics word problem solving (WPS) skills for diverse students; however research is unclear whether technology affordances facilitate superior gains in computer-mediated (CM) instruction in mathematics WPS when compared to…
Fincham, Dylan; Kagee, Ashraf; Swartz, Leslie
2010-04-01
A psychometric scale assessing inhibitors and facilitators of willingness to participate (WTP) in an HIV vaccine trial has not yet been developed. This study aimed to construct and derive the exploratory factor structure of such a scale. The 35-item Inhibitors and Facilitators of Willingness to Participate Scale (WPS) was developed and administered to a convenience sample of 264 Black females between the ages of 16 and 49 years living in an urban-informal settlement near Cape Town. The subscales of the WPS demonstrated good internal consistency with Cronbach's alpha coefficients ranging between 0.69 and 0.82. A principal components exploratory factor analysis revealed the presence of five latent factors. The factors, which accounted for 45.93% of the variance in WTP, were (1) personal costs, (2) safety and convenience, (3) stigmatisation, (4) personal gains and (5) social approval and trust. Against the backdrop of the study limitations, these results provide initial support for the reliability and construct validity of the WPS among the most eligible trial participants in the Western Cape of South Africa.
Improved Functional Characteristics of Whey Protein Hydrolysates in Food Industry
Jeewanthi, Renda Kankanamge Chaturika; Lee, Na-Kyoung; Paik, Hyun-Dong
2015-01-01
This review focuses on the enhanced functional characteristics of enzymatic hydrolysates of whey proteins (WPHs) in food applications compared to intact whey proteins (WPs). WPs are applied in foods as whey protein concentrates (WPCs), whey protein isolates (WPIs), and WPHs. WPs are byproducts of cheese production, used in a wide range of food applications due to their nutritional validity, functional activities, and cost effectiveness. Enzymatic hydrolysis yields improved functional and nutritional benefits in contrast to heat denaturation or native applications. WPHs improve solubility over a wide range of pH, create viscosity through water binding, and promote cohesion, adhesion, and elasticity. WPHs form stronger but more flexible edible films than WPC or WPI. WPHs enhance emulsification, bind fat, and facilitate whipping, compared to intact WPs. Extensively hydrolyzed WPHs with proper heat applications are the best emulsifiers, and the addition of polysaccharides improves the emulsification ability of WPHs. Also, WPHs improve sensorial properties like color, flavor, and texture, but impart a bitter taste in cases of extensive hydrolysis (degree of hydrolysis greater than 8%). It is important to consider the type of enzyme, hydrolysis conditions, and WPH production method based on the nature of the food application. PMID:26761849
Associations of Adolescents' Cigarette, Waterpipe, and Dual Tobacco Use With Parental Tobacco Use.
Veeranki, Sreenivas P; Alzyoud, Sukaina; Dierking, Leah; Kheriallah, Khalid; Mzayek, Fawaz; Pbert, Lori; Ward, Kenneth D
2016-05-01
Previous studies have demonstrated the influence of parental (both mother and father) cigarette smoking on adolescents' cigarette smoking. Little is known, however, about how parental tobacco use is related to waterpipe and dual waterpipe/cigarette use, which is increasing dramatically in the Arab countries. Study data (n = 34 788, N = 6 109 572) were obtained from nationally representative Global Youth Tobacco Surveys in 17 Arab countries. The study outcome was adolescents' tobacco use, categorized into none, cigarette smoking only, waterpipe smoking (WPS) only, and dual use. The primary exposure was parental tobacco use, categorized into 10 groups: maternal (mother) cigarette smoking only, maternal WPS only, maternal dual use, paternal (father) cigarette smoking only, paternal WPS only, paternal dual use, parental (both mother and father) cigarette smoking only, parental WPS only, parental dual use, and none. Weighted multinomial regression models were conducted to assess the relationships. Adolescents reported WPS only (5.7%), cigarette smoking only (2.9%), and dual use (3.5%). Compared to adolescents with no exposure to parental tobacco use, adolescent exposure to parental dual use was associated with a significant increase in WPS only (OR = 6.08, 95% CI = 2.38-15.51) and dual use (OR = 3.86, 95% CI = 1.43-10.43). Effect modification of the relationship by adolescents' sex was observed. This is the first study to examine adolescents' cigarette, waterpipe, and dual use in relation to parental tobacco use. The findings may support the development of cessation interventions targeting parental tobacco use to prevent the rising waterpipe and dual-use strain of the global tobacco epidemic. (1) The influence of parents' cigarette smoking on adolescents' smoking has been demonstrated in earlier studies; however, little is known about how the tobacco use behaviors of mothers and fathers influence an adolescent's cigarette, waterpipe, and dual cigarette/waterpipe use. 
(2) Associations of parental (both mother and father) tobacco use with adolescents' tobacco use differed significantly depending on whether the adolescent was a waterpipe smoker or dual user rather than a cigarette smoker. (3) Adolescents exposed to their mothers' WPS or dual use were more likely to be waterpipe smokers or dual users. A high likelihood of adolescents' cigarette, waterpipe, and dual use is found in homes where parental tobacco use is rampant, with both parents smoking cigarettes, waterpipe, or both. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
The GEOSS Clearinghouse based on the GeoNetwork opensource
NASA Astrophysics Data System (ADS)
Liu, K.; Yang, C.; Wu, H.; Huang, Q.
2010-12-01
The Global Earth Observation System of Systems (GEOSS) was established to support the study of the Earth system by the global community. It provides services for social management, quick response, academic research, and education. The purpose of GEOSS is to achieve comprehensive, coordinated, and sustained observations of the Earth system, improve monitoring of the state of the Earth, increase understanding of Earth processes, and enhance prediction of the behavior of the Earth system. In 2009, GEO called for a competition for an official GEOSS Clearinghouse to be selected as a means of consolidating catalogs of Earth observations. The Joint Center for Intelligent Spatial Computing at George Mason University worked with USGS to submit a solution based on the open-source platform GeoNetwork. In the spring of 2010, this solution was selected as the product for the GEOSS Clearinghouse. The GEOSS Clearinghouse is a common search facility for the intergovernmental Group on Earth Observations (GEO). By providing a list of harvesting functions in its business logic, the GEOSS Clearinghouse can collect metadata from distributed catalogs including other GeoNetwork native nodes; WebDAV/sitemap/WAF; Catalog Services for the Web (CSW) 2.0; the GEOSS Component and Service Registry (http://geossregistries.info/); OGC Web Services (WCS, WFS, WMS, and WPS); OAI Protocol for Metadata Harvesting 2.0; ArcSDE Server; and local file systems. Metadata in the GEOSS Clearinghouse are managed in a database (MySQL, PostgreSQL, Oracle, or McKoi), and an index of the metadata is maintained through the Lucene engine. Thus, EO data, services, and related resources can be discovered and accessed. The Clearinghouse supports a variety of geospatial standards, including CSW and SRU for search, FGDC and ISO metadata, and WMS-related OGC standards for data access and visualization, as linked from the metadata.
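To make the harvesting side concrete, the sketch below builds a CSW 2.0.2 GetRecords request in key-value-pair form, the kind of paged query a harvester issues against a remote catalog. The endpoint is hypothetical and only standard-library Python is used:

```python
from urllib.parse import urlencode

def csw_getrecords_url(base_url, start=1, max_records=10):
    """Build a CSW 2.0.2 GetRecords request in key-value-pair form, as a
    harvester might use to page through a remote catalog."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "elementSetName": "brief",
        "resultType": "results",
        "startPosition": start,
        "maxRecords": max_records,
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint; fetch the second page of ten records:
print(csw_getrecords_url("http://example.org/csw", start=11))
```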
NASA Astrophysics Data System (ADS)
Chen, C. T. A.
2015-12-01
It has been known that Kuroshio subsurface waters are the major source of nutrients to the East China Sea continental shelf, a major fishing ground. It has also been known that subsurface waters that upwell onto the shelf are heavily affected by the South China Sea (SCS) Tropical Water and the SCS Intermediate Water, which contain more nutrients than the tropical (Smax) and intermediate (Smin) waters from the West Philippine Sea (WPS). A front has been found to separate the tropical and intermediate waters from the SCS and WPS. The reported front in the Okinawa Trough, however, was identified based only on one-time data from a single cross-section in the central Okinawa Trough. Here historical hydrographic data between Mar. 1950 and Dec. 2011 in the Okinawa Trough and its neighborhood are analyzed. A vertical front tilted toward the west is found in all seasons in all years across the World Ocean Circulation Repeated Lines PR 18 and 19, as well as at the PN cross-section in the central Okinawa Trough. The front at the Smax level (σθ = 24.6-24.9) shows large seasonal and interannual variations. In winter during normal and La Niña periods the presence of the SCS Tropical Water is the most prominent. It is the weakest in autumn during normal periods and in spring during La Niña periods. Yet during El Niño periods the SCS Tropical Water is the most prominent in spring and it becomes the weakest in winter. As for intermediate waters (Smin at σθ = 26.7-26.9), the WPS Intermediate Water and SCS Intermediate Water show much weaker seasonality compared with the tropical waters, although during normal periods in winter the WPS Intermediate Water contribution is slightly larger than at other times. During El Niño periods the WPS Intermediate Water contribution is the smallest, but in spring it is much strengthened. On the other hand, the WPS Intermediate Water contribution is the smallest in spring, and the largest in winter, during La Niña periods.
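The relative contributions of SCS- versus WPS-origin water implied by such salinity differences are commonly estimated with a two-endmember mixing calculation. A minimal sketch; the endmember salinities below are illustrative values, not numbers from the study:

```python
def scs_fraction(s_obs, s_scs, s_wps):
    """Two-endmember mixing: the fraction of SCS-origin water implied by
    an observed salinity lying between the two endmember salinities."""
    return (s_obs - s_wps) / (s_scs - s_wps)

# Illustrative endmembers only: a saltier WPS Tropical Water (34.9) versus
# a fresher SCS Tropical Water (34.5); an observed salinity of 34.7 at the
# Smax level then implies an even mixture.
print(round(scs_fraction(34.7, s_scs=34.5, s_wps=34.9), 3))  # 0.5
```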
Seethaler, Pamela M.; Fuchs, Lynn S.; Fuchs, Douglas; Compton, Donald L.
2015-01-01
The purpose of this study was to assess the added value of dynamic assessment (DA) beyond more conventional static measures for predicting individual differences in year-end 1st-grade calculation (CA) and word-problem (WP) performance, as a function of limited English proficiency (LEP) status. At the start of 1st grade, students (129 LEP; 163 non-LEP) were assessed on a brief static mathematics test, an extended static mathematics test, static tests of domain-general abilities associated with CAs and WPs (vocabulary; reasoning), and DA. Near the end of 1st grade, they were assessed on CA and WP performance. Regression analyses indicated that the value of a predictor depends on the predicted outcome and LEP status. In predicting CAs, the extended mathematics test and DA uniquely explained variance for LEP children, with stronger predictive value for the extended mathematics test; for non-LEP children, the extended mathematics test was the only significant predictor. However, in predicting WPs, only DA and vocabulary were uniquely predictive for LEP children, with stronger value for DA; for non-LEP children, the extended mathematics test and DA were comparably uniquely predictive. Neither the brief static mathematics test nor reasoning was significant in predicting either outcome. The potential value of a gated screening process, using an extended mathematics assessment to predict CAs and using DA to predict WPs, is discussed. PMID:26523068
NASA Astrophysics Data System (ADS)
Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.
2009-12-01
This paper presents easily accessible, integrated web-based analysis of satellite images with plug-in based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS & Remote Sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping; it was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into modular, plug-in-based open source software, and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-)services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software, using the GPL license.
GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters and ecosystems. GEONETCast makes satellite images available via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins which convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on their machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision-making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by means of plug-ins, and outline our plans to implement other OGC standards, such as WCS and WPS, in the same context. The latter, in particular, can be seen as a major step forward in terms of moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in Spatial Data Infrastructures or even its execution in scalable, on-demand cloud computing environments.
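The OGC WMS interface mentioned above is a simple key-value HTTP protocol. As a rough sketch, a WMS 1.3.0 GetMap request can be assembled as below; the endpoint URL and layer name are hypothetical illustrations, not actual ILWIS or GEONETCast identifiers:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, size=(512, 512),
                   crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL (key-value encoding)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # min/max coords in CRS order
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer, for illustration only; the bounding
# box roughly covers Ghana in EPSG:4326.
url = wms_getmap_url("http://example.org/ilwis/wms",
                     layer="msg_ir108",
                     bbox=(-4.0, 4.0, 2.0, 12.0))
```

The resulting URL can be fetched by any HTTP client or embedded directly in a web map viewer.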
What Drives Pakistan’s Interest in Afghanistan?
2011-05-19
Kamran Shafi, "Putting on a Brave Face and Standing Tall," http://www.dawn.com/wps/wcm/connect/dawn-content-library/dawn/the-newspaper...columnists/kamran-shafi-putting-on-a-brave-face-and-standing-tall-480 (accessed September 10, 2010).
WASTE PACKAGE REMEDIATION SYSTEM DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
N.D. Sudan
2000-06-22
The Waste Package Remediation System remediates waste packages (WPs) and disposal containers (DCs) in one of two ways: preparation of rejected DC closure welds for repair or opening of the DC/WP. DCs are brought to the Waste Package Remediation System for preparation of rejected closure welds if testing of the closure weld by the Disposal Container Handling System indicates an unacceptable, but repairable, welding flaw. DC preparation of rejected closure welds will require removal of the weld in such a way that the Disposal Container Handling System may resume and complete the closure welding process. DCs/WPs are brought to the Waste Package Remediation System for opening if the Disposal Container Handling System testing of the DC closure weld indicates an unrepairable welding flaw, or if a WP is recovered from the subsurface repository because suspected damage to the WP or failure of the WP has occurred. DC/WP opening will require cutting of the DC/WP such that a temporary seal may be installed and the waste inside the DC/WP removed by another system. The system operates in a Waste Package Remediation System hot cell located in the Waste Handling Building that has direct access to the Disposal Container Handling System. One DC/WP at a time can be handled in the hot cell. The DC/WP arrives on a transfer cart, is positioned within the cell for system operations, and exits the cell without being removed from the cart. The system includes a wide variety of remotely operated components including a manipulator with hoist and/or jib crane, viewing systems, machine tools for opening WPs, and equipment used to perform pressure and gas composition sampling. Remotely operated equipment is designed to facilitate DC/WP decontamination and hot cell equipment maintenance, and interchangeable components are provided where appropriate. The Waste Package Remediation System interfaces with the Disposal Container Handling System for the receipt and transport of WPs and DCs.
The Waste Handling Building System houses the system and provides the facility, safety, and auxiliary systems required to support operations. The system receives power from the Waste Handling Building Electrical System. The system also interfaces with the various DC systems.
NASA Astrophysics Data System (ADS)
Wang, Chao; Guo, Weidong; Li, Yan; Stubbins, Aron; Li, Yizhen; Song, Guodong; Wang, Lei; Cheng, Yuanyue
2017-12-01
The Kuroshio intrusion from the West Philippine Sea (WPS) and mesoscale eddies are important hydrological features in the northern South China Sea (SCS). In this study, absorption and fluorescence of dissolved organic matter (CDOM and FDOM) were determined to assess the impact of these hydrological features on DOM dynamics in the SCS. DOM in the upper 100 m of the northern SCS had higher absorption, fluorescence, and degree of humification than in the Kuroshio Current of the WPS. The results of an isopycnal mixing model showed that CDOM and humic-like FDOM inventories in the upper 100 m of the SCS were modulated by the Kuroshio intrusion. However, protein-like FDOM was influenced by in situ processes. This basic trend was modified by mesoscale eddies, three of which were encountered during the fieldwork (one warm eddy and two cold eddies). DOM optical properties inside the warm eddy resembled those of DOM in the WPS, indicating that warm eddies could derive from the Kuroshio Current through Luzon Strait. DOM at the center of cold eddies was enriched in humic-like fluorescence and had lower spectral slopes than in eddy-free waters, suggesting inputs of humic-rich DOM from upwelling and enhanced productivity inside the eddy. Excess CDOM and FDOM in northern SCS intermediate water led to export to the Pacific Ocean interior, potentially delivering refractory carbon to the deep ocean. This study demonstrated that DOM optical properties are promising tools to study active marginal sea-open ocean interactions.
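The isopycnal mixing model referred to above can be sketched as conservative two-endmember mixing along an isopycnal: the SCS fraction in a sample is estimated from a conservative tracer such as salinity, and any DOM "excess" is the observed value minus the mixing prediction. The endmember values below are illustrative assumptions, not the study's numbers:

```python
def mixing_fraction(s_obs, s_scs, s_wps):
    """Fraction of SCS water in a sample on an isopycnal, assuming
    conservative two-endmember mixing: s_obs = f*s_scs + (1-f)*s_wps."""
    return (s_obs - s_wps) / (s_scs - s_wps)

def mixing_excess(c_obs, f, c_scs, c_wps):
    """Observed minus conservative-mixing prediction for a DOM property;
    positive values indicate non-conservative (in situ) addition."""
    return c_obs - (f * c_scs + (1 - f) * c_wps)

# Illustrative endmember salinities on an intermediate isopycnal:
f = mixing_fraction(s_obs=34.45, s_scs=34.42, s_wps=34.85)
# Excess humic-like fluorescence relative to mixing (arbitrary units):
ex = mixing_excess(c_obs=0.95, f=f, c_scs=1.0, c_wps=0.6)
```

Applied along isopycnals in the upper 100 m, this kind of decomposition separates the Kuroshio-intrusion signal from in situ production or removal.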
Lai, Yinghui; Zhu, Xiaoshuang; Chen, Yinghe; Li, Yanjun
2015-01-01
Mathematics is one of the most objective, logical, and practical academic disciplines. Yet, in addition to cognitive skills, mathematical problem solving also involves affective factors. In the current study, we first investigated the effects of mathematics anxiety (MA) and mathematical metacognition on word problem solving (WPS). We tested 224 children (116 boys, M = 10.15 years old, SD = 0.56) with the Mathematics Anxiety Scale for Children, the Chinese Revised-edition Questionnaire of Pupil's Metacognitive Ability in Mathematics, and WPS tasks. The results indicated that mathematical metacognition mediated the effect of MA on WPS after controlling for IQ. Second, we divided the children into four mathematics achievement groups: high achieving (HA), typical achieving (TA), low achieving (LA), and mathematical learning difficulty (MLD). Because mathematical metacognition and MA predicted mathematics achievement, we compared group differences in metacognition and MA with IQ partialled out. The results showed that children with MLD scored lower in self-image and higher in learning mathematics anxiety (LMA) than the TA and HA children, but not in mathematical evaluation anxiety (MEA). MLD children's LMA was also higher than that of their LA counterparts. These results provide insight into factors that may mediate poor WPS performance that emerges under pressure in mathematics. They also suggest that anxiety during mathematics learning should be taken into account in interventions for mathematical learning difficulty.
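The mediation result described above (metacognition mediating the effect of MA on WPS) follows the standard regression-based decomposition, in which the total effect splits exactly into a direct effect plus an indirect (a × b) component. A minimal sketch on synthetic data; the coefficients and noise levels are invented, and the study additionally controlled for IQ:

```python
import numpy as np

def ols_slope(x, y):
    """Slope from simple OLS of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def mediation(x, m, y):
    """Regression-based mediation decomposition (Baron & Kenny style):
    total effect c = direct effect c' + indirect effect a*b."""
    a = ols_slope(x, m)                            # X -> M path
    X = np.column_stack([np.ones_like(x), x, m])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    c_prime, b = beta[1], beta[2]                  # X -> Y given M, and M -> Y
    c = ols_slope(x, y)                            # total effect
    return {"total": c, "direct": c_prime, "indirect": a * b}

# Synthetic data mimicking the design: anxiety -> metacognition -> WPS.
rng = np.random.default_rng(0)
n = 224
ma = rng.normal(size=n)                                  # mathematics anxiety
meta = -0.5 * ma + rng.normal(scale=0.8, size=n)         # metacognition
wps = 0.6 * meta - 0.1 * ma + rng.normal(scale=0.8, size=n)
effects = mediation(ma, meta, wps)
```

For OLS with intercepts the identity total = direct + indirect holds exactly in-sample, which makes the decomposition easy to sanity-check.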
Porites white patch syndrome: associated viruses and disease physiology
NASA Astrophysics Data System (ADS)
Lawrence, S. A.; Davy, J. E.; Wilson, W. H.; Hoegh-Guldberg, O.; Davy, S. K.
2015-03-01
In recent decades, coral reefs worldwide have undergone significant changes in response to various environmental and anthropogenic impacts. Among the numerous causes of reef degradation, coral disease is one factor that is to a large extent still poorly understood. Here, we characterize the physiology of white patch syndrome (WPS), a disease affecting poritid corals on the Great Barrier Reef. WPS manifests as small, generally discrete patches of tissue discolouration. Physiological analysis revealed that chlorophyll a content was significantly lower in lesions than in healthy tissues, while host protein content remained constant, suggesting that host tissue is not affected by WPS. This was confirmed by transmission electron microscope (TEM) examination, which showed intact host tissue within lesions. TEM also revealed that Symbiodinium cells are lost from the host gastrodermis with no apparent harm caused to the surrounding host tissue. Also present in the electron micrographs were numerous virus-like particles (VLPs), in both coral and Symbiodinium cells. Small (<50 nm diameter) icosahedral VLPs were significantly more abundant in coral tissue taken from diseased colonies, and there was an apparent, but not statistically significant, increase in abundance of filamentous VLPs in Symbiodinium cells from diseased colonies. There was no apparent increase in prokaryotic or eukaryotic microbial abundance in diseased colonies. Taken together, these results suggest that viruses infecting the coral and/or its resident Symbiodinium cells may be the causative agents of WPS.
Hawari, F I; Obeidat, N A; Ghonimat, I M; Ayub, H S; Dawahreh, S S
2017-01-01
Evidence regarding the health effects of habitual waterpipe smoking is limited, particularly in young smokers. Respiratory health and cardiopulmonary exercise tests were compared in young male habitual waterpipe smokers (WPS) versus non-smokers. 69 WPS (≥3 times/week for three years) and 69 non-smokers were studied. Respiratory health was assessed through the American Thoracic Society and the Division of Lung Diseases (ATS-DLD-78) adult questionnaire. Pulmonary function and cardiopulmonary exercise tests were performed. Self-reported respiratory symptoms, forced expiratory volume in the first second (FEV1), forced vital capacity (FVC), FEV1/FVC ratio, forced expiratory flow between 25 and 75% of FVC (FEF25-75%), peak expiratory flow (PEF), exercise time, peak end-tidal CO2 tension (PetCO2), subject-reported leg fatigue and dyspnea, peak O2 uptake (VO2max), and end-expiratory lung volume (EELV) change from baseline (at peak exercise) were measured. WPS were more likely than non-smokers to report respiratory symptoms. WPS also demonstrated shorter exercise time; lower peak VO2; higher perceived dyspnea at mid-exercise; and lower values of FEV1, FVC, PEF, and EELV change. Habitual waterpipe tobacco smoking in young, seemingly healthy individuals is associated with a greater burden of respiratory symptoms and impaired exercise capacity.
Bastiaanssen, Wim G M; Steduto, Pasquale
2017-01-01
Scarce water resources are one of the major constraints on achieving more food production. Food production therefore needs to be evaluated in terms of water consumption as well as the conventional unit of land. Crop Water Productivity (CWP) is defined as the crop yield per unit of water evaporated. Contrary to crop yield, local benchmark values for CWP do not exist. This paper shows how operational earth observation satellites can measure CWP indirectly on a pixel-by-pixel basis, which provides an opportunity to define local, regional and global benchmark values. In analogy to a grading system for earthquakes (Richter) or wind force (Beaufort), a grading system for CWP is introduced: the Water Productivity Score (WPS). A regional-scale WPS and a global version, the Global Water Productivity Score (GWPS), are presented. Crop yield zones are used to reflect local production potential, which also reflects the presence of irrigation systems besides general physiographical conditions. The 99th percentiles of climatically normalized CWP values at global scale are 2.45, 2.3 and 4.9 kg m-3 for wheat, rice and maize respectively. There is significant scope to produce the same, or more, food from less water resources, provided that locally specific best on-farm practices are implemented. At the upstream level, governments can use (G)WPS to define national water and food policies and as a means to report against the Sustainable Development Goal standards. At the downstream level, WPS helps growers improve on-farm water management practices, for both rainfed and irrigated crops. While the current paper is based on wheat, rice and maize, the same framework can be expanded to potatoes, sugar beet, sugarcane, fruit trees, cotton and other crops.
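CWP itself is a straightforward ratio of yield to evapotranspired water. A minimal sketch follows; the grading rule in water_productivity_score is an assumed illustration, since the paper's actual WPS band definitions are not given in the abstract:

```python
def crop_water_productivity(yield_kg_per_ha, et_mm):
    """Crop water productivity in kg per m^3 of water evaporated.
    1 mm of evapotranspiration over 1 ha equals 10 m^3 of water."""
    water_m3_per_ha = et_mm * 10.0
    return yield_kg_per_ha / water_m3_per_ha

def water_productivity_score(cwp, benchmark_cwp, n_grades=10):
    """Grade a pixel's CWP against a local benchmark (e.g. a high
    percentile of observed CWP) on a 1..n_grades scale. This grading
    rule is an assumption, not the paper's WPS definition."""
    ratio = min(max(cwp / benchmark_cwp, 0.0), 1.0)
    return max(1, round(ratio * n_grades))

# Illustrative values (not benchmarks from the paper): a wheat-like
# yield of 6 t/ha with 450 mm of seasonal evapotranspiration.
cwp = crop_water_productivity(yield_kg_per_ha=6000, et_mm=450)
score = water_productivity_score(cwp, benchmark_cwp=2.45)
```

Scoring each pixel against the benchmark for its yield zone is what turns the raw CWP map into a locally meaningful grade.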
Pixel-based CTE Correction of ACS/WFC: New Constraints from Short Darks
NASA Astrophysics Data System (ADS)
Anderson, Jay; ACS Team
2012-01-01
The original Anderson & Bedin (2010) pixel-based correction for imperfect charge-transfer efficiency (CTE) in HST's ACS was based on a study of Warm Pixels (WPs) in a series of 1000s dark exposures. WPs with more than about 25 electrons were sufficiently isolated in these images that we could examine and model their trails. However, WPs with fewer electrons than this were more plentiful and suffered from significant crowding. To remedy this, we have taken a series of shorter dark exposures: 30s, 100s, and 339s. These supplemental exposures have two benefits. The first is that in the shorter exposures, 10-electron WPs are more sparse and their trails can be measured in isolation. The second benefit is that we can now get a handle on the absolute CTE losses, since the long-dark exposures can be used to accurately predict how many counts the WPs in the short-dark exposures should see. Any missing counts are a reflection of imperfect CTE. This new absolute handle on the CTE losses allows us to probe CTE even for very low charge packets. We find that CTE losses reach a nearly pathological level for charge packets with fewer than 20 electrons. Most ACS observations have backgrounds that are higher than this, so this does not have a large impact on science. Nevertheless, understanding CTE losses at all charge-packet levels is still important, as biases and darks often have low backgrounds. We note that these WP-based approaches to understanding CTE losses could be used in laboratory studies as well. At present, many laboratory studies focus on iron-55 sources, whose X-rays each deposit about 1620 electrons. Astronomical sources of interest are often fainter than this. By varying the dark exposure time, a wide diversity of WP intensities can be generated and cross-checked.
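The trails being modeled arise from charge capture and delayed release during readout. A toy exponential-release model, not the actual Anderson & Bedin (2010) algorithm (all parameter values here are invented), illustrates the idea:

```python
import math

def cte_trail(intensity, n_trail=5, trap_fraction=0.1, tau=1.5):
    """Toy model of CTE trailing: a fraction of a warm pixel's charge
    is captured by traps and re-emitted exponentially into trailing
    pixels. Parameters are illustrative, not fitted values."""
    trapped = trap_fraction * intensity
    # Trailing pixel k receives the charge re-emitted between k-1 and
    # k transfers after capture (exponential release with scale tau).
    trail = [trapped * (math.exp(-(k - 1) / tau) - math.exp(-k / tau))
             for k in range(1, n_trail + 1)]
    remaining = intensity - sum(trail)
    return remaining, trail

# A 100-electron warm pixel: some charge stays, the rest trails behind.
peak, trail = cte_trail(100.0)
```

In such a model, fitting the observed trail profiles of isolated WPs constrains the trap parameters, which is conceptually what the pixel-based correction inverts.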
Temperature dependence of the coherence in polariton condensates
NASA Astrophysics Data System (ADS)
Rozas, E.; Martín, M. D.; Tejedor, C.; Viña, L.; Deligeorgis, G.; Hatzopoulos, Z.; Savvidis, P. G.
2018-02-01
We present a time-resolved experimental study of the effect of temperature on the coherence of traveling polariton condensates. The simultaneous detection of their emission in both real and reciprocal space allows us to fully monitor the condensates' dynamics. We obtain fringes in reciprocal space as a result of the interference between polariton wave packets (WPs) traveling with the same speed. The periodicity of these fringes is inversely proportional to the spatial distance between the interfering WPs. In a similar fashion, we obtain interference fringes in real space when WPs traveling in opposite directions meet. The visibility of both real- and reciprocal-space interference fringes rapidly decreases with increasing temperature and eventually vanishes. A theoretical description of the phase transition for an out-of-equilibrium condensate such as ours, considering the coexistence of condensed and noncondensed particles, is still missing, yet a comparison with theories developed for atomic condensates allows us to infer a critical temperature for the BEC-like transition as the temperature at which the visibility goes to zero.
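The stated inverse relation between fringe period and WP separation is the standard two-source interference result; schematically, for two coherent WPs separated by a distance Δx, the reciprocal-space intensity is modulated as

```latex
I(k) \propto 1 + \cos\left(k\,\Delta x\right), \qquad
\Delta k_{\mathrm{fringe}} = \frac{2\pi}{\Delta x},
```

so a larger spatial separation between the interfering WPs produces finer fringes in k-space.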
The climate4impact platform: Providing, tailoring and facilitating climate model data access
NASA Astrophysics Data System (ADS)
Pagé, Christian; Pagani, Andrea; Plieger, Maarten; Som de Cerff, Wim; Mihajlovski, Andrej; de Vreede, Ernst; Spinuso, Alessandro; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Vega, Manuel; Cofiño, Antonio; d'Anca, Alessandro; Fiore, Sandro; Kolax, Michael
2017-04-01
One of the main objectives of climate4impact is to provide standardized web services and tools that are reusable in other portals. These services include web processing services, web coverage services and web mapping services (WPS, WCS and WMS). Tailored portals can be targeted at specific communities and/or countries/regions while making use of those services. Easier access to climate data is very important for the climate change impact communities. To fulfill this objective, the climate4impact (http://climate4impact.eu/) web portal and services have been developed, targeting climate change impact modellers, impact and adaptation consultants, as well as other experts using climate change data. It provides users with harmonized access to climate model data through tailored services. It features static and dynamic documentation, use cases and best-practice examples, an advanced search interface, an integrated authentication and authorization system with the Earth System Grid Federation (ESGF), and a visualization interface based on the ADAGUC web mapping tools. In the latest version, statistical downscaling services, provided by the Santander Meteorology Group Downscaling Portal, were integrated. An innovative interface to integrate statistical downscaling services will be released in the upcoming version. The latter will be a big step in bridging the gap between climate scientists and the climate change impact communities. The climate4impact portal builds on the infrastructure of an international distributed database that has been set up to disseminate the global climate model results of the Coupled Model Intercomparison Project Phase 5 (CMIP5). This database, the ESGF, is an international collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of climate model data.
The European FP7 project IS-ENES, Infrastructure for the European Network for Earth System modelling, supports the European contribution to ESGF and contributes to the ESGF open source effort, notably through the development of search, monitoring, quality control, and metadata services. In its second phase, IS-ENES2 supports the implementation of regional climate model results from the international Coordinated Regional Downscaling Experiments (CORDEX). These services were extended within the European FP7 Climate Information Portal for Copernicus (CLIPC) project, and some could be later integrated into the European Copernicus platform.
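The WPS endpoints exposed by such portals follow the OGC WPS 1.0.0 key-value (GET) encoding. A minimal request-building sketch is below; the endpoint URL, process identifier and input names are hypothetical, not actual climate4impact processes:

```python
from urllib.parse import urlencode

def wps_request_url(base_url, request, identifier=None, inputs=None):
    """Build an OGC WPS 1.0.0 key-value request URL for
    GetCapabilities, DescribeProcess or Execute."""
    params = {"service": "WPS", "version": "1.0.0", "request": request}
    if identifier:
        params["identifier"] = identifier
    if inputs:
        # Execute inputs are id=value pairs joined with ';' per the
        # WPS 1.0.0 KVP DataInputs encoding.
        params["DataInputs"] = ";".join(f"{k}={v}" for k, v in inputs.items())
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and process, for illustration only:
url = wps_request_url("http://example.org/wps",
                      "Execute",
                      identifier="subset_polygon",
                      inputs={"variable": "tas", "region": "Europe"})
```

A GetCapabilities call against the same endpoint (request="GetCapabilities") is the usual first step to discover which processes a server offers.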
GSKY: A scalable distributed geospatial data server on the cloud
NASA Astrophysics Data System (ADS)
Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben
2017-04-01
Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. The ability to combine information from different geospatial collections is in increasing demand within the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is infeasible, so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY.
This is achieved by decoupling the data ingestion and indexing process into an independent service. An indexing service crawls data collections either locally or remotely, extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY provides the user with the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing users ready access to the data via Web Map Services (WMS), Web Processing Services (WPS) or, as raw data arrays, Web Coverage Services (WCS). The presentation will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
Towards a Brokering Framework for Business Process Execution
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Bigagli, Lorenzo; Roncella, Roberto; Mazzetti, Paolo; Nativi, Stefano
2013-04-01
Advancing our knowledge of environmental phenomena and their interconnections requires an intensive use of environmental models. Due to the complexity of the Earth system, the representation of complex environmental processes often requires the use of more than one model (often from different disciplines). The Group on Earth Observations (GEO) launched the Model Web initiative to increase the present accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al., 2012): (i) open access, (ii) minimal entry barriers, (iii) a service-driven approach, and (iv) scalability. This work proposes an architectural solution, based on the Brokering approach for multidisciplinary interoperability, aiming to contribute to the Model Web vision. The Brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as presented at the last GEO Plenary meeting in Istanbul, November 2011. We designed and prototyped a component called the BP Broker. The high-level functionalities provided by the BP Broker are to: • discover the needed model implementations in an open, distributed and heterogeneous environment; • check the I/O consistency of BPs and provide suggestions for resolving mismatches; • publish the executable BP (eBP) as a standard model resource for re-use; • submit the compiled eBP to a workflow engine for execution. The BP Broker has the following features: • support for multiple abstract BP specifications; • support for encoding in multiple workflow-engine languages. According to the Brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different workflow engines.
The present implementation makes use of BPMN 2.0 notation for BP design and jBPM workflow engine for eBP execution; however, the strong decoupling which characterizes the design of the BP Broker easily allows supporting other technologies. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from technicalities of workflow definitions, (iii) support of incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project provided also the use scenarios that were used to test the framework: the eHabitat scenario (calculation habitat similarity likelihood) and the FERA scenario (impact of climate change on land-use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. References Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative". Environmental Modelling & Software , 1-15
A Brokering Solution for Business Process Execution
NASA Astrophysics Data System (ADS)
Santoro, M.; Bigagli, L.; Roncella, R.; Mazzetti, P.; Nativi, S.
2012-12-01
Predicting the climate change impact on biodiversity and ecosystems, advancing our knowledge of the interconnection of environmental phenomena, assessing the validity of simulations and other key challenges of the Earth sciences require intensive use of environmental modeling. The complexity of the Earth system requires the use of more than one model (often from different disciplines) to represent complex processes. The identification of appropriate mechanisms for the reuse, chaining and composition of environmental models is considered a key enabler for an effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. The Group on Earth Observations (GEO) Model Web initiative aims to increase the present accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al.): 1. open access; 2. minimal entry barriers; 3. a service-driven approach; 4. scalability. In this work we propose an architectural solution aiming to contribute to the Model Web vision. This solution applies the Brokering approach to facilitate complex multidisciplinary interoperability. The Brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as presented at the last GEO Plenary meeting in Istanbul, November 2011. According to the Brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different workflow engines.
We designed and prototyped a component called BP Broker that is able to: (i) read an abstract BP, (ii) "compile" the abstract BP into an executable one (eBP) - in this phase the BP Broker may also provide recommendations for incomplete BPs and parameter-mismatch resolution - and (iii) finally execute the eBP using a workflow engine. The present implementation makes use of the BPMN 2.0 notation for BP design and the jBPM workflow engine for eBP execution; however, the strong decoupling that characterizes the design of the BP Broker readily allows other technologies to be supported. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from the technicalities of workflow definitions, (iii) support for incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC-funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project also provided the use scenarios that were used to test the framework: the eHabitat scenario (calculation of habitat similarity likelihood) and the FERA scenario (impact of climate change on land use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. References: Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative", Environmental Modelling & Software, 1-15.
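The three broker phases just described can be sketched as follows. This is a hypothetical, minimal illustration only: all class, registry, and function names are invented, and the toy lambdas stand in for remote models that the actual prototype would bind via OGC WPS or WSDL and run on jBPM.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class AbstractBP:
    steps: List[str]                       # ordered model names, no concrete bindings

@dataclass
class ExecutableBP:
    tasks: List[Callable[[float], float]]  # concrete, invocable tasks

# Stand-ins for remotely hosted models (hypothetical names).
MODEL_REGISTRY: Dict[str, Callable[[float], float]] = {
    "habitat_similarity": lambda x: x * 0.5,
    "crop_yield_impact":  lambda x: x + 10.0,
}

def compile_bp(bp: AbstractBP) -> ExecutableBP:
    """Phase (ii): bind each abstract step to a concrete service.
    Unknown steps would trigger the broker's recommendation phase."""
    missing = [s for s in bp.steps if s not in MODEL_REGISTRY]
    if missing:
        raise ValueError(f"incomplete BP, unresolved steps: {missing}")
    return ExecutableBP(tasks=[MODEL_REGISTRY[s] for s in bp.steps])

def execute(ebp: ExecutableBP, value: float) -> float:
    """Phase (iii): stand-in for the workflow engine (jBPM in the prototype)."""
    for task in ebp.tasks:
        value = task(value)
    return value

ebp = compile_bp(AbstractBP(steps=["habitat_similarity", "crop_yield_impact"]))
print(execute(ebp, 8.0))  # 8*0.5 + 10 = 14.0
```

The point of the decoupling is visible even in this toy: only `execute` would change if the engine were swapped.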
A new look at the position operator in quantum theory
NASA Astrophysics Data System (ADS)
Lev, F. M.
2015-01-01
The postulate that the coordinate and momentum representations are related to each other by the Fourier transform has been accepted from the beginning of quantum theory by analogy with classical electrodynamics. As a consequence, an inevitable effect in standard theory is the wave packet spreading (WPS) of the photon coordinate wave function in directions perpendicular to the photon momentum. This leads to the following paradoxes: if the major part of photons emitted by stars are in wave packet states (which is the most probable scenario), then we should see not separate stars but only an almost continuous background from all stars; no anisotropy of the CMB radiation should be observable; and data on gamma-ray bursts, signals from directional radio antennas (in particular, in experiments on Shapiro delay) and signals from pulsars show no signs of WPS. In addition, the question arises of why there are no signs of WPS for protons in the LHC ring. We argue that the above postulate is based neither on strong theoretical arguments nor on experimental data, and we propose a new, consistent definition of the position operator. Then WPS in directions perpendicular to the particle momentum is absent and the paradoxes are resolved. Different components of the new position operator do not commute with each other and, as a consequence, there is no wave function in the coordinate representation. Implications of the results for entanglement, quantum locality and the problem of time in quantum theory are discussed.
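For context, the textbook spreading result for a free nonrelativistic Gaussian packet (the photon case argued about above is analogous but not identical) reads:

```latex
\Delta x(t) \;=\; \Delta x(0)\,\sqrt{\,1+\left(\frac{\hbar t}{2m\,[\Delta x(0)]^{2}}\right)^{2}\,}
```

so an initially narrow packet spreads linearly in time at large $t$, which is the standard-theory behavior the abstract's paradoxes are built on.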
Design of water pumping system by wind turbine for using in coastal areas of Bangladesh
NASA Astrophysics Data System (ADS)
Alam, Muhammad Mahbubul; Tasnim, Tamanna; Doha, Umnia
2017-06-01
In this work, a theoretical analysis has been carried out to assess the prospects of a Wind Pumping System (WPS) for use in coastal areas of Bangladesh. Wind speed data for three coastal areas of Bangladesh (Kutubdia, Patenga and Sathkhira) have been analyzed, and an optimal wind turbine viable for this wind speed range has been designed using the simulation software Q-blade. The simulated turbine is then coupled with a rotodynamic pump. The output of the WPS for the three coastal areas has been studied.
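The sizing involved can be sketched with the standard rotor-power relation P = ½ρAC_p·v³ and the pump-flow relation Q = ηP/(ρ_w·g·h). All parameter values below (rotor radius, power coefficient, wind speed, head, efficiency) are assumed for illustration and are not the study's design figures:

```python
import math

# Back-of-the-envelope wind-pump sizing. All parameter values are assumed
# for illustration; they are not the figures from the study above.
AIR_DENSITY = 1.225      # kg/m^3
WATER_DENSITY = 1000.0   # kg/m^3
G = 9.81                 # m/s^2

def rotor_power(radius_m, wind_speed_ms, cp=0.35):
    """Mechanical power extracted by the rotor: P = 0.5 * rho * A * Cp * v^3."""
    area = math.pi * radius_m ** 2
    return 0.5 * AIR_DENSITY * area * cp * wind_speed_ms ** 3

def pump_flow(power_w, head_m, pump_efficiency=0.6):
    """Volumetric flow rate Q = eta * P / (rho_w * g * h), in m^3/s."""
    return pump_efficiency * power_w / (WATER_DENSITY * G * head_m)

p = rotor_power(radius_m=2.0, wind_speed_ms=5.0)  # ~5 m/s is an assumed coastal wind speed
q = pump_flow(p, head_m=10.0)
print(f"{p:.0f} W -> {q * 1000:.2f} L/s")
```

The cubic dependence on wind speed is why site selection from measured wind data, as in the study, dominates the design.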
Geospatial Web Services in Real Estate Information System
NASA Astrophysics Data System (ADS)
Radulovic, Aleksandra; Sladic, Dubravka; Govedarica, Miro; Popovic, Dragana; Radovic, Jovana
2017-12-01
Since cadastral records are of great importance for the economic development of a country, they must be well structured and organized. Real estate records on the territory of Serbia faced many problems in previous years. To prevent such problems and to achieve efficient access, sharing and exchange of cadastral data on the principles of interoperability, a domain model for real estate was created according to current standards in the field of spatial data. The resulting profile of the domain model for the Serbian real estate cadastre is based on current legislation and on the Land Administration Domain Model (LADM), which is specified in the ISO 19152 standard. On top of such organized data, and for their effective exchange, it is necessary to develop a model of the services that must be provided by the institutions interested in exchanging cadastral data. This is achieved by introducing a service-oriented architecture into the real estate cadastre information system, which ensures the efficiency of the system. It is necessary to develop user services for downloading, reviewing and using real estate data through the web. These services should be provided, through e-government, to all users who need access to cadastral data (natural and legal persons as well as state institutions). It is also necessary to provide search, view and download of cadastral spatial data by specifying geospatial services. Considering that real estate records contain geometric data for parcels and buildings, it is necessary to establish a set of geospatial services that provide information and maps for the analysis of spatial data and for producing raster data. Besides the Cadastral Parcels theme, the INSPIRE directive specifies several themes that involve data on buildings and land use, for which data can be provided from the real estate cadastre. In this paper, a model of geospatial services in Serbia is defined.
A case study is described in which these services, combined with Web Processing Service (WPS) spatial analysis, are used to estimate which households are at risk of flooding.
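For illustration, such a WPS analysis could be invoked through a KVP-encoded Execute request of the kind defined by the OGC WPS 1.0.0 specification. The endpoint URL, process identifier, and input names below are hypothetical, not taken from the paper:

```python
from urllib.parse import urlencode

# Hedged sketch: build an OGC WPS 1.0.0 Execute request (KVP binding).
# The endpoint, process identifier and input names are hypothetical.
def wps_execute_url(endpoint, process_id, inputs):
    # WPS 1.0.0 KVP encodes inputs as "name=value" pairs joined by ";".
    datainputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    query = urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "DataInputs": datainputs,
    })
    return f"{endpoint}?{query}"

url = wps_execute_url(
    "https://example.org/wps",                       # hypothetical endpoint
    "flood_risk",                                    # hypothetical process
    {"parcel_id": "1234", "return_period": "100"},   # hypothetical inputs
)
print(url)
```

A client would GET this URL and receive an ExecuteResponse document referencing the analysis output.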
Modeling elephant-mediated cascading effects of water point closure.
Hilbers, Jelle P; Van Langevelde, Frank; Prins, Herbert H T; Grant, C C; Peel, Mike J S; Coughenour, Michael B; De Knegt, Henrik J; Slotow, Rob; Smit, Izak P J; Kiker, Greg A; De Boer, Willem F
2015-03-01
Wildlife management to reduce the impact of wildlife on their habitat can be done in several ways, among which removing animals (by either culling or translocation) is most often used. There are, however, alternative ways to control wildlife densities, such as opening or closing water points. The effects of these alternatives are poorly studied. In this paper, we focus on manipulating large herbivores through the closure of water points (WPs). Removal of artificial WPs has been suggested in order to change the distribution of African elephants, which occur in high densities in national parks in Southern Africa and are thought to have a destructive effect on the vegetation. Here, we modeled the long-term effects of different scenarios of WP closure on the spatial distribution of elephants, and consequential effects on the vegetation and other herbivores in Kruger National Park, South Africa. Using a dynamic ecosystem model, SAVANNA, scenarios were evaluated that varied in the availability of artificial WPs, levels of natural water, and elephant densities. Our modeling results showed that elephants can indirectly negatively affect the distributions of meso-mixed feeders, meso-browsers, and some meso-grazers under wet conditions. The closure of artificial WPs hardly had any effect during these natural wet conditions. Under dry conditions, the spatial distribution of both elephant bulls and cows changed when the availability of artificial water was severely reduced in the model. These changes in spatial distribution triggered changes in the spatial availability of woody biomass over the simulation period of 80 years, and this led to changes in the rest of the herbivore community, resulting in increased densities of all herbivores, except for giraffe and steenbok, in areas close to rivers. The spatial distributions of elephant bulls and cows proved to be less affected by the closure of WPs than those of most of the other herbivore species.
Our study contributes to ecologically informed decisions in wildlife management. The results from this modeling exercise imply that long-term effects of this intervention strategy should always be investigated at an ecosystem scale.
Improving low-wage, midsized employers' health promotion practices: a randomized controlled trial.
Hannon, Peggy A; Harris, Jeffrey R; Sopher, Carrie J; Kuniyuki, Alan; Ghosh, Donetta L; Henderson, Shelly; Martin, Diane P; Weaver, Marcia R; Williams, Barbara; Albano, Denise L; Meischke, Hendrika; Diehr, Paula; Lichiello, Patricia; Hammerback, Kristen E; Parks, Malcolm R; Forehand, Mark
2012-08-01
The Guide to Community Preventive Services (Community Guide) offers evidence-based intervention strategies to prevent chronic disease. The American Cancer Society (ACS) and the University of Washington Health Promotion Research Center co-developed ACS Workplace Solutions (WPS) to improve workplaces' implementation of Community Guide strategies. To test the effectiveness of WPS for midsized employers in low-wage industries. Two-arm RCT; workplaces were randomized to receive WPS during the study (intervention group) or at the end of the study (delayed control group). Forty-eight midsized employers (100-999 workers) in King County, WA. WPS provides employers one-on-one consulting with an ACS interventionist via three meetings at the workplace. The interventionist recommends best practices to adopt based on the workplace's current practices, provides implementation toolkits for the best practices the employer chooses to adopt, conducts a follow-up visit at 6 months, and provides technical assistance. Employers' implementation of 16 best practices (in the categories of insurance benefits, health-related policies, programs, tracking, and health communications) was measured at baseline (June 2007-June 2008) and 15-month follow-up (October 2008-December 2009). Data were analyzed in 2010-2011. Intervention employers demonstrated greater improvement from baseline than control employers in two of the five best-practice categories: policies (baseline scores: 39% intervention, 43% control; follow-up scores: 49% intervention, 45% control; p=0.013) and communications (baseline scores: 42% intervention, 44% control; follow-up scores: 76% intervention, 55% control; p=0.007). Total best-practice implementation improvement did not differ between study groups (baseline scores: 32% intervention, 37% control; follow-up scores: 39% intervention, 42% control; p=0.328). WPS improved employers' health-related policies and communications but did not improve insurance benefits design, programs, or tracking.
Many employers were unable to modify insurance benefits and reported that the time and costs of implementing best practices were major barriers. This study is registered at clinicaltrials.gov NCT00452816. Copyright © 2012 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Observed and Self-Reported Pesticide Protective Behaviors of Latino Migrant and Seasonal Farmworkers
Walton, AnnMarie Lee; LePrevost, Catherine; Wong, Bob; Linnan, Laura; Sanchez-Birkhead, Ana; Mooney, Kathi
2016-01-01
Agricultural pesticide exposure has potential adverse health effects for farmworkers that may be reduced by pesticide protective behaviors (PPBs). The Environmental Protection Agency's (EPA) Worker Protection Standard (WPS) requires that PPBs be taught to farmworkers prior to field work. Studies to date have not utilized observational methods to evaluate the degree to which PPBs are practiced by Latino migrant and seasonal farmworkers in the United States. The purpose of this study was to describe, compare, and contrast observed and self-reported PPBs used by Latino farmworkers; both PPBs that the WPS requires be taught and other PPBs were included. Observed and self-reported data were collected from 71 Latino farmworkers during the 2014 tobacco growing season in North Carolina. Participants were consistent in reporting and using long pants and closed shoes in the field most of the time. In addition, gloves, hats/bandanas, and water-resistant outerwear were frequently observed, although they are not required to be taught by the WPS. Farmworkers reported more long-sleeve (p = .028) and glove use (p = .000) than was observed. It was uncommon to observe washing behavior before eating or drinking, even when washing supplies were available. Washing behaviors were significantly overreported for hand (p = .000; p = .000) and face (p = .000; p = .058) washing before eating and drinking in the field, respectively. This study documents that protective clothing behaviors that the WPS requires be taught, plus a few others, are commonly practiced by Latino migrant and seasonal farmworkers, but washing behaviors in the field are not. Targeted strategies to improve washing behaviors in the field are needed. PMID:26918841
Schröder, Claudia; Chaaya, Monique; Saab, Dahlia; Mahfoud, Ziyad
2016-03-01
The phenomenon of waterpipe smoking (WPS) among adolescents has become prominent, and it is especially prevalent in Lebanon. Unlike cigarette smoking, WPS is parentally and socially acceptable. This study aims to examine the association between intention to smoke waterpipe in the next 6 months and knowledge, attitudes, and parental and social influences. This is a secondary analysis of data from a 2007 national survey of 1028 households. This study addresses 258 non-smoking adolescents and their parents. Consent was sought and the study was approved by the Institutional Review Board at the American University of Beirut. Face-to-face interviews were conducted. Descriptive analyses and crude and adjusted odds ratios (ORs) were generated. At the bivariate level, late adolescence, mothers without university education, adolescents' prior WPS experiences, best friends' and parents' WPS habits, and adolescents' and parents' lower attitude scores were associated with smoking intention. In the adjusted model, adolescents' beliefs about positive effects, best friends' similar habits, and prior smoking remained significant (respective ORs [95% confidence interval]: 1.81 [1.33-2.45], 2.51 [1.24-5.10], 4.91 [2.35-10.36]). Parents' perceived attitude against smoking was protective (OR: 0.57 [0.39-0.83]). Adolescents' intention to smoke is highly influenced by parents' permissive attitudes and peer pressure. Interventions targeting these two groups and limiting adolescents' access to smoking should be initiated. © The Author 2015. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Local Regularity Analysis with Wavelet Transform in Gear Tooth Failure Detection
NASA Astrophysics Data System (ADS)
Nissilä, Juhani
2017-09-01
Diagnosing gear tooth and bearing failures in industrial power transmission applications has been studied extensively, but challenges remain. This study aims to look at the problem from a more theoretical perspective. Our goal is to find out whether the local regularity, i.e. smoothness, of the measured signal can be estimated from the vibrations of epicyclic gearboxes, and whether the regularity can be linked to the meshing events of the gear teeth. Previously it has been shown that decreasing local regularity of measured acceleration signals can reveal inner race faults in slowly rotating bearings. The local regularity is estimated from the modulus maxima ridges of the signal's wavelet transform. In this study, the measurements come from the epicyclic gearboxes of the Kelukoski water power station (WPS). The very stable rotational speed of the WPS makes it possible to deduce that the gear mesh frequencies of the WPS and a frequency related to the rotation of the turbine blades are the most significant components in the spectra of the estimated local regularity signals.
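A toy version of the underlying idea (not the paper's algorithm, which tracks modulus maxima ridges of a continuous wavelet transform) is to estimate a Hölder-type exponent from how the maximal coefficients of a sliding Haar wavelet scale with the analysis scale: smooth signals score high, signals with a discontinuity score near zero. Everything below is an illustrative sketch under that simplification:

```python
import math

# Illustration only: estimate local regularity from wavelet-coefficient
# scaling. With an L2-normalized Haar wavelet, |W_max(s)| ~ s^(alpha + 1/2),
# so the exponent is roughly (log-log slope) - 1/2. A smooth sine should
# come out near 1, a step discontinuity near 0.
def max_haar_coeff(x, s):
    """Largest |Haar coefficient| at scale s over all translations."""
    best = 0.0
    for t in range(s, len(x) - s + 1):
        w = (sum(x[t:t + s]) - sum(x[t - s:t])) / math.sqrt(2 * s)
        best = max(best, abs(w))
    return best

def holder_exponent(x, scales=(2, 4, 8, 16, 32)):
    """Least-squares slope of log2|W_max| vs log2(scale), minus 1/2."""
    xs = [math.log2(s) for s in scales]
    ys = [math.log2(max_haar_coeff(x, s)) for s in scales]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    return slope - 0.5

n = 1024
smooth = [math.sin(2 * math.pi * t / n) for t in range(n)]
step = [0.0] * (n // 2) + [1.0] * (n - n // 2)
print(holder_exponent(smooth), holder_exponent(step))
```

In the fault-detection setting described above, an impact at a damaged tooth behaves locally like the step signal, pulling the estimated regularity down at that time instant.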
Value Production in a Collaborative Environment. Sociophysical Studies of Wikipedia
NASA Astrophysics Data System (ADS)
Yasseri, Taha; Kertész, János
2013-05-01
We review some recent endeavors and add some new results to characterize and understand the underlying mechanisms of Wikipedia (WP), the paradigmatic example of collaborative value production. We analyzed the statistics of editorial activity in different languages and observed typical circadian and weekly patterns, which enabled us to estimate the geographical origins of contributions to WPs in languages spoken in several time zones. Using a recently introduced measure, we showed that the editorial activities have intrinsic dependencies in the burstiness of events. A comparison of the English and Simple English WPs revealed important aspects of language complexity and showed how peer cooperation solved the task of enhancing readability. One of our focus issues was characterizing the conflicts or edit wars in WPs, which helped us to automatically filter out controversial pages. When studying the temporal evolution of the controversiality of such pages, we identified typical patterns and classified conflicts accordingly. Our quantitative analysis provides the basis for modeling conflicts and their resolution in collaborative environments and contributes to the understanding of this issue, which becomes increasingly important with the development of information communication technology.
Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo
2010-05-01
In recent decades, two main paradigms for resource sharing emerged and reached maturity: the Web and the Grid. Both have proven suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS), most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disasters Management, Environmental Sciences, etc. On the other hand, several application areas need to run complex models that require the large processing and storage capabilities that Grids can provide. Therefore the integration of geo-information and Grid technologies might be a valuable approach to enable advanced ESS applications.
Currently both geo-information and Grid technologies have reached a high level of maturity, making it possible to build such an integration on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in the INSPIRE Implementing Rules, the GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) projects, is widespread in Europe and beyond, has proven highly scalable, and is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Convergence between OWS and gLite technologies would therefore be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, achieving this harmonization requires overcoming some obstacles. Firstly, a semantic mismatch must be addressed: gLite handles low-level (i.e. close to the machine) concepts like "file", "data", "instruments", "job", etc., while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", "model", etc. Secondly, an architectural mismatch must be addressed: OWS implements a Web service-oriented architecture that is stateless, synchronous and without embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture that is stateful, asynchronous (though not fully event-based) and with strong embedded security (based on the VO paradigm). In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWSs.
Just to mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of OWS Phase 6 (OWS-6); (iii) several national, European and international projects investigated different aspects of this integration, developing demonstrators and proofs-of-concept. In this context, "gLite enablement of OpenGeospatial Web Services" (G-OWS) is an initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII project consortia in order to collect and coordinate experiences on the enablement of OWS on top of the gLite middleware [GOWS]. Currently G-OWS counts ten member organizations from Europe and beyond, with four European projects involved. It has broadened its scope to the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Its operational objectives are the following: i) to contribute to the OGC-OGF initiative; ii) to release a reference implementation as standard gLite APIs (under the gLite software license); iii) to release a reference model (including procedures and guidelines) for OWS Grid-ification, as far as gLite is concerned; iv) to foster and promote the formation of consortia for participation in projects and initiatives aimed at building Grid-enabled SDIs. To achieve these objectives, G-OWS bases its activities on two main guiding principles: a) the adoption of a service-oriented architecture based on the information modelling approach, and b) standardization as a means of achieving interoperability (i.e. adoption of standards from ISO TC211, OGC OWS, OGF).
In the first year of activity, G-OWS designed a general architectural framework stemming from the FP6 CYCLOPS studies and enriched by the outcomes of other projects and initiatives involved (e.g. FP7 GENESI-DR, FP7 DORII, AIST GeoGrid, etc.). Some proofs-of-concept have been developed to demonstrate the flexibility and scalability of this architectural framework. The G-OWS WG developed implementations of a gLite-enabled Web Coverage Service (WCS) and Web Processing Service (WPS), and an implementation of Shibboleth authentication for gLite-enabled OWS in order to evaluate the possible integration of Web and Grid security models. The presentation will aim to communicate the G-OWS organization, activities, future plans and means to involve the ESSI community. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future", IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Foster 2001] I. Foster, C. Kesselman and S. Tuecke, "The Anatomy of the Grid", The International Journal of High Performance Computing Applications, 15(3):200-222, Fall 2001. [GOWS] G-OWS WG, https://www.g-ows.org/, accessed: 15 January 2010.
Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.
2017-12-01
Empowering end-users like pastoralists, land management specialists and land policy makers in the use of earth observation data for both day-to-day and seasonal planning requires both interactive delivery of multiple geospatial datasets and the capability of supporting on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The use and wide adoption of large data archives, like those produced by earth observation missions, are often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data-centres and visualization on web front-ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, formulated around the theme of Open Geospatial Data-as-a-Service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS and WPS. Results from these requests are rendered on a web front-end. In this way, the complexities of data locality and compute execution are masked from the end user.
On-the-fly computation of products such as NDVI, Leaf Area Index, vegetation cover and others from original source data, including MODIS, is achieved, with Landsat and Sentinel-2 on the horizon. Innovative use of cloud computing and storage, along with flexible front-ends, allows the democratization of data dissemination and, we hope, better outcomes for the planet.
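As a minimal sketch of this kind of on-the-fly product generation (performed server-side in GSKY's case), the standard NDVI calculation can be applied to a pair of reflectance tiles; the 2x2 sample values below are made up for illustration:

```python
# NDVI = (NIR - Red) / (NIR + Red), computed per pixel.
# The tiny 2x2 "tiles" are made-up reflectance values, not real MODIS data.
def ndvi(nir, red):
    return [[(n - r) / (n + r) for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir, red)]

nir_tile = [[0.50, 0.45], [0.60, 0.55]]
red_tile = [[0.10, 0.15], [0.05, 0.25]]
print(ndvi(nir_tile, red_tile))
```

A service answering a WMS/WPS request would run the same arithmetic over full rasters next to the data and return only the rendered or summarized result.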
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2017-12-01
NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) A full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations. The operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) A cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables. This near real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) A WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following: - A new API that supports full temporal, spatial, and grid-based resolution services with sample queries - A Docker-ready RES application to deploy across platforms - Extended capabilities that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, standard deviation, and ensemble averages - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly) - Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55 and NOAA/ESRL 20CR… - A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management - Supporting analytic services for NASA GMAO Forward Processing datasets - Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization) - The ability to compute and visualize multiple reanalyses for ease of inter-comparison - Automated tools to retrieve and prepare data collections for analytic processing
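Two of the listed operations, the ensemble average across reanalyses and the anomaly against a climatological mean, can be sketched as below. This is an illustrative stand-in, not the RES API or CDSlib, and the monthly temperature values are made up:

```python
from statistics import mean

# Made-up monthly mean temperatures (K) from two reanalyses; real RES
# requests would pull these from MERRA-2, ERA-Interim, etc.
merra2      = [288.1, 288.4, 289.0, 290.2]
era_interim = [288.3, 288.2, 289.4, 290.0]

# Ensemble average: per-month mean across the reanalyses.
ensemble = [mean(vals) for vals in zip(merra2, era_interim)]

# Anomaly: departure of each month from the climatological mean.
climatology = mean(ensemble)
anomaly = [round(v - climatology, 3) for v in ensemble]
print(ensemble, anomaly)
```

The service-side value is that such reductions run next to the NetCDF archives, so only the small derived series crosses the network.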
Shipp, E M; Cooper, S P; Burau, K D; Bolin, J N
2005-02-01
Very little published research describes employer compliance with EPA-mandated Worker Protection Standard (WPS) pesticide safety training and the OSHA Field Sanitation Standard among farmworker women in general and mothers specifically. A goal of both standards is limiting farmworkers' exposure to potentially hazardous agricultural pesticides. Data from a NIOSH-supported cohort study ("Injury and Illness Surveillance in Migrant Farmworker Families") allowed for examining these issues. The cohort included 267 migrant farmworker families who usually reside along the Texas-Mexico border (Starr County, Texas). Data were collected in Starr County during in-home interviews. Of 102 mothers who participated in migrant farm work during summer 2001, 57 (55.9%) reported having ever received training/instruction in the safe use of pesticides, while 47 (46.1%) reported having received training within the previous five years, as required by WPS. Of trained mothers, 91.5% to 93.6% reported that their training covered key WPS areas: (1) entry into a recently treated field, (2) pesticide related injuries/illnesses, and (3) where to go and who to contact for emergency care following exposure. Regarding access to field sanitation, 67.5% to 84.2% of 77 mothers who worked outside Texas reported employer-provided decontamination supplies (e.g., soap, wash water, towels, and toilet facilities). However, a strikingly smaller proportion (12% to 28%) of 25 mothers who worked within Texas reported access to the same resources, suggesting discrepancies in compliance across the U.S. Due to the low level of employer compliance with both WPS and OSHA mandated standards, increased enforcement and an alternate delivery of pesticide training is recommended.
Leonardi, Matilde; Martinuzzi, Andrea; Meucci, Paolo; Sala, Marina; Russo, Emanuela; Buffoni, Mara; Raggi, Alberto
2012-01-01
The aim of this paper is to describe the functioning of subjects with “severe disability” collected with a protocol based on the International Classification of Functioning, Disability, and Health. It included sections on body functions and structures (BF and BS), activities and participation (A&P), and environmental factors (EF). In A&P, performance without personal support (WPS) was added to standard capacity and performance. Persons with severe disability were those reporting a number of very severe/complete problems in BF or in A&P-capacity superior to the mean + 1 SD. Correlations between BF and A&P, and differences between capacity, performance-WPS, and performance, were assessed with Spearman's coefficient. Out of 1051 subjects, 200 were considered severely disabled. Mild to moderate correlations between BF and A&P were reported (between 0.148 and 0.394 when the full range of impairments/limitations was taken into account; between 0.198 and 0.285 when only the severe impairments/limitations were taken into account); performance-WPS was less similar to performance than to capacity. Our approach enabled identifying subjects with “severe disability” and separating the effect of personal support from that of devices, policies, and service provision. PMID:22454601
Validation of an Arabic version of an instrument to measure waterpipe smoking behavior.
Abou Arbid, S; Al Mulla, A; Ghandour, B; Ammar, N; Adawi, M; Daher, R; Younes, N; Chami, H A
2017-04-01
Reliable and valid measures of waterpipe smoking are essential to study its health effects. The purpose of this study was to examine the reliability and validity of an Arabic translation of the Maziak questionnaire, which assesses various aspects of waterpipe smoking in epidemiological studies. A cross-sectional study. The questionnaire was translated, back-translated, and culturally adapted to the local Arabic dialect. Construct and convergent validity were assessed in a sample of 119 daily waterpipe smokers (WPS) and 30 occasional WPS (defined as smoking at least one waterpipe per week but less than daily), recruited from Beirut and Doha (mean age = 52.4 years, males = 61.7%). Construct validity was assessed by comparing the smoking behavior of daily and occasional WPS. Convergent validity was assessed by correlating daily smoking intensity ('number of waterpipe smoked per day') with 'number of waterpipe smoked yesterday' and by correlating lifetime smoking exposure (waterpipe-year), calculated by multiplying the number of waterpipe smoked per day × the duration of waterpipe smoking, with alternate measures obtained graphically (graphical waterpipe-year) or adjusted (adjusted waterpipe-year). Criterion validity was assessed by correlating daily smoking intensity and lifetime smoking exposure with serum cotinine level. Test-retest reliability was analyzed by re-administering the questionnaire to 30 daily and 30 occasional WPS after 2 weeks. Smoking intensity, patterns of use, and willingness to quit differed significantly between daily and occasional WPS. Daily smoking intensity correlated strongly with the number of waterpipe smoked yesterday among daily WPS (r_s = 0.68, P < 0.001), but not among occasional WPS (r_s = 0.13, P = 0.70). Waterpipe-year correlated very strongly with adjusted waterpipe-year and graphical waterpipe-year (r_s = 0.98, P < 0.001 and r_s = 0.92, P < 0.001, respectively).
Waterpipe-year, daily smoking intensity, and number of waterpipe smoked yesterday correlated weakly but significantly with serum cotinine levels (r_s = 0.243, P = 0.01; r_s = 0.359, P < 0.01 and r_s = 0.387, P < 0.01, respectively). The type and pattern of waterpipe use items showed high test-retest reliability with near perfect agreement (k > 0.9), the sharing and intention to quit waterpipe items had substantial agreement (k > 0.6), and the intent to quit item showed moderate agreement (k > 0.4). The questionnaire showed strong reliability, face validity, construct and convergent validity, and a weak but statistically significant criterion validity. The Maziak questionnaire is valid and reliable for assessing waterpipe smoking patterns, intensity, and willingness to quit. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
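The lifetime-exposure index described above is a simple product, analogous to cigarette pack-years; a minimal sketch (the function name is hypothetical, not from the questionnaire):

```python
def waterpipe_years(per_day, years_smoked):
    """Lifetime exposure index: waterpipes smoked per day multiplied by
    years of waterpipe smoking, analogous to cigarette pack-years."""
    return per_day * years_smoked

# e.g. two waterpipes a day for fifteen years
exposure = waterpipe_years(2, 15)   # -> 30 waterpipe-years
```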
Wutzke, K D; Lotz, M; Zipprich, C
2010-10-01
The evaluation of ammonia detoxification by pre- and probiotics by means of colonic lactose-[¹⁵N₂]ureide (¹⁵N-LU) degradation is of great interest both scientifically and in terms of nutrition physiology. Pre- and probiotics were supplemented in healthy adults to evaluate the effect on ammonia metabolism in the human colon by means of ¹⁵N-LU. A total of 14 participants aged 20-28 years daily received a regular diet either without (no treatment) or with supplementation of 30 g fibre of potatoes (FP), 30 g wrinkle pea starch (WPS, resistant starch content: 12 and 70%, respectively) and 375 g Lactobacillus acidophilus (LC1) yoghurt, over a 10-day period in a randomised order. After 1 week, 5.7 mg/kg body weight ¹⁵N-LU was administered together with breakfast. A venous blood sample was taken after 6 h. Urine and faeces were collected over a period of 48 and 72 h, respectively. The ¹⁵N abundances were measured by isotope ratio mass spectrometry. The mean renal ¹⁵N-excretion differed significantly between the supplementation of FP and no treatment (32.5 versus 46.3%, P=0.034), FP and LC1 (32.5 versus 51.6%, P=0.001), and WPS and LC1 (38.5 versus 51.6%, P=0.048). The mean faecal ¹⁵N-excretion amounted to 42.7% (no treatment), 59.7% (FP), 41.8% (WPS) and 44.0% (LC1). In comparison with no treatment, the urinary ¹⁵NH₃-enrichment was significantly decreased at 16 h after FP supplementation. The prebiotic intake of FP and WPS lowered the colonic generation and the renal excretion of toxic ¹⁵NH₃, respectively, when using ¹⁵N-LU as a xenobiotic marker.
NASA Astrophysics Data System (ADS)
Sokolovski, D.; Connor, J. N. L.
1990-12-01
The wave-packet simulation (WPS) method for calculating the time a tunneling particle spends inside a one-dimensional potential barrier is reexamined using the Feynman path-integral technique. Following earlier work by Sokolovski and Baskin [Phys. Rev. A 36, 4604 (1987)], the tunneling (or traversal) time t_T^pack is defined as a matrix element of a classical nonlocal functional between two states that represent the initial and transmitted wave packets. These states do not lie on the same orbit in Hilbert space; as a result, t_T^pack is complex-valued. It is shown that Re t_T^pack reduces to the standard WPS result, t_T^phase, for conditions similar to those employed in the conventional WPS analysis. Similarly, Im t_T^pack is shown to contain information about the energy dependence of the transmission probability. Under semiclassical conditions, Im t_T^pack reduces to the well-known Wentzel-Kramers-Brillouin expression for the tunneling time. It is shown that there are different definitions for the traversal time of a classical moving object whose size is comparable to the width of the region of interest. In the quantum case, these different definitions correspond to different ways of analyzing the WPS experiment. The path-integral approach demonstrates that the tunneling-time problem is one of understanding the physical significance of complex-valued off-orbit matrix elements of an operator or functional. The physical content of complex-valued tunneling times is discussed. It is emphasized that the use of complex tunneling times includes real-time approaches as a special case. Nevertheless, there is a limitation in the description of tunneling experiments using tunneling times, whether real or complex. The path-integral approach does not supply a universal traversal time, analogous to a classical time, that can be used in quantum situations. It is demonstrated that the often expressed hope of finding a well-defined and universal real tunneling time is erroneous.
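Schematically, and in standard notation rather than the paper's own, the complex traversal time is an off-orbit matrix element of the classical traversal-time functional F[t_ab] between the incident and transmitted packets, and under semiclassical conditions its imaginary part reduces to the familiar WKB (Büttiker-Landauer-type) form:

```latex
% complex tunneling time as a weighted off-orbit matrix element (schematic)
t_T^{\mathrm{pack}}
  = \frac{\langle \psi_{\mathrm{trans}} \,|\, \hat{F}[t_{ab}] \,|\, \psi_{\mathrm{in}} \rangle}
         {\langle \psi_{\mathrm{trans}} \,|\, \psi_{\mathrm{in}} \rangle}
  \;\in\; \mathbb{C},
\qquad
\operatorname{Im} t_T^{\mathrm{pack}}
  \;\xrightarrow{\ \text{semiclassical}\ }\;
  \int_a^b \frac{m\,\mathrm{d}x}{\hbar\,\kappa(x)},
\qquad
\kappa(x) = \frac{\sqrt{2m\,[V(x)-E]}}{\hbar}
```

Here a and b are the classical turning points of the barrier V(x) at energy E; the WKB integral is the standard semiclassical traversal-time expression, not a formula quoted from this abstract.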
Tethys: A Platform for Water Resources Modeling and Decision Support Apps
NASA Astrophysics Data System (ADS)
Swain, N. R.; Christensen, S. D.; Jones, N.; Nelson, E. J.
2014-12-01
Cloud-based applications or apps are a promising medium through which water resources models and data can be conveyed in a user-friendly environment—making them more accessible to decision-makers and stakeholders. In the context of this work, a water resources web app is a web application that exposes limited modeling functionality for a scenario exploration activity in a structured workflow (e.g.: land use change runoff analysis, snowmelt runoff prediction, and flood potential analysis). The technical expertise required to develop water resources web apps can be a barrier to many potential developers of water resources apps. One challenge that developers face is in providing spatial storage, analysis, and visualization for the spatial data that is inherent to water resources models. The software projects that provide this functionality are non-standard to web development and there are a large number of free and open source software (FOSS) projects to choose from. In addition, it is often required to synthesize several software projects to provide all of the needed functionality. Another challenge for the developer will be orchestrating the use of several software components. Consequently, the initial software development investment required to deploy an effective water resources cloud-based application can be substantial. The Tethys Platform has been developed to lower the technical barrier and minimize the initial development investment that prohibits many scientists and engineers from making use of the web app medium. Tethys synthesizes several software projects including PostGIS for spatial storage, 52°North WPS for spatial analysis, GeoServer for spatial publishing, Google Earth™, Google Maps™ and OpenLayers for spatial visualization, and Highcharts for plotting tabular data. The software selection came after a literature review of software projects being used to create existing earth sciences web apps. 
All of the software is linked via a Python-powered software development kit (SDK). Tethys developers use the SDK to build their apps and incorporate the needed functionality from the software suite. The presentation will include several apps that have been developed using Tethys to demonstrate its capabilities. Based upon work supported by the National Science Foundation under Grant No. 1135483.
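As a sketch of what a structured WPS workflow call looks like on the wire, the following builds an OGC WPS 1.0.0 Execute request using only the Python standard library. The process identifier ("subset_timeseries") and input name ("gauge_id") are hypothetical placeholders, not Tethys or 52°North API names:

```python
# Build an OGC WPS 1.0.0 Execute request document (stdlib only).
import xml.etree.ElementTree as ET

WPS = "http://www.opengis.net/wps/1.0.0"
OWS = "http://www.opengis.net/ows/1.1"

def execute_request(process_id, inputs):
    """Return the XML of a wps:Execute request with literal inputs."""
    ET.register_namespace("wps", WPS)
    ET.register_namespace("ows", OWS)
    root = ET.Element(f"{{{WPS}}}Execute", {"service": "WPS", "version": "1.0.0"})
    ET.SubElement(root, f"{{{OWS}}}Identifier").text = process_id
    data_inputs = ET.SubElement(root, f"{{{WPS}}}DataInputs")
    for name, value in inputs.items():
        inp = ET.SubElement(data_inputs, f"{{{WPS}}}Input")
        ET.SubElement(inp, f"{{{OWS}}}Identifier").text = name
        data = ET.SubElement(inp, f"{{{WPS}}}Data")
        ET.SubElement(data, f"{{{WPS}}}LiteralData").text = str(value)
    return ET.tostring(root, encoding="unicode")

# hypothetical process and input names, for illustration only
xml_doc = execute_request("subset_timeseries", {"gauge_id": "huc-0101"})
```

In a deployed app this document would be POSTed to the WPS endpoint; here it only illustrates the request structure that a platform like Tethys hides behind its SDK.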
Thermal comfort zone of the hands, feet and head in males and females.
Ciuha, Urša; Mekjavic, Igor B
2017-10-01
The present study compared the thermal comfort zones (TCZ) of the hands, feet and head in eight male and eight female participants, assessed with water-perfused segments (WPS). On separate occasions, separated by a minimum of one day, participants were requested to regulate the temperature of three distal skin regions (hands, feet and head) within their TCZ. On each occasion they donned a specific water-perfused segment (WPS), either gloves, socks or a hood, for assessing the TCZ of the hands, feet and head, respectively. In the absence of regulation, the temperature of the water perfusing the WPS changed in a saw-tooth manner from 10 to 50°C; by depressing a switch and reversing the direction of the temperature change at the limits of the TCZ, each participant defined the TCZ for each skin region investigated. The range of regulated temperatures (upper and lower limits of the TCZ) did not differ between the studied skin regions or between genders. Participants however maintained higher head (35.7±0.4°C; p<0.001) skin temperature (Tsk) compared to the hands (34.5±0.8°C) and feet (33.8±1.1°C). When exposed to normothermic conditions, distal skin regions do not differ in the ranges of temperatures perceived as thermally comfortable. Copyright © 2017. Published by Elsevier Inc.
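The saw-tooth protocol above can be sketched as a simple control loop: the perfusate temperature ramps between 10 and 50°C, each button press reverses the ramp, and the reversal temperatures mark the comfort-zone limits. The comfort limits and ramp step below are illustrative values, not study data:

```python
def sawtooth_reversals(lower, upper, start=10.0, step=0.5, n_reversals=4):
    """Simulate the saw-tooth regulation: the temperature ramps until the
    participant 'presses the switch' at a perceived comfort limit, which
    reverses the ramp; the recorded reversal temperatures bound the TCZ."""
    t, direction, reversals = start, +1, []
    while len(reversals) < n_reversals:
        t += direction * step
        if direction > 0 and t >= upper:     # upper comfort limit reached
            reversals.append(t)
            direction = -1
        elif direction < 0 and t <= lower:   # lower comfort limit reached
            reversals.append(t)
            direction = +1
    return reversals

revs = sawtooth_reversals(lower=30.0, upper=36.0)
```

The recorded reversals alternate between the upper and lower limits, which is exactly the information the study extracts as the TCZ.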
White, M.A.; Schmidt, J.C.; Topping, D.J.
2005-01-01
Wavelet analysis is a powerful tool with which to analyse the hydrologic effects of dam construction and operation on river systems. Using continuous records of instantaneous discharge from the Lees Ferry gauging station and records of daily mean discharge from upstream tributaries, we conducted wavelet analyses of the hydrologic structure of the Colorado River in Grand Canyon. The wavelet power spectrum (WPS) of daily mean discharge provided a highly compressed and integrative picture of the post-dam elimination of pronounced annual and sub-annual flow features. The WPS of the continuous record showed the influence of diurnal and weekly power generation cycles, shifts in discharge management, and the 1996 experimental flood in the post-dam period. Normalization of the WPS by local wavelet spectra revealed the fine structure of modulation in discharge scale and amplitude and provides an extremely efficient tool with which to assess the relationships among hydrologic cycles and ecological and geomorphic systems. We extended our analysis to sections of the Snake River and showed how wavelet analysis can be used as a data mining technique. The wavelet approach is an especially promising tool with which to assess dam operation in less well-studied regions and to evaluate management attempts to reconstruct desired flow characteristics. Copyright © 2005 John Wiley & Sons, Ltd.
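The idea behind a wavelet power spectrum can be illustrated with a minimal single-scale Morlet transform in pure Python; a schematic of the method on a toy "diurnal cycle" signal, not the authors' analysis code:

```python
import cmath, math

def morlet(t, scale, w0=6.0):
    """Morlet wavelet (complex sinusoid under a Gaussian envelope)."""
    x = t / scale
    return cmath.exp(1j * w0 * x) * math.exp(-0.5 * x * x) / math.sqrt(scale)

def wavelet_power(signal, scale):
    """|CWT|^2 at one scale, by direct convolution (O(n^2), fine for a demo)."""
    n = len(signal)
    power = []
    for k in range(n):
        acc = sum(signal[m] * morlet(m - k, scale).conjugate() for m in range(n))
        power.append(abs(acc) ** 2)
    return power

# a pure daily cycle sampled hourly for ten days
sig = [math.sin(2 * math.pi * h / 24) for h in range(240)]
scale_24 = 24 * 6.0 / (2 * math.pi)   # scale whose centre frequency matches the cycle
p_match = wavelet_power(sig, scale_24)
p_off = wavelet_power(sig, 5.0)       # mismatched scale: little power
```

Power concentrates at the scale matched to the 24-sample cycle, which is exactly how a WPS exposes diurnal and weekly operation cycles in a discharge record.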
Automated Work Package: Conceptual Design and Data Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al Rashdan, Ahmad; Oxstrand, Johanna; Agarwal, Vivek
The automated work package (AWP) is one of the U.S. Department of Energy’s (DOE) Light Water Reactor Sustainability Program efforts to enhance the safety and economics of the nuclear power industry. An AWP is an adaptive and interactive work package that intelligently drives the work process according to the plant condition, resource status, and user progress. The AWP aims to automate several manual tasks of the work process to enhance human performance and reduce human errors. Electronic work packages (eWPs), studied by the Electric Power Research Institute (EPRI), are work packages that rely to varying extents on electronic data processing and presentation. AWPs are the future of eWPs. They are envisioned to incorporate the advanced technologies of the future, and thus address the unresolved deficiencies associated with the eWPs in a nuclear power plant. In order to define the AWP, it is necessary to develop an ideal envisioned scenario of the future work process without any current technology restriction. The approach followed to develop this scenario is specific to every stage of the work process execution. The scenario development resulted in fifty advanced functionalities that can be part of the AWP. To rank the importance of these functionalities, a survey was conducted involving several U.S. nuclear utilities. The survey aimed at determining the current need of the nuclear industry with respect to the current work process, i.e. what the industry is satisfied with, and where the industry envisions potential for improvement. The survey evaluated the most promising functionalities resulting from the scenario development. The results demonstrated a significant desire to adopt the majority of these functionalities. The results of the survey are expected to drive the Idaho National Laboratory (INL) AWP research and development (R&D). In order to facilitate this mission, a prototype AWP is needed.
Since the vast majority of earlier efforts focused on the frontend aspects of the AWP, this effort researched and developed the backend data architecture. The key aspects of this design are hierarchy, data configuration and live information, data templates and instances, the flow of work package execution, the introduction of properties, and the means of interfacing the backend to the frontend. After the backend design was developed, a data structure was built to reflect the developed data architecture. The data structure was designed to accommodate the fifty functionalities identified by the envisioned scenario development and was evaluated by incorporating an example work order from the nuclear power industry. The implementation resulted in several optimization iterations of the data structure. In addition, rearranging the work order information to fit the data structure highlighted several possibilities for improvement in the current work order design and significantly reduced the size of the work order.
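The template/instance split and step-driven flow described above might be modeled along these lines; the field names are illustrative assumptions, not the INL schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StepTemplate:
    """Static definition of one work-package step (the 'template' side)."""
    step_id: str
    instruction: str
    properties: dict = field(default_factory=dict)   # e.g. required plant mode

@dataclass
class WorkPackageTemplate:
    wp_id: str
    steps: List[StepTemplate]

@dataclass
class StepInstance:
    """Live execution state for one step (the 'instance' side)."""
    template: StepTemplate
    status: str = "pending"          # pending -> complete

@dataclass
class WorkPackageInstance:
    template: WorkPackageTemplate
    steps: List[StepInstance] = field(default_factory=list)

    def __post_init__(self):
        if not self.steps:           # instantiate live state from the template
            self.steps = [StepInstance(s) for s in self.template.steps]

    def next_step(self):
        """Drive the workflow: first step that is not yet complete."""
        for s in self.steps:
            if s.status != "complete":
                return s
        return None

tmpl = WorkPackageTemplate("WO-0001", [
    StepTemplate("s1", "Isolate and tag the pump"),
    StepTemplate("s2", "Replace the seal"),
])
wp = WorkPackageInstance(tmpl)
```

The separation lets many instances share one configured template while each instance carries its own live status, which is the hierarchy/template/instance distinction the backend design calls out.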
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2016-12-01
We are extending climate analytics-as-a-service, including: (1) A high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets using advanced technologies like the Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files. (2) A Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections that are accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib. (3) An Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSlib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements that have been made to the Reanalysis Ensemble Service, including the following: - A CDSlib Python library that supports full temporal, spatial, and grid-based resolution services - A new reanalysis collections reference model to enable operator design and implementation - An enhanced library of sample queries to demonstrate and develop use case scenarios - Extended operators that enable single- and multiple-reanalysis area average, vertical average, re-gridding, and trend, climatology, and anomaly computations - Full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses - A prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use case scenarios and personalized project management - Prototyped uncertainty quantification services that combine ensemble products with comparative observational products - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extractions of trends, climatologies, and anomalies - The ability to compute and visualize multiple reanalysis intercomparisons
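Two of the operators named above, climatology and anomaly, reduce to a few lines; a toy pure-Python sketch of the computations (not the CDSlib API):

```python
def climatology(monthly, period=12):
    """Mean of each calendar month across all years in the series."""
    return [sum(monthly[m::period]) / len(monthly[m::period])
            for m in range(period)]

def anomalies(monthly, period=12):
    """Departure of each value from its calendar month's climatology."""
    clim = climatology(monthly, period)
    return [x - clim[i % period] for i, x in enumerate(monthly)]

# two identical annual cycles of monthly means -> anomalies are all zero
series = [5, 6, 9, 14, 18, 22, 25, 24, 20, 14, 9, 6] * 2
```

A real reanalysis service applies the same logic per grid cell after the area averaging and re-gridding steps.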
A tool for the estimation of the distribution of landslide area in R
NASA Astrophysics Data System (ADS)
Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.
2012-04-01
We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches to the estimation of the probability density and the frequency density of landslide area, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We tested the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings. In most cases the two models provided very similar results. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets. For some of the datasets, MLE failed to provide a result due to convergence problems. The two tested models (Double Pareto and Inverse Gamma) yielded very similar results for large and very large datasets (> 150 samples). Differences in the modeling results were observed for small datasets affected by systematic biases.
A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery. The tool can also be used to evaluate the probability density and the frequency density of landslide volume.
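Two of the non-parametric estimators named above (HDE and KDE) can be sketched in pure Python; this is a stand-in for the R/WPS tool, with a log-normal toy sample in place of a real landslide inventory:

```python
import math, random

def histogram_density(sample, n_bins=10):
    """Histogram Density Estimation: bar heights normalised so the
    histogram integrates to 1; returns (densities, bin_width)."""
    lo, hi = min(sample), max(sample)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for x in sample:
        counts[min(int((x - lo) / width), n_bins - 1)] += 1
    return [c / (len(sample) * width) for c in counts], width

def kde(sample, x, bandwidth):
    """Gaussian Kernel Density Estimate evaluated at point x."""
    n = len(sample)
    return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
               for s in sample) / (n * bandwidth * math.sqrt(2 * math.pi))

random.seed(0)
# heavy-tailed toy "landslide areas" in m^2 (a log-normal stand-in for the
# Double Pareto / Inverse Gamma models discussed above)
areas = [math.exp(random.gauss(7.0, 1.0)) for _ in range(500)]
dens, width = histogram_density(areas)
```

The MLE step of the tool would instead fit the Double Pareto or Inverse Gamma parameters directly; that fit is the part that can fail to converge on small samples, as noted above.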
Pouplin, Samuel; Roche, Nicolas; Hugeron, Caroline; Vaugier, Isabelle; Bensmail, Djamel
2016-02-01
For people with cervical spinal cord injury (SCI), access to computers can be difficult, thus several devices have been developed to facilitate their use. However, text input speed remains very slow compared to users who do not have a disability, even with these devices. Several methods have been developed to increase text input speed, such as word prediction software (WPS). Health-related professionals (HRP) often recommend this type of software to people with cervical SCI. WPS can be customized using different settings. It is likely that the settings used will influence the effectiveness of the software on text input speed. However, there is currently a lack of literature regarding professional practices for the setting of WPS as well as the impact for users. To analyze word prediction software settings used by HRP for people with cervical SCI. Prospective observational study. Garches, France; health-related professionals who recommend Word Prediction Software. A questionnaire was submitted to HRP who advise tetraplegic people regarding the use of communication devices. A total of 93 professionals responded to the survey. The most frequently recommended software was Skippy, a commercially available software. HRP rated the importance of the possibility to customise the settings as high. Moreover, they rated some settings as more important than others (P<0.001). However, except for the number of words displayed, each setting was configured by less than 50% of HRP. The results showed that there was a difference between the perception of the importance of some settings and data in the literature regarding the optimization of settings. Moreover, although some parameters were considered as very important, they were rarely specifically configured. Confidence in default settings and lack of information regarding optimal settings seem to be the main reasons for this discordance. 
This could also explain the disparate results of studies which evaluated the impact of WPS on text input speed in people with cervical SCI. Professionals tend to have confidence in default settings, despite the fact that they are not always appropriate for users. It thus seems essential to develop information networks and training to disseminate the results of studies and in consequence possibly improve communication for people with cervical SCI who use such devices.
LandEx - Fast, FOSS-Based Application for Query and Retrieval of Land Cover Patterns
NASA Astrophysics Data System (ADS)
Netzel, P.; Stepinski, T.
2012-12-01
The amount of satellite-based spatial data is continuously increasing, making the development of efficient data-search tools a priority. The bulk of existing research on searching satellite-gathered data concentrates on images and is based on the concept of Content-Based Image Retrieval (CBIR); however, available solutions are not efficient and robust enough to be put to use as deployable web-based search tools. Here we report on the development of a practical, deployable tool that searches classified, rather than raw, images. LandEx (Landscape Explorer) is a GeoWeb-based tool for Content-Based Pattern Retrieval (CBPR) within the National Land Cover Dataset 2006 (NLCD2006). The USGS-developed NLCD2006 is derived from Landsat multispectral images; it covers the entire conterminous U.S. with a resolution of 30 meters/pixel and depicts 16 land cover classes. The size of NLCD2006 is about 10 Gpixels (161,000 x 100,000 pixels). LandEx is a multi-tier GeoWeb application based on Open Source Software. Main components are: GeoExt/OpenLayers (user interface), GeoServer (OGC WMS, WCS and WPS server), and GRASS (calculation engine). LandEx performs search using a query-by-example approach: the user selects a reference scene (exhibiting a chosen pattern of land cover classes) and the tool produces, in real time, a map indicating the degree of similarity between the reference pattern and all local patterns across the U.S. The scene pattern is encapsulated by a 2D histogram of classes and sizes of single-class clumps. Pattern similarity is based on the notion of mutual information. The resultant similarity map can be viewed and navigated in a web browser, or it can be downloaded as a GeoTIFF file for more in-depth analysis. LandEx is available at http://sil.uc.edu
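The pattern encoding and similarity step might look as follows. The paper's exact mutual-information formulation is not reproduced here; this sketch bins clumps into a (class, log2-size) histogram and uses a Jensen-Shannon-based similarity, itself a form of mutual information, as a plausible stand-in:

```python
import math
from collections import Counter

def pattern_histogram(clumps):
    """Normalised histogram over (land-cover class, clump-size bin) pairs.
    `clumps` is a list of (class_id, clump_size_in_pixels) tuples; clump
    sizes are binned logarithmically via bit_length()."""
    binned = Counter((cls, size.bit_length()) for cls, size in clumps)
    total = sum(binned.values())
    return {k: v / total for k, v in binned.items()}

def js_similarity(p, q):
    """1 minus the Jensen-Shannon divergence (base 2) of two histograms:
    1.0 for identical patterns, 0.0 for patterns with disjoint support."""
    keys = set(p) | set(q)
    m = {k: 0.5 * (p.get(k, 0) + q.get(k, 0)) for k in keys}
    def kl(a):
        return sum(a.get(k, 0) * math.log2(a.get(k, 0) / m[k])
                   for k in keys if a.get(k, 0) > 0)
    return 1.0 - 0.5 * (kl(p) + kl(q))

ref = pattern_histogram([(11, 40), (41, 300), (41, 280)])
same = pattern_histogram([(11, 40), (41, 300), (41, 280)])
```

Sliding this comparison over every local window of the 10-Gpixel raster is what produces the nationwide similarity map.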
NASA Astrophysics Data System (ADS)
Vahidi, H.; Mobasheri, A.; Alimardani, M.; Guan, Q.; Bakillah, M.
2014-04-01
Providing early mental health services during disaster is a great challenge in the disaster response phase. Lack of access to adequate mental-health professionals in the early stages of large-scale disasters dramatically influences the trend of a successful mental health aid. In this paper, a conceptual framework has been suggested for adopting cellphone-type tele-operated android robots in the early stages of disasters for providing early mental health services to disaster survivors by developing a location-based and participatory approach. The techniques of enabling GI-services in a Peer-to-Peer (P2P) environment were studied to overcome the limitations of current centralized services. Therefore, the aim of this research study is to add more flexibility and autonomy to GI web services (WMS, WFS, WPS, etc.) and alleviate to some degree the inherent limitations of these centralized systems. A P2P system architecture is presented for the location-based service using minimalistic tele-operated android robots, and some key techniques of implementing this service using BestPeer were studied for developing this framework.
UPCOMING CHANGES: THE CHANGES LISTED BELOW WILL BE INCORPORATED INTO THE MODEL ON THE EFFECTIVE CHANGE /WPS/. IF YOU HAVE ANY QUESTIONS CONCERNING THESE CHANGES, PLEASE CONTACT: GEOFF DIMEGO, NCEP
Pesticide Worker Protection Standard “How to Comply” Manual
The “How to Comply” manual provides information for employers on complying with the worker protection standard for agricultural workers. It has been updated to cover the 2015 revisions to the WPS.
Methyl Bromide and Chloropicrin Safety Information for Handlers
Labels for these kinds of pesticides require that soil fumigant handlers receive safe handling training before participating in field fumigation, according to the Worker Protection Standard (WPS) and Good Agricultural Practices (GAPs).
Midas® Fumigant Safe Handling Guide
Handlers or applicators should wear personal protective equipment including respirator and chemical-resistant gloves when working with this soil fumigant, be trained according to the Worker Protection Standard (WPS), and know signs of pesticide exposure.
Sparling, Alica Stubnova; Martin, David W; Posey, Lillian B
2017-06-14
Citing a lack of information, the U.S. Environmental Protection Agency prudently did not account for the benefits of averting many chronic diseases in analyzing the Worker Protection Standards (WPS) revisions. We demonstrate that sufficient information can exist, using the example of the benefits to agricultural workers of reduced Parkinson's disease (PD) due to reduced pesticide exposure. We define the benefits as the monetary value gained by improving the quality of life of people who would otherwise develop PD, plus the value of medical care costs averted and income not lost due to being healthy. For estimation, we use readily available parameters and obtain odds ratios of developing PD by conducting a meta-analysis of studies linking pesticide exposure to PD. The sensitivity analysis varies the number of agricultural workers affected by the regulation, the probability of being diagnosed with PD, and the measurement and timing of the benefits. Our initial assessment is that the reduced-PD benefits would be a small fraction of the total WPS revision costs. However, if we define benefits as the common environmental economics willingness to pay to avoid PD incidence, then they become a substantial fraction of the costs. Our analysis demonstrates that the benefits of averting PD from the WPS revisions can be estimated using existing information, and that the results are most sensitive to the choice of valuation of benefits to the worker. We encourage other researchers to extend our framework to other chronic ailments.
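The benefit definition above is, at its core, simple arithmetic: workers covered, times the change in PD probability, times a per-case value. A back-of-the-envelope sketch in which every number is a placeholder, not the authors' estimate:

```python
def averted_pd_benefit(n_workers, baseline_risk, odds_ratio, per_case_value):
    """Expected monetary benefit of averting PD cases.
    The exposed-worker risk is approximated as baseline_risk * odds_ratio
    (a rare-disease approximation); per_case_value stands in for either
    the willingness-to-pay figure or medical costs averted plus income
    preserved, the two valuations contrasted in the abstract."""
    exposed_risk = baseline_risk * odds_ratio
    cases_averted = n_workers * (exposed_risk - baseline_risk)
    return cases_averted * per_case_value

# placeholder inputs: 1M covered workers, 0.5% lifetime baseline risk,
# meta-analytic OR of 1.5, $100k valuation per averted case
benefit = averted_pd_benefit(n_workers=1_000_000, baseline_risk=0.005,
                             odds_ratio=1.5, per_case_value=100_000)
```

Swapping the per-case value between cost-of-illness and willingness-to-pay figures is what moves the result from a small to a substantial fraction of the regulation's costs, mirroring the sensitivity result reported above.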
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, enables the creation and management of large distributed computational jobs, and assures a certificate-based security level for communication and transfer of messages. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols, the advantages offered by Grid technology - such as providing secure interoperability between distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain represented by the OGC Web services with the Grid domain represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality and the high-power computation and security of the Grid, high spatial model resolution and geographical area coverage, and flexible combination and interoperability of the geographical models.
In accordance with Service Oriented Architecture concepts and the requirements for interoperability between geospatial and Grid infrastructures, each main piece of functionality is visible from the enviroGRIDS Portal and, consequently, from end-user applications such as Decision Maker/Citizen oriented Applications. The enviroGRIDS portal is the user's single point of entry into the system, and the portal presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
1,3-Dichloropropene and Chloropicrin Combination Products Fumigant Safe Handling Guide
These soil fumigant pesticide products' labels require safety training according to the Worker Protection Standard (WPS). Steps to mitigate exposure include air monitoring, respiratory protection, and proper tarp perforation and removal.
Hall, Joanne M.; Fields, Becky
2015-01-01
Perceived racism contributes to persistent health stress leading to health disparities. African American/Black persons (BPs) believe subtle, rather than overt, interpersonal racism is increasing. Sue and colleagues describe interpersonal racism as racial microaggressions: “routine” marginalizing indignities by White persons (WPs) toward BPs that contribute to health stress. In this narrative, exploratory study, Black adults (n = 10) were asked about specific racial microaggressions; they all experienced multiple types. Categorical and narrative analysis captured interpretations, strategies, and health stress attributions. Six iconic narratives contextualized health stress responses. Diverse mental and physical symptoms were attributed to racial microaggressions. Few strategies in response had positive outcomes. Future research includes development of coping strategies for BPs in these interactions, exploration of WPs awareness of their behaviors, and preventing racial microaggressions in health encounters that exacerbate health disparities. PMID:28462310
Comprehensive genetic dissection of wood properties in a widely-grown tropical tree: Eucalyptus
2011-01-01
Background Eucalyptus is an important genus in industrial plantations throughout the world and is grown for use as timber, pulp, paper and charcoal. Several breeding programmes have been launched worldwide to concomitantly improve growth performance and wood properties (WPs). In this study, an interspecific cross between Eucalyptus urophylla and E. grandis was used to identify major genomic regions (Quantitative Trait Loci, QTL) controlling the variability of WPs. Results Linkage maps were generated for both parent species. A total of 117 QTLs were detected for a series of wood and end-use related traits, including chemical, technological, physical, mechanical and anatomical properties. The QTLs were mainly clustered into five linkage groups. In terms of distribution of QTL effects, our result agrees with the typical L-shape reported in most QTL studies, i.e. most WP QTLs had limited effects and only a few (13) had major effects (phenotypic variance explained > 15%). The co-locations of QTLs for different WPs as well as QTLs and candidate genes are discussed in terms of phenotypic correlations between traits, and of the function of the candidate genes. The major wood property QTL harbours a gene encoding a Cinnamoyl CoA reductase (CCR), a structural enzyme of the monolignol-specific biosynthesis pathway. Conclusions Given the number of traits analysed, this study provides a comprehensive understanding of the genetic architecture of wood properties in this Eucalyptus full-sib pedigree. At the dawn of the Eucalyptus genome sequence era, it will provide a framework to identify the nature of genes underlying these important quantitative traits. PMID:21651758
A Walk through TRIDEC's intermediate Tsunami Early Warning System
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Reißland, S.; Lendholt, M.
2012-04-01
The management of natural crises is an important application field of the technology developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme. TRIDEC is based on the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS) providing a service platform for both sensor integration and warning dissemination. In TRIDEC new developments in Information and Communication Technology (ICT) are used to extend the existing platform realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region. The TRIDEC system will be implemented in three phases, each with a demonstrator. Successively, the demonstrators are addressing challenges, such as the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulation tools and data fusion tools. In addition to conventional sensors also unconventional sensors and sensor networks play an important role in TRIDEC. The system version presented is based on service-oriented architecture (SOA) concepts and on relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. 
The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Spatial data served via the OGC Web Map Service (WMS) and Web Feature Service (WFS) are utilized to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP) together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE). The first system demonstrator has been designed and implemented to support plausible scenarios demonstrating the treatment of simulated tsunami threats with an essential subset of a National Tsunami Warning Centre (NTWC). The feasibility and the potential of the implemented approach are demonstrated, covering standard operations as well as tsunami detection and alerting functions. The demonstrator presented addresses information management and decision-support processes in a hypothetical natural crisis situation caused by a tsunami in the Eastern Mediterranean. Development of the system is based to the largest extent on free and open source software (FOSS) components and industry standards. Emphasis has been and will be placed on leveraging open source technologies that support mature system architecture models wherever appropriate. All open source software produced is foreseen to be published in a publicly available software repository, thus allowing others to reuse the results achieved and enabling further development and collaboration with a wide community including scientists, developers, users and stakeholders. 
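The CAP dissemination step mentioned above can be illustrated with a minimal sketch. The element names and the controlled values (status, msgType, scope, category, urgency, severity, certainty) follow the OASIS CAP 1.2 vocabulary; the identifier, sender and headline are invented for illustration, and the record is not schema-validated:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def build_cap_alert(identifier, sender, event, headline):
    """Assemble a minimal CAP 1.2 alert skeleton (sketch, not schema-validated)."""
    alert = ET.Element('alert', {'xmlns': 'urn:oasis:names:tc:emergency:cap:1.2'})
    for tag, text in [
        ('identifier', identifier),
        ('sender', sender),
        ('sent', datetime.now(timezone.utc).strftime('%Y-%m-%dT%H:%M:%S%z')),
        ('status', 'Exercise'),   # exercise message, as in a NEAMWave-style drill
        ('msgType', 'Alert'),
        ('scope', 'Restricted'),
    ]:
        ET.SubElement(alert, tag).text = text
    info = ET.SubElement(alert, 'info')
    for tag, text in [('category', 'Geo'), ('event', event),
                      ('urgency', 'Immediate'), ('severity', 'Extreme'),
                      ('certainty', 'Observed'), ('headline', headline)]:
        ET.SubElement(info, tag).text = text
    return ET.tostring(alert, encoding='unicode')

# hypothetical exercise message
xml_msg = build_cap_alert('TWC-2012-0001', 'ntwc@example.org',
                          'Tsunami', 'Tsunami Watch - Eastern Mediterranean')
print(xml_msg)
```

In a real deployment the CAP payload would be wrapped in an EDXL-DE distribution element carrying the addressing information before transmission.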
This live demonstration is linked with the talk "TRIDEC Natural Crisis Management Demonstrator for Tsunamis" (EGU2012-7275) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.7/ESSI1.7).
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Lendholt, Matthias; Reißland, Sven; Schulz, Jana
2013-04-01
On November 27-28, 2012, the Kandilli Observatory and Earthquake Research Institute (KOERI) and the Portuguese Institute for the Sea and Atmosphere (IPMA) joined other countries in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region as participants in an international tsunami response exercise. The exercise, titled NEAMWave12, simulated widespread Tsunami Watch situations throughout the NEAM region. It was the first international exercise of its kind in this region, in which the UNESCO-IOC ICG/NEAMTWS tsunami warning chain was tested at full scale with different systems. One of these systems is developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC) and has been validated in this exercise by, among others, KOERI and IPMA. In TRIDEC, new developments in Information and Communication Technology (ICT) are used to extend the existing platform, realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the NEAM region. The TRIDEC system will be implemented in three phases, each with a demonstrator; successively, the demonstrators address related challenges. The first- and second-phase system demonstrators, deployed in KOERI's crisis management room and at IPMA, have been designed and implemented, firstly, to support plausible scenarios for the Turkish and Portuguese NTWCs to demonstrate the treatment of simulated tsunami threats with an essential subset of an NTWC. Secondly, the feasibility and the potential of the implemented approach are demonstrated, covering ICG/NEAMTWS standard operations as well as tsunami detection and alerting functions beyond ICG/NEAMTWS requirements. 
The demonstrator presented addresses information management and decision-support processes for hypothetical tsunami-related crisis situations in the context of the ICG/NEAMTWS NEAMWave12 exercise, for the Turkish and Portuguese tsunami exercise scenarios. Impressions gained with the standards-compliant TRIDEC system during the exercise will be reported. The system version presented is based on event-driven architecture (EDA) and service-oriented architecture (SOA) concepts and makes use of relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Spatial data served via the OGC Web Map Service (WMS) and Web Feature Service (WFS) are utilized to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP) together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE). This demonstration is linked with the talk 'Experiences with TRIDEC's Crisis Management Demonstrator in the Turkish NEAMWave12 exercise tsunami scenario' (EGU2013-2833) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.6).
NASA Astrophysics Data System (ADS)
Mihajlovski, Andrej; Plieger, Maarten; Som de Cerff, Wim; Page, Christian
2016-04-01
The CLIPC project is developing a portal to provide a single point of access for scientific information on climate change. This is made possible through the Copernicus Earth Observation Programme for Europe, which will deliver a new generation of environmental measurements of climate quality. The data about the physical environment which is used to inform climate change policy and adaptation measures comes from several categories: satellite measurements, terrestrial observing systems, model projections and simulations and from re-analyses (syntheses of all available observations constrained with numerical weather prediction systems). These data categories are managed by different communities: CLIPC will provide a single point of access for the whole range of data. The CLIPC portal will provide a number of indicators showing impacts on specific sectors which have been generated using a range of factors selected through structured expert consultation. It will also, as part of the transformation services, allow users to explore the consequences of using different combinations of driving factors which they consider to be of particular relevance to their work or life. The portal will provide information on the scientific quality and pitfalls of such transformations to prevent misleading usage of the results. The CLIPC project will develop an end to end processing chain (indicator tool kit), from comprehensive information on the climate state through to highly aggregated decision relevant products. Indicators of climate change and climate change impact will be provided, and a tool kit to update and post process the collection of indicators will be integrated into the portal. The CLIPC portal has a distributed architecture, making use of OGC services provided by e.g., climate4impact.eu and CEDA. CLIPC has two themes: 1. Harmonized access to climate datasets derived from models, observations and re-analyses 2. 
A climate impact tool kit to evaluate, rank and aggregate indicators. Key to both themes is the availability of standardized metadata describing indicator data and services; this enables standardization and interoperability between the different distributed services of CLIPC. A standardized metadata infrastructure is provided to disseminate CLIPC indicator data, transformed data products that enable impact assessments, and climate change impact indicators. The challenge is that compliance of existing metadata with the INSPIRE ISO standards and GEMINI standards needs to be extended so that the web portal can be generated from the available metadata blueprint. The information provided in the headers of netCDF files available through multiple catalogues allows us to generate ISO-compliant metadata, which is in turn used to generate web-based interface content as well as OGC-compliant web services: WCS and WMS for the front end, and WPS interactions for scientific users to combine and generate new datasets. The goal of the metadata infrastructure is to provide a blueprint for creating a data-driven science portal, generated from the underlying GIS data, web services and processing infrastructure. We will present the results and lessons learned.
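The idea of deriving discovery metadata from netCDF headers can be sketched as follows. The attribute names follow the common ACDD convention, the values are invented, and the flat record layout is purely illustrative: real INSPIRE/ISO 19115 records are far richer and schema-bound.

```python
import xml.etree.ElementTree as ET

# Hypothetical global attributes as they might appear in a netCDF header
# (ACDD-style names; values invented for illustration).
ncattrs = {
    'title': 'Number of summer days (SU), EUR-44 domain',
    'institution': 'KNMI',
    'keywords': 'climate indicator, summer days',
    'geospatial_lat_min': '27.0',
    'geospatial_lat_max': '72.0',
}

def attrs_to_record(attrs):
    """Map netCDF global attributes onto a simplified ISO-style discovery record."""
    rec = ET.Element('metadata')
    for iso_field, nc_key in [('title', 'title'),
                              ('pointOfContact', 'institution'),
                              ('descriptiveKeywords', 'keywords'),
                              ('southBoundLatitude', 'geospatial_lat_min'),
                              ('northBoundLatitude', 'geospatial_lat_max')]:
        if nc_key in attrs:                       # skip fields absent in the header
            ET.SubElement(rec, iso_field).text = attrs[nc_key]
    return ET.tostring(rec, encoding='unicode')

record = attrs_to_record(ncattrs)
print(record)
```

A real pipeline would harvest these attributes from THREDDS/ESGF catalogues and emit full ISO 19115/INSPIRE XML; the point here is only that the portal content can be generated mechanically from the header metadata.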
Prototyping an online wetland ecosystem services model using open model sharing standards
Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.
2011-01-01
Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from disparate sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.
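A process published under the OGC WPS standard is invoked through a small set of standard operations (GetCapabilities, DescribeProcess, Execute). As a minimal sketch, a WPS 1.0.0 key-value-pair Execute request for a hypothetical wetland water-storage process might be assembled like this; the endpoint, process identifier and inputs are all invented for illustration:

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint and process; real deployments publish their
# process identifiers via GetCapabilities/DescribeProcess.
endpoint = 'https://example.org/wps'
params = {
    'service': 'WPS',
    'version': '1.0.0',
    'request': 'Execute',
    'identifier': 'WetlandWaterStorage',
    'datainputs': 'basin=PPR-001;year=2010',
}
url = endpoint + '?' + urlencode(params)
print(url)
```

Note that `urlencode` percent-encodes the `=` and `;` inside `datainputs`; servers decode these when parsing the query string. Larger requests (e.g. with embedded GML geometries) are normally sent as XML via HTTP POST instead.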
Integrating Socioeconomic Data into GEOSS to Enable Societal Benefits
NASA Astrophysics Data System (ADS)
Chen, R. S.; Yetman, G.
2009-12-01
Achieving the GEOSS vision of societal benefits from Earth observation data is a multi-faceted challenge. Linking Earth observation systems into an interoperable system of systems is an important first step, but not sufficient on its own to fulfill the ambitious GEOSS goal of improving decision making for disaster mitigation, public health, ecosystem and resource management, agriculture, and the other societal benefit areas. Significant attention needs to be given to interdisciplinary data integration, especially with regard to incorporating data and information on human activities and welfare into monitoring, modeling, and prediction activities. For example, the ability to assess, monitor, and predict the risks posed by different natural hazards is predicated on an understanding of the underlying exposure and vulnerability of different human populations and their economic assets to past, present, and future hazardous events. The NASA Socioeconomic Data and Applications Center (SEDAC) has pioneered the integration of socioeconomic data with remote sensing data within the NASA Earth Observing System Data and Information System (EOSDIS) and has contributed actively to both phase 1 and 2 of the GEOSS Architecture Implementation Pilot. We present here several use cases for socioeconomic data integration in GEOSS and recent experience in developing an interoperable Web Processing Service (WPS) for estimating population exposure as part of the GEOSS initial operating capability. We also discuss key scientific, technical, and policy challenges to developing GEOSS products and services that will be able to meet the needs of both interdisciplinary and applied users and in so doing help achieve the GEOSS goal of generating significant societal benefits.
Havla, Lukas; Schneider, Moritz J; Thierfelder, Kolja M; Beyer, Sebastian E; Ertl-Wagner, Birgit; Reiser, Maximilian F; Sommer, Wieland H; Dietrich, Olaf
2016-02-01
The purpose of this study was to propose and evaluate a new wavelet-based technique for classification of arterial and venous vessels using time-resolved cerebral CT perfusion data sets. Fourteen consecutive patients (mean age 73 yr, range 17-97) with suspected stroke but no pathology in follow-up MRI were included. A CT perfusion scan with 32 dynamic phases was performed during intravenous bolus contrast-agent application. After rigid-body motion correction, a Paul wavelet (order 1) was used to calculate voxelwise the wavelet power spectrum (WPS) of each attenuation-time course. The angiographic intensity A was defined as the maximum of the WPS, located at the coordinates T (time axis) and W (scale/width axis) within the WPS. Using these three parameters (A, T, W) separately as well as combined by (1) Fisher's linear discriminant analysis (FLDA), (2) logistic regression (LogR) analysis, or (3) support vector machine (SVM) analysis, their potential to classify 18 different arterial and venous vessel segments per subject was evaluated. The best vessel classification was obtained using all three parameters A and T and W [area under the curve (AUC): 0.953 with FLDA and 0.957 with LogR or SVM]. In direct comparison, the wavelet-derived parameters provided performance at least equal to conventional attenuation-time-course parameters. The maximum AUC obtained from the proposed wavelet parameters was slightly (although not statistically significantly) higher than the maximum AUC (0.945) obtained from the conventional parameters. A new method to classify arterial and venous cerebral vessels with high statistical accuracy was introduced based on the time-domain wavelet transform of dynamic CT perfusion data in combination with linear or nonlinear multidimensional classification techniques.
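The feature extraction described above (the wavelet power spectrum maximum A and its time/scale coordinates T and W) can be sketched with NumPy. The Paul mother wavelet follows the standard Torrence & Compo time-domain form; the CT-specific preprocessing, motion correction and the FLDA/LogR/SVM classification steps are omitted, and the attenuation-time course is synthetic:

```python
import math
import numpy as np

def paul_wavelet(eta, m=1):
    """Paul mother wavelet of order m (time domain, Torrence & Compo form)."""
    norm = (2.0**m * 1j**m * math.factorial(m)) / math.sqrt(math.pi * math.factorial(2 * m))
    return norm * (1.0 - 1j * eta) ** (-(m + 1))

def wps_peak(signal, scales, dt=1.0):
    """Return (A, T, W): the WPS maximum, its time, and its scale."""
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt              # wavelet support centred on 0
    power = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        psi = paul_wavelet(t / s) / np.sqrt(s)
        # correlate signal with the scaled wavelet (convolve with reversed kernel)
        w = np.convolve(signal, np.conj(psi)[::-1], mode='same')
        power[i] = np.abs(w) ** 2                 # wavelet power spectrum
    k, j = np.unravel_index(np.argmax(power), power.shape)
    return power[k, j], j * dt, scales[k]

# synthetic attenuation-time course: a bolus-like bump centred at t = 20
ts = np.arange(64, dtype=float)
curve = np.exp(-0.5 * ((ts - 20.0) / 3.0) ** 2)
A, T, W = wps_peak(curve, scales=np.array([1.0, 2.0, 4.0, 8.0]))
print(A, T, W)
```

The recovered T sits near the bolus arrival, which is what makes (A, T, W) usable as arterial-versus-venous features: venous curves peak later and broader than arterial ones.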
PRN 93-11: Supplemental Guidance for PR Notice 93-7 - Labeling Revisions Required by the WPS
This pesticide registration notice augments the guidance provided by PR notice 97-3 to provide options that you may choose to allow efficient production and distribution of products that comply with PR Notice 93-7.
77 FR 25161 - Combined Notice of Filings #2
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-27
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice that the Commission received the following electric rate filings: Docket Numbers: ER10-1484-002... Numbers: ER12-1573-000. Applicants: The Detroit Edison Company. Description: Thumb Electric WPS-2 Service...
Project #OPE-FY17-0008, Feb 9, 2017. The EPA OIG plans to begin research to evaluate EPA's management controls implementing the revised Worker Protection Standard (WPS) requirements to reduce pesticide exposure and risks to agricultural workers.
Subsurface Contamination Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Y. Yuan
There are two objectives of this report, ''Subsurface Contamination Control''. The first is to provide a technical basis for recommending limiting radioactive contamination levels (LRCL) on the external surfaces of waste packages (WP) for acceptance into the subsurface repository. The second is to provide an evaluation of the magnitude of potential releases from a defective WP and the detectability of the released contents. The technical basis for deriving LRCL has been established in ''Retrieval Equipment and Strategy for WP on Pallet'' (CRWMS M and O 2000g, 6.3.1). This report updates the derivation by incorporating the latest design information of the subsurface repository for site recommendation. The derived LRCL on the external surface of WPs, therefore, supersede those described in CRWMS M and O 2000g. The derived LRCL represent the average concentrations of contamination on the external surfaces of each WP that must not be exceeded before the WP is to be transported to the subsurface facility for emplacement. The evaluation of potential releases is necessary to control the potential contamination of the subsurface repository and to detect prematurely failed WPs. The detection of failed WPs is required in order to provide reasonable assurance that the integrity of each WP is intact prior to MGR closure. An emplaced WP may become breached due to manufacturing defects or improper welding combined with failure to detect the defect, by corrosion, or by mechanical penetration due to accidents or rockfall conditions. The breached WP may release its gaseous and volatile radionuclide content to the subsurface environment and result in contaminating the subsurface facility. The scope of this analysis is limited to radioactive contaminants resulting from breached WPs during the preclosure period of the subsurface repository. 
This report: (1) documents a method for deriving LRCL on the external surfaces of WPs for acceptance into the subsurface repository; (2) provides a table of derived LRCL for nuclides of radiological importance; (3) provides an as low as is reasonably achievable (ALARA) evaluation of the derived LRCL by comparing potential onsite and offsite doses to documented ALARA requirements; (4) provides a method for estimating potential releases from a defective WP; (5) provides an evaluation of potential radioactive releases from a defective WP that may become airborne and result in contamination of the subsurface facility; and (6) provides a preliminary analysis of the detectability of a potential WP leak to support the design of an airborne release monitoring system.
On January 11, 1995, EPA published a draft policy on Reduced Restricted Entry Intervals for Certain Pesticides, in the Federal Register. The final policy was published in the Federal Register on May 3, 1995. This Notice contains the final policy.
78 FR 1883 - Notice of Lodging of Proposed Consent Decree Under the Clean Air Act
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-09
... require WPS to reduce harmful emissions of sulfur dioxide (``SO 2 ''), nitrogen oxides (``NO X ''), and... addressed to the Assistant Attorney General, Environment and Natural Resources Division, and should refer to... Chief, Environmental Enforcement Section, Environment and Natural Resources Division. [FR Doc. 2013...
Virtual Exploitation Environment Demonstration for Atmospheric Missions
NASA Astrophysics Data System (ADS)
Natali, Stefano; Mantovani, Simone; Hirtl, Marcus; Santillan, Daniel; Triebnig, Gerhard; Fehr, Thorsten; Lopes, Cristiano
2017-04-01
The scientific and industrial communities are being confronted with a strong increase of Earth Observation (EO) satellite missions and related data. This is in particular the case for the Atmospheric Sciences communities, with the upcoming Copernicus Sentinel-5 Precursor, Sentinel-4, -5 and -3, and ESA's Earth Explorer scientific satellites ADM-Aeolus and EarthCARE. The challenge is not only to manage the large volume of data generated by each mission / sensor, but to process and analyze the data streams. Creating synergies among the different datasets will be key to exploiting the full potential of the available information. As a preparation activity supporting scientific data exploitation for Earth Explorer and Sentinel atmospheric missions, ESA funded the "Technology and Atmospheric Mission Platform" (TAMP) [1] [2] project; a scientific and technological forum (STF) has been set up involving relevant European entities from different scientific and operational fields to define the platform's requirements. Data access, visualization, processing and download services have been developed to satisfy users' needs; use cases defined with the STF, such as the study of SO2 emissions for the Holuhraun eruption (2014) by means of two numerical models, two satellite platforms and ground measurements, global aerosol analyses from long time series of satellite data, and local aerosol analysis using satellite and LIDAR data, have been implemented to ensure acceptance of TAMP by the atmospheric sciences community. The platform pursues the "virtual workspace" concept: all resources (data, processing, visualization, collaboration tools) are provided as "remote services", accessible through a standard web browser, to avoid the download of big data volumes and to allow utilization of the provided infrastructure for computation, analysis and sharing of results. Data access and processing are achieved through standardized protocols (WCS, WPS). 
As an evolution toward a pre-operational environment, the "Virtual Exploitation Environment Demonstration for Atmospheric Missions" (VEEDAM) aims at maintaining, running and evolving the platform, demonstrating e.g. the possibility to perform massive processing over heterogeneous data sources. This work presents the VEEDAM concepts and provides pre-operational examples, emphasizing the interoperability achievable by exposing standardized data access and processing services (e.g. making data and processing resources accessible from different VREs). [1] TAMP platform landing page http://vtpip.zamg.ac.at/ [2] TAMP introductory video https://www.youtube.com/watch?v=xWiy8h1oXQY
AWS-Glacier As A Storage Foundation For AWS-EC2 Hosted Scientific Data Services
NASA Astrophysics Data System (ADS)
Gallagher, J. H. R.; Potter, N.
2016-12-01
Using AWS Glacier as a base-level data store for a scientific data service presents new challenges for web-accessible data services, along with their software clients and human operators. All meaningful Glacier transactions take at least 4 hours to complete. This is in contrast to the various web APIs for data such as WMS, WFS, WCS, DAP2, and the NetCDF tools, which were all written on the premise that the response will be (nearly) immediate. Only DAP4 and WPS contain an explicit asynchronous component in their respective protocols which allows for "return later" behaviors. We were able to put Hyrax (a DAP4 server) in front of Glacier-held resources, but there were significant issues. Any kind of probing of the datasets happens at the cost of the Glacier retrieval period, 4 hours. A couple of crucial things fall out of this: The first is that the service must cache metadata, including coordinate map arrays, so that a client has enough information available in the "immediate" time frame to make decisions about what to ask for from the dataset. This type of request planning is important because a data access request will take 4 hours to complete unless the data resource has been cached. The second is that clients need to change their behavior when accessing datasets in an asynchronous system, even if the metadata is cached. Commonly, client applications will request a number of data components from a DAP2 service in the course of "discovering" the dataset. This may not be a well-supported model of interaction with Glacier or any other high-latency data store.
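The two consequences described above, cached metadata for immediate request planning plus an explicit asynchronous request/poll cycle for data, can be sketched with a toy store. Everything here is invented for illustration: latency is counted in polls rather than Glacier's ~4-hour jobs, and no real DAP4/WPS or AWS API is used.

```python
class HighLatencyStore:
    """Toy model of a Glacier-like store: metadata is cached and served
    immediately, while data retrieval is asynchronous and completes only
    after a delay (measured here in polls, not hours)."""

    def __init__(self, datasets, latency_polls=3):
        self._data = datasets
        # metadata cache built up front: enough to plan requests immediately
        self._meta = {name: {'vars': list(d)} for name, d in datasets.items()}
        self._jobs = {}
        self._latency = latency_polls

    def metadata(self, name):
        """Immediate: answered from the cache, never touches cold storage."""
        return self._meta[name]

    def request(self, name):
        """Start an asynchronous retrieval; returns a job token."""
        job_id = f'job-{len(self._jobs)}'
        self._jobs[job_id] = {'name': name, 'polls': 0}
        return job_id

    def poll(self, job_id):
        """Return the data once the latency has elapsed, else None."""
        job = self._jobs[job_id]
        job['polls'] += 1
        if job['polls'] < self._latency:
            return None                    # still pending: come back later
        return self._data[job['name']]

store = HighLatencyStore({'sst.nc': {'sst': [19.5, 20.1]}})
print(store.metadata('sst.nc'))            # plan the request from cached metadata
job = store.request('sst.nc')              # kick off the slow retrieval
result = None
while result is None:                      # "return later" client behavior
    result = store.poll(job)
print(result)
```

This is exactly the interaction pattern a DAP2-style client is not prepared for: its many small discovery requests would each pay the full retrieval latency, whereas a DAP4/WPS-style client can plan against the cached metadata and issue one deferred data request.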
2009-01-01
rc = raw correlation corrected for direct range restriction on the WPS (Case 2; Thorndike, 1949).
Empowering ELLs through Strong Community-School District Partnerships for Enrichment
ERIC Educational Resources Information Center
Rivera, Jessica; Donovan-Pendzic, Esperanza; Marion, Mary Jo
2015-01-01
The English Language Learner (ELL) Summer Camp in Worcester, Massachusetts--an intensive six-week program that served middle school and high school students from Worcester Public Schools (WPS)--was the product of a five-way partnership that included the school district, higher education institutions (Latino Education Institute [LEI] at Worcester…
NASA Astrophysics Data System (ADS)
Wang, Lei-Ming; Zhang, Lingxiao; Seideman, Tamar; Petek, Hrvoje
2012-10-01
We study by numerical simulations the excitation and propagation dynamics of coupled surface plasmon polariton (SPP) wave packets (WPs) in optically thin Ag films and a bulk Ag/vacuum interface under the illumination of a subwavelength slit by 400 nm continuous wave (cw) and femtosecond pulsed light. The generated surface fields include contributions from both SPPs and quasicylindrical waves, which dominate in different regimes. We explore aspects of the coupled SPP modes in Ag thin films, including symmetry, propagation, attenuation, and the variation of coupling with incident angle and film thickness. Simulations of the electromagnetic transients initiated with femtosecond pulses reveal new features of coupled SPP WP generation and propagation in thin Ag films. Our results show that, under pulsed excitation, the SPP modes in an Ag thin film break up into two distinct bound surface wave packets characterized by marked differences in symmetries, group velocities, attenuation lengths, and dispersion properties. The nanometer spatial and femtosecond temporal scale excitation and propagation dynamics of the coupled SPP WPs are revealed in detail by movies recording the evolution of their transient field distributions.
Liu, Fei; Jiang, Yanfeng; Du, Bingjian; Chai, Zhi; Jiao, Tong; Zhang, Chunyue; Ren, Fazheng; Leng, Xiaojing
2013-06-19
This paper describes an investigation into the properties of a doubly emulsified film incorporating protein-polysaccharide microcapsules, which serves as a multifunctional food packaging film prepared from common edible materials in place of petroleum-based plastics. The relationships between the microstructural properties and controlled-release features of a series of water-in-oil-in-water (W/O/W) microencapsulated edible films prepared under thermodynamically incompatible conditions were analyzed. The hydrophilic riboflavin (V(B2)) nano-droplets (13-50 nm) dispersed in the α-tocopherol (V(E)) oil phase were embedded in whey protein-polysaccharide (WPs) microcapsules with a shell thickness of 20-56 nm. These microcapsules were then integrated into 103-μm-thick WPs films. Different polysaccharides, including gum arabic (GA), low-methoxyl pectin (LMP), and κ-carrageenan (KCG), exhibited different in vitro synergistic effects on the films' ability to effect enteric controlled release of both vitamins. GA, which showed a strong emulsifying ability, also controlled V(E) better than the other polysaccharides did, while the highly charged KCG controlled V(B2) better than GA did.
NASA Astrophysics Data System (ADS)
Wu, Fu-Chun; Chang, Ching-Fu; Shiau, Jenq-Tzong
2015-05-01
The full range of natural flow regime is essential for sustaining the riverine ecosystems and biodiversity, yet there are still limited tools available for assessment of flow regime alterations over a spectrum of temporal scales. Wavelet analysis has proven useful for detecting hydrologic alterations at multiple scales via the wavelet power spectrum (WPS) series. The existing approach based on the global WPS (GWPS) ratio tends to be dominated by the rare high-power flows so that alterations of the more frequent low-power flows are often underrepresented. We devise a new approach based on individual deviations between WPS (DWPS) that are root-mean-squared to yield the global DWPS (GDWPS). We test these two approaches on the three reaches of the Feitsui Reservoir system (Taiwan) that are subjected to different classes of anthropogenic interventions. The GDWPS reveal unique features that are not detected with the GWPS ratios. We also segregate the effects of individual subflow components on the overall flow regime alterations using the subflow GDWPS. The results show that the daily hydropeaking waves below the reservoir not only intensified the flow oscillations at daily scale but most significantly eliminated subweekly flow variability. Alterations of flow regime were most severe below the diversion weir, where the residual hydropeaking resulted in a maximum impact at daily scale while the postdiversion null flows led to large hydrologic alterations over submonthly scales. The smallest impacts below the confluence reveal that the hydrologic alterations at scales longer than 2 days were substantially mitigated with the joining of the unregulated tributary flows, whereas the daily-scale hydrologic alteration was retained because of the hydropeaking inherited from the reservoir releases. 
The proposed DWPS approach unravels for the first time the details of flow regime alterations at these intermediate scales that are overridden by the low-frequency high-power flows when the long-term averaged GWPS are used.
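The contrast the abstract draws can be sketched in a few lines of numpy: the global WPS (GWPS) ratio time-averages the wavelet power per scale before comparing pre- and post-impact series, while the proposed GDWPS root-mean-squares the pointwise deviations, so alterations of frequent low-power flows are not swamped by rare high-power events. Function names and the toy arrays are assumptions; the actual analysis applies continuous wavelet transforms to daily discharge records:

```python
import numpy as np

def gwps_ratio(wps_pre, wps_post):
    """Global WPS ratio: time-averaged wavelet power per scale, post/pre."""
    return wps_post.mean(axis=1) / wps_pre.mean(axis=1)

def gdwps(wps_pre, wps_post):
    """Global deviation of WPS: RMS of pointwise power deviations per scale."""
    return np.sqrt(np.mean((wps_post - wps_pre) ** 2, axis=1))

# Toy wavelet-power arrays of shape (n_scales, n_times)
rng = np.random.default_rng(0)
pre = rng.random((5, 100))
post = pre.copy()
post[2] = 0.0  # variability at one scale is wiped out after the impact
ratios = gwps_ratio(pre, post)
deviations = gdwps(pre, post)
```

In the toy case only the altered scale shows a nonzero GDWPS, which is the per-scale sensitivity the authors exploit.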
Working together to ensure safety at hydro projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartel, J.W.
Providing for public safety around a hydroelectric facility can be critically important to the welfare of a hydropower producer. With this in mind, Wisconsin Electric Power Company and Wisconsin Public Service Corporation have worked together to develop consistent safety signage and warning systems for their hydro projects. Although the two utilities sometimes compete for electric customers, they cooperate to ensure the safety of those customers. Both WE and WPS took steps in 1986 to make their operations safer through involvement in the Wisconsin/Michigan Hydro User Group (HUG). The organization has 25 members, primarily electric utilities and paper companies, who operate hydro facilities in the two states. The two areas of public safety that the HUG studied were signs and warning systems. HUG established a sign committee to study how to increase the safety of people around hydro plants through signs, explained Ted Handrick, hydro plant superintendent at WPS. The committee's recommendations led to the development of a statewide uniform sign system adopted by all HUG members. The committee used the Wisconsin Department of Natural Resources' guidelines for warning signs and portages in developing the signage standards. HUG members are converting to these new sign standards as they replace old signs and/or install new signs. Notices describing the new signage system have been placed near each hydro plant, at boat landings, and in campgrounds. The signs are mounted well above ground level so they can be seen and easily read by recreationalists. Warning systems, in accordance with HUG warning standards, were installed at WE and WPS hydro facilities. These systems alert nearby recreational users to rapid increases in water flow when generating units are turned on or when spillway gates are opened.
Soon after the utilities installed equipment to remotely operate their hydro facilities, they experienced a dramatic increase in intrusions on dams and other structures at the projects.
NASA Astrophysics Data System (ADS)
Ohba, Masamichi; Nohara, Daisuke; Kadokura, Shinji
2016-04-01
Severe storms and other extreme weather events can interrupt the spinning of wind turbines at large scale, causing unexpected "wind ramp events". In this study, we present an application of self-organizing maps (SOMs) for climatological attribution of wind ramp events and their probabilistic prediction. The SOM is an automatic data-mining clustering technique that allows us to summarize a high-dimensional data space in terms of a set of reference vectors. The SOM is applied to analyze and connect the relationship between atmospheric patterns over Japan and wind power generation. The SOM is trained on sea level pressure derived from the JRA-55 reanalysis over the target area (the Tohoku region of Japan), whereby a two-dimensional lattice of weather patterns (WPs) classified over the 1977-2013 period is obtained. For comparison with the atmospheric data, long-term wind power generation is reconstructed using AMeDAS (Automated Meteorological Data Acquisition System), a high-resolution surface observation network in Japan. Our analysis extracts seven typical WPs that are linked to frequent occurrences of wind ramp events. Probabilistic forecasts of wind power generation and ramps are conducted using the obtained SOM. The probabilities are derived from multiple SOM lattices by matching output from the TIGGE multi-model global forecasts to the WPs on the lattices. Since this method effectively captures the empirical uncertainties in the historical data, wind power generation and ramps are probabilistically forecasted from the global model forecasts. The forecasts of wind power generation and ramp events show relatively good skill scores under this downscaling technique. We expect the results of this study to provide better guidance to the user community and to contribute to the future development of a system operation model for transmission grid operators.
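The SOM step can be illustrated with a minimal numpy implementation: each node of a 2-D lattice holds a reference vector (a candidate weather pattern), and training pulls the best-matching node and its lattice neighbors toward each sample. This is a generic sketch on random toy data, not the study's configuration (which used JRA-55 sea level pressure fields):

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=20, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal SOM: each lattice node holds a reference vector."""
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    weights = rng.random((n_nodes, data.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3  # shrinking neighborhood
        for x in data:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best match
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # lattice dist.
            h = np.exp(-d2 / (2 * sigma ** 2))                 # neighborhood
            weights += lr * h[:, None] * (x - weights)
    return weights

# Toy stand-in for daily pressure fields: 200 samples, 10 grid points each
rng = np.random.default_rng(1)
patterns = train_som(rng.random((200, 10)))
```

After training, each sample (or forecast field) is attributed to its best-matching node, which is how forecast output is matched to the WPs on the lattice.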
Gas tungsten arc welding of aluminum alloys 6XXX. Welding procedure specification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wodtke, C.H.; Frizzell, D.R.; Plunkett, W.A.
1985-08-01
Procedure WPS-1003 is qualified under Section IX of the ASME Boiler and Pressure Vessel Code for gas tungsten arc welding of aluminum alloys 6061 and 6063 (P-23), in thickness range 0.035 to 0.516 in.; filler metal is ER4043 (F-23) or ER5356 (F-22); shielding gas is argon.
ERIC Educational Resources Information Center
Seethaler, Pamela M.; Fuchs, Lynn S.; Fuchs, Douglas; Compton, Donald L.
2012-01-01
The purpose of this study was to assess the value of dynamic assessment (DA; degree of scaffolding required to learn unfamiliar mathematics content) for predicting 1st-grade calculations (CAs) and word problems (WPs) development, while controlling for the role of traditional assessments. Among 184 1st graders, predictors (DA, Quantity…
NASA Astrophysics Data System (ADS)
Arnhardt, C.; Fernandez-Steeger, T. M.; Walter, K.; Kallash, A.; Niemeyer, F.; Azzam, R.; Bill, R.
2007-12-01
The joint project Sensor based Landslide Early Warning System (SLEWS) aims at the systematic development of a prototype alarm and early warning system for the detection of mass movements by application of an ad hoc wireless sensor network (WSN). Next to the development of suitable sensor setups, sensor fusion and network fusion are applied to enhance data quality and reduce false alarm rates. Of special interest are data retrieval, processing, and visualization in GI systems. Therefore, a suitable service-based Spatial Data Infrastructure (SDI) will be developed with respect to existing and upcoming Open Geospatial Consortium (OGC) standards. The application of a WSN provides a cheap and easy-to-set-up solution for monitoring and data gathering in large areas. Measurement data from different low-cost transducers for deformation observation (acceleration, displacement, tilting) are collected by distributed sensor nodes (motes), which operate separately and connect to each other in a self-organizing manner. Data are collected and aggregated at the beacon (transmission station), where further operations such as data pre-processing and compression can be performed. The WSN concept provides not only energy efficiency, miniaturization, real-time monitoring, and remote operation, but also new monitoring strategies such as sensor and network fusion. Since multiple sensors can be integrated at a single mote, either cross-validation or redundant sensor setups are possible to enhance data quality. The planned monitoring and information system will include a mobile infrastructure (information technologies and communication components) as well as methods and models to estimate surface deformation parameters (positioning systems). The measurements result in heterogeneous observation sets that have to be integrated in a common adjustment and filtering approach.
Reliable real-time information will be obtained using a range of sensor inputs and algorithms, from which early warnings and prognoses may be derived. Implementation of sensor algorithms is an important task in forming the business logic, which will be represented in self-contained web-based processing services (WPS). In the future, different types of sensor networks will be able to communicate via an infrastructure of OGC services in an interoperable way, using standardized protocols such as the Sensor Model Language (SensorML) and the Observations & Measurements schema (O&M). Synchronous and asynchronous information services such as the Sensor Alert Service (SAS) and the Web Notification Service (WNS) will provide defined users and user groups with time-critical readings from the observation site. Techniques using services for visualizing mapping data (WMS), metadata (CSW), and vector (WFS) and raster data (WCS) will range from highly detailed expert-oriented output to fuzzy graphical warning elements. The expected result will be an advance over classical alarm and early warning systems, as WSNs are freely scalable, extensible, and easy to install.
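Client interaction with the OGC services named above is typically bootstrapped with a key-value-pair GetCapabilities request. A small sketch of how such a request URL is assembled for the WPS interface; the endpoint is hypothetical:

```python
from urllib.parse import urlencode

# Hypothetical service endpoint; parameter names follow the OGC WPS 1.0.0
# key-value-pair encoding for a GetCapabilities request.
BASE_URL = "https://example.org/wps"
params = {"service": "WPS", "request": "GetCapabilities", "version": "1.0.0"}
capabilities_url = f"{BASE_URL}?{urlencode(params)}"
```

The same pattern, with different `request` values (DescribeProcess, Execute), drives the rest of the WPS conversation.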
ERIC Educational Resources Information Center
Pavelko, Stacey L.; Owens, Robert E., Jr.
2017-01-01
Purpose: The purpose of this study was to document whether mean length of utterance (MLU[subscript S]), total number of words (TNW), clauses per sentence (CPS), and/or words per sentence (WPS) demonstrated age-related changes in children with typical language and to document the average time to collect, transcribe, and analyze conversational…
In Situ Synthesis of Gold Nanoparticles on Wool Powder and Their Catalytic Application.
Tang, Bin; Zhou, Xu; Zeng, Tian; Lin, Xia; Zhou, Ji; Ye, Yong; Wang, Xungai
2017-03-15
Gold nanoparticles (AuNPs) were synthesized in situ on wool powder (WP) under heating conditions. The wool powder not only reduced Au ions to AuNPs but also provided a support for the as-synthesized AuNPs. WPs were treated with different concentrations of Au ions, and the optical features and morphologies of the treated WPs were investigated by UV-vis diffuse reflectance absorption spectroscopy and scanning electron microscopy (SEM). X-ray diffraction (XRD), X-ray photoelectron spectroscopy (XPS), and transmission electron microscopy (TEM) were also employed to characterize the WP treated with AuNPs. The results demonstrate that AuNPs were produced in the presence of WP and distributed over the wool particles. The porous structure led to the synthesis of AuNPs in the internal parts of the WP. Acidic conditions and high temperature facilitated the synthesis of AuNPs by WP in aqueous solution. The reducibility of wool was improved after being converted from fibers to powder, due to the exposure of more active groups. Moreover, the obtained AuNP-WP complexes showed significant catalytic activity in accelerating the reduction of 4-nitrophenol (4-NP) by sodium borohydride (NaBH₄).
NASA Astrophysics Data System (ADS)
Zhang, Shaotong; Jia, Yonggang; Zhang, Yaqi; Liu, Xiaolei; Shan, Hongxian
2018-03-01
A specially designed benthic chamber for field observation of sediment resuspension caused by the wave-induced oscillatory seepage effect (i.e., the wave pumping of sediments) has been newly developed. Observational results from the first sea trial show that the geometry design and instrumentation of the chamber achieve the goal of continuously monitoring the wave pumping of sediments (WPS). Based on this field dataset, the quantitative contribution of the WPS to total sediment resuspension is estimated to be 20-60% merely under the continuous action of normal waves (Hs ≤ 1.5 m) in the subaqueous Yellow River Delta (YRD). Such a large contribution invalidates the commonly held opinion that sediments are purely eroded from the seabed surface by the horizontal "shearing effect" of wave orbital or current velocities. In fact, a considerable amount of sediment could originate from the shallow subsurface of the seabed, driven by the vertical "pumping effect" of wave-generated seepage flows during wavy periods. Based on these new findings, an improved conceptual model for the resuspension mechanisms of silty sediments under various hydrodynamic conditions is proposed for the first time.
Coexistent three-component and two-component Weyl phonons in TiS, ZrSe, and HfTe
NASA Astrophysics Data System (ADS)
Li, Jiangxu; Xie, Qing; Ullah, Sami; Li, Ronghan; Ma, Hui; Li, Dianzhong; Li, Yiyi; Chen, Xing-Qiu
2018-02-01
In analogy to the various fermions of electrons in topological semimetals, topological mechanical states with two types of bosons, Dirac and Weyl bosons, have been reported in some macroscopic systems at kHz frequencies, and doubly-Weyl phonons in the atomic vibrational framework at THz frequencies in solid crystals were recently predicted. Here, through first-principles calculations, we report that the phonon spectra of WC-type TiS, ZrSe, and HfTe commonly host unique triply degenerate nodal points (TDNPs) and single two-component Weyl points (WPs) at THz frequencies. Quasiparticle excitations near the TDNPs of phonons are three-component bosons, beyond the conventional classifications of Dirac, Weyl, and doubly-Weyl phonons. Moreover, we find that both TiS and ZrSe host five pairs of type-I Weyl phonons and one pair of type-II Weyl phonons, whereas HfTe hosts only four pairs of type-I Weyl phonons. They carry nonzero topological charges. On the (10-10) crystal surfaces, we observe topologically protected surface arc states connecting two WPs with opposite charges, which host modes that propagate nearly unidirectionally on the surface.
Electromagnetic crystal based terahertz thermal radiators and components
NASA Astrophysics Data System (ADS)
Wu, Ziran
This dissertation presents an investigation of thermal radiation from three-dimensional electromagnetic crystals (EMXTs), as well as the development of a THz rapid-prototyping fabrication technique and its application to THz EMXT components and micro-system fabrication and integration. First, it is proposed that thermal radiation from a 3-D EMXT would be greatly enhanced at the band-gap edge frequency due to the redistribution of the photon density of states (DOS) within the crystal. A THz thermal radiator could thus be built from a THz EMXT by utilizing the exceptional emission peak(s) around its band-gap frequency. The thermal radiation enhancement effects of various THz EMXTs, including silicon and tungsten woodpile structures (WPS) and a cubic photonic cavity (CPC) array, are explored. The DOS of all three structures are calculated, and their thermal radiation intensities are predicted using Planck's equation. These calculations show that the DOS of the silicon and tungsten WPS can be enhanced by factors of 11.8 around 364 GHz and 2.6 around 406 GHz, respectively, in comparison to normal blackbody radiation at the same frequencies. An enhancement factor of more than 100 is obtained in calculations for the CPC array. A silicon WPS with a band gap around 200 GHz has been designed and fabricated. The thermal emissivity of the silicon WPS sample is measured with a control blackbody as reference, and enhancements of the emission from the WPS over the control blackbody are observed at several frequencies, quite consistent with the theoretical predictions. Second, the practical challenge of THz EMXT component and system fabrication is met by a THz rapid-prototyping technique that we developed. Using this technique, the fabrication of several EMXTs with 3-D electromagnetic band gaps in the 100-400 GHz range is demonstrated.
Characterization of the samples via THz time-domain spectroscopy (THz-TDS) shows very good agreement with simulation, confirming the build accuracy of this prototyping approach. Third, an all-dielectric THz waveguide is designed, fabricated, and characterized. The design is based on a hollow-core EMXT waveguide, and the fabrication is implemented with the THz prototyping method. Characterization results for the waveguide power loss factor show good consistency with simulation, and a waveguide propagation loss as low as 0.03 dB/mm at 105 GHz is demonstrated. Several design parameters are also varied and their impacts on the waveguide performance investigated theoretically. Finally, a THz EMXT antenna based on expanding the defect radius of the EMXT waveguide into a horn shape is proposed and studied. The boresight directivity and main-beam angular width of the optimized EMXT horn antenna are comparable with those of a copper horn antenna of the same dimensions at low frequencies, and much better than those of the copper horn at high frequencies. The EMXT antenna has been successfully fabricated via the same THz prototyping, and we believe this is the first time an EMXT antenna of this architecture has been fabricated. Far-field measurement of the EMXT antenna radiation pattern is underway. Also, in order to integrate planar THz solid-state devices (especially sources and detectors) and THz samples under test with the THz micro-systems that could be fabricated by the prototyping approach, an EMXT waveguide-to-microstrip-line transition structure is designed. The structure uses tapered solid dielectric waveguides on both ends to transfer THz energy from the EMXT waveguide defect onto the microstrip line. Simulation of the transition structure in a back-to-back configuration yields about -15 dB insertion loss, mainly due to dielectric material loss. The coupling and radiation loss of the transition structure is estimated to be -2.115 dB.
The fabrication and characterization of the transition system is currently underway. With all the above THz components realized in the future, integrated THz micro-systems manufactured by the same prototyping technique will be achieved, with low cost, high quality, self-sufficiency, and great customizability.
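The enhancement argument in this abstract amounts to scaling the blackbody curve by the DOS enhancement factor at the band-gap edge. A minimal sketch using Planck's law in frequency form; the 300 K temperature is an assumption for illustration, while the 364 GHz frequency and 11.8 factor are taken from the abstract:

```python
import numpy as np

H = 6.626e-34   # Planck constant, J*s
KB = 1.381e-23  # Boltzmann constant, J/K
C = 2.998e8     # speed of light, m/s

def planck_radiance(freq_hz, temp_k):
    """Blackbody spectral radiance B(nu, T) in W m^-2 sr^-1 Hz^-1."""
    x = H * freq_hz / (KB * temp_k)
    return (2 * H * freq_hz ** 3 / C ** 2) / np.expm1(x)

# DOS-enhanced emission near the band-gap edge of the silicon WPS:
nu = 364e9          # band-gap edge frequency reported in the abstract, Hz
enhancement = 11.8  # reported DOS enhancement factor for the silicon WPS
emitted = enhancement * planck_radiance(nu, 300.0)
```

The sketch only captures the scaling step; the dissertation computes the DOS itself from the crystal's photonic band structure.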
ERIC Educational Resources Information Center
Leh, Jayne M.; Jitendra, Asha K.; Caskie, Grace I. L.; Griffin, Cynthia C.
2007-01-01
The purpose of this study was to examine the tenability of a curriculum-based mathematical word problem-solving (WPS) measure as a progress-monitoring tool to index students' rate of growth or slope of achievement over time. Participants consisted of 58 third-grade students, who were assessed repeatedly over 16 school weeks. Students were measured…
Thermal Performance Analysis of a Geologic Borehole Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reagin, Lauren
2016-08-16
The Brazilian Nuclear Research Institute (IPEN) proposed a design for the disposal of Disused Sealed Radioactive Sources (DSRS), based on the IAEA Borehole Disposal of Sealed Radioactive Sources (BOSS) design, that would allow the entirety of Brazil's inventory of DSRS to be disposed of in a single borehole. The proposed IPEN design allows 170 waste packages (WPs) containing DSRS (such as Co-60 and Cs-137) to be stacked on top of each other inside the borehole. The primary objective of this work was to evaluate the thermal performance of a conservative approach to the IPEN proposal, with the equivalent of two WPs and two different interior configurations, using Co-60 as the radioactive heat source. The current WP configuration (heterogeneous) for the IPEN proposal has 60% of the WP volume occupied by a radioactive heat source and the remaining 40% vacant. The second configuration (homogeneous) considered for this project had 100% of the WP volume occupied by a radioactive heat source. The computational models for the thermal analyses of the WP configurations with the Co-60 heat source considered three different cooling mechanisms (conduction, radiation, and convection) and the effect of mesh size on the results of the thermal analysis. The analyses yielded maximum temperatures inside the WPs for both WP configurations and various mesh sizes. The heterogeneous WP analysis considered the cooling mechanisms of conduction, convection, and radiation; its temperature results suggest that the model is cooled predominantly by conduction, with the effects of radiation and natural convection on cooling being negligible. The thermal analysis comparing the two WP configurations suggests that either configuration could be used for the design.
The mesh sensitivity results verify the meshes used, and the results obtained from the thermal analyses were close to independent of mesh size. The results from the computational case and the analytically calculated case for the homogeneous WP in benchmarking were almost identical, which indicates that the computational approach used here was successfully verified against the analytical solution.
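For a conduction-dominated configuration like the one found here, a back-of-the-envelope temperature rise follows from the steady radial conduction solution around a line heat source. All numbers below are illustrative assumptions, not values from the report:

```python
import math

# Illustrative (assumed) numbers: a decay-heat line source inside a borehole,
# cooled by steady radial conduction through the surrounding backfill.
q_per_len = 50.0           # W/m, heat generation per unit length of WP stack
k_fill = 1.0               # W/(m*K), backfill thermal conductivity
r_wp, r_hole = 0.05, 0.15  # m, waste-package and borehole radii

# Temperature drop across the backfill annulus: dT = q' ln(r2/r1) / (2 pi k)
delta_t = q_per_len * math.log(r_hole / r_wp) / (2 * math.pi * k_fill)
```

This kind of closed-form estimate is also how a homogeneous configuration can be benchmarked analytically against a computational model.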
Camanocha, Anuj; Dewhirst, Floyd E.
2014-01-01
Background and objective: In addition to the well-known phyla Firmicutes, Proteobacteria, Bacteroidetes, Actinobacteria, Spirochaetes, Fusobacteria, Tenericutes, and Chlamydiae, the oral microbiomes of mammals contain species from lesser-known phyla or candidate divisions, including Synergistetes, TM7, Chlorobi, Chloroflexi, GN02, SR1, and WPS-2. The objectives of this study were to create phylum-selective 16S rDNA PCR primer pairs, create selective 16S rDNA clone libraries, identify novel oral taxa, and update the canine and human oral microbiome databases. Design: 16S rRNA gene sequences for members of the lesser-known phyla were downloaded from the GenBank and Greengenes databases and aligned with sequences in our RNA databases. Primers with potential phylum-level selectivity were designed heuristically with the goal of producing nearly full-length 16S rDNA amplicons. The specificity of primer pairs was examined by making clone libraries from PCR amplicons and determining phylum identity by BLASTN analysis. Results: Phylum-selective primer pairs were identified that allowed construction of clone libraries with 96–100% specificity for each of the lesser-known phyla. From these clone libraries, seven human and two canine novel oral taxa were identified and added to their respective taxonomic databases. For each phylum, the genome sequences closest to human oral taxa were identified and added to the Human Oral Microbiome Database to facilitate metagenomic, transcriptomic, and proteomic studies that involve tiling sequences to the most closely related taxon. While examining ribosomal operons in lesser-known phyla from single-cell genomes and metagenomes, we identified a novel rRNA operon order (23S-5S-16S) in three SR1 genomes and the splitting of the 23S rRNA gene by an I-CeuI-like homing endonuclease in a WPS-2 genome. Conclusions: This study developed useful primer pairs for making phylum-selective 16S rRNA clone libraries.
Phylum-specific libraries were shown to be useful for identifying previously unrecognized taxa in lesser-known phyla and would be useful for future environmental and host-associated studies. PMID:25317252
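The core of a primer-selectivity check is scoring how well a (possibly degenerate) primer matches each position of a template. A minimal pure-Python sketch; the primer and template strings are hypothetical, and a real workflow would BLAST nearly full-length amplicons against a reference database as the abstract describes:

```python
# IUPAC degenerate-base codes: which template bases each primer symbol accepts.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
         "K": "GT", "M": "AC", "N": "ACGT"}

def mismatches(primer, site):
    """Count positions where the template site fails the degenerate primer."""
    return sum(base not in IUPAC[p] for p, base in zip(primer, site))

def best_site(primer, template):
    """Slide the primer along the template; return (min mismatches, offset)."""
    n = len(primer)
    scores = [(mismatches(primer, template[i:i + n]), i)
              for i in range(len(template) - n + 1)]
    return min(scores)

# Hypothetical degenerate primer and template fragment
m, pos = best_site("GGRTTYGATT", "CCCGGATTCGATTAAA")
```

A phylum-selective primer is one whose best site has few mismatches in the target phylum's sequences and many in all others.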
Resource Management Scheme Based on Ubiquitous Data Analysis
Lee, Heung Ki; Jung, Jaehee
2014-01-01
Resource management of the main memory and process handler is critical to enhancing the system performance of a web server. Owing to the transaction delay time that affects incoming requests from web clients, web server systems utilize several web processes to anticipate future requests. This procedure is able to decrease the web generation time because there are enough processes to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. Proper pregenerated process mechanisms are required for dealing with the clients' requests. Unfortunately, it is difficult to predict how many requests a web server system is going to receive. If a web server system builds too many web processes, it wastes a considerable amount of memory space, and thus performance is reduced. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, and accordingly, the web process management scheme consumes the least possible web transaction resources. In experiments, real web trace data were used to prove the improved performance of the proposed scheme. PMID:25197692
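The scheme's central idea, sizing the pre-forked process pool from predicted request load rather than a fixed count, can be sketched with a simple moving-average predictor. This is a toy illustration under assumed parameters; the paper derives its predictions from web log mining:

```python
from collections import deque

class AdaptiveProcessManager:
    """Toy sketch: size a web-process pool from a moving average of recent
    request rates, clamped to fixed bounds."""

    def __init__(self, min_procs=2, max_procs=64, window=5, per_proc=10):
        self.history = deque(maxlen=window)
        self.min_procs, self.max_procs = min_procs, max_procs
        self.per_proc = per_proc  # requests/s one pre-forked process absorbs

    def observe(self, requests_per_sec):
        self.history.append(requests_per_sec)

    def target_pool_size(self):
        if not self.history:
            return self.min_procs
        predicted = sum(self.history) / len(self.history)
        need = -(-int(predicted) // self.per_proc)  # ceiling division
        return max(self.min_procs, min(self.max_procs, need))

mgr = AdaptiveProcessManager()
for rate in (40, 55, 70):  # observed request rates, requests/s
    mgr.observe(rate)
size = mgr.target_pool_size()
```

Clamping to [min_procs, max_procs] captures the trade-off the abstract describes: too few processes delays requests, too many wastes memory.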
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wodtke, C.H.; Frizzell, D.R.; Plunkett, W.A.
1986-06-01
Procedure WPS-1003 is qualified under Section IX of the ASME Boiler and Pressure Vessel Code for gas tungsten arc welding of aluminum alloys 6061 and 6063 (P-23), in thickness range 0.035 to 0.516 inch; filler metal is ER4043 (F-23) or ER5356 (F-22); shielding gas is argon.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wodtke, C.H.; Frizzell, D.R.; Plunkett, W.A.
1985-08-01
Procedure WPS-2201 is qualified under Section IX of the ASME Boiler and Pressure Vessel Code for gas tungsten arc welding of aluminum alloys 1060, 1100, and 3003 (P-21) to 3004, 5052, 5154, and 5454 (P-22), in thickness range 0.062 to 0.5 in.; filler metal is ER5356 (F-22); shielding gas is argon.
Gas tungsten arc welding of aluminum alloys 3004, 5052, and 5X54. Welding procedure specification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wodtke, C.H.; Frizzell, D.R.; Plunkett, W.A.
1985-08-01
Procedure WPS-1002 is qualified under Section IX of the ASME Boiler and Pressure Vessel Code for gas tungsten arc welding of aluminum alloys 3004, 5052, 5154, and 5454 (P-22), in thickness range 0.062 to 0.5 in.; filler metal is ER4043 (F-23) for 3004, and ER5356 (F-22) for other alloys; shielding gas is argon.
NASA Astrophysics Data System (ADS)
Lun Nam, Wai; Huan Su, Man; Phang, Xue Yee; Chong, Min Yee; Keey Liew, Rock; Ma, Nyuk Ling; Lam, Su Shiung
2017-11-01
Microwave vacuum pyrolysis of waste palm shell (WPS) was performed to produce biochar, which was then tested as a bio-fertilizer for growing oyster mushroom (Pleurotus ostreatus). The pyrolysis approach generated a biochar with a highly porous structure, a high BET surface area (up to 1250 m2/g), and a low moisture content (≤ 10 wt%), exhibiting adsorption properties desirable in a bio-fertilizer, since it can act as a housing that provides many sites onto which living microorganisms (mycelium or plant-growth-promoting bacteria) and organic nutrients can attach or adsorb. This could in turn stimulate plant growth by increasing the availability and supply of nutrients to the targeted host plant. Growing oyster mushroom on the biochar produced an impressive growth rate and a monthly production of up to about 550 g of mushroom. The shortest time for mycelium growth on a whole baglog (30 days) and the highest yield of oyster mushroom (550 g) were obtained from the cultivation medium amended with 20 g of biochar. Our results demonstrate that the biochar-based bio-fertilizer produced from microwave vacuum pyrolysis of WPS shows exceptional promise as an alternative growing substrate for mushroom cultivation.
Fuchs, Lynn S.; Geary, David C.; Compton, Donald L.; Fuchs, Douglas; Hamlett, Carol L.; Seethaler, Pamela M.; Bryant, Joan D.; Schatschneider, Christopher
2010-01-01
The purpose of this study was to examine the interplay between basic numerical cognition and domain-general abilities (such as working memory) in explaining school mathematics learning. First graders (n=280; 5.77 years) were assessed on 2 types of basic numerical cognition, 8 domain-general abilities, procedural calculations (PCs), and word problems (WPs) in fall and then reassessed on PCs and WPs in spring. Development was indexed via latent change scores, and the interplay between numerical and domain-general abilities was analyzed via multiple regression. Results suggest that the development of different types of formal school mathematics depends on different constellations of numerical versus general cognitive abilities. When controlling for 8 domain-general abilities, both aspects of basic numerical cognition were uniquely predictive of PC and WP development. Yet, for PC development, the additional amount of variance explained by the set of domain-general abilities was not significant, and only counting span was uniquely predictive. By contrast, for WP development, the set of domain-general abilities did provide additional explanatory value, accounting for about the same amount of variance as the basic numerical cognition variables. Language, attentive behavior, nonverbal problem solving, and listening span were uniquely predictive. PMID:20822213
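The "additional variance explained" comparison in this abstract is a nested-regression R-squared comparison. A minimal numpy sketch on simulated data (all effect sizes and the data-generating model are assumptions; only the block structure, 2 numerical predictors plus 8 domain-general ones, mirrors the study):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 280  # sample size matching the study's cohort
numerical = rng.normal(size=(n, 2))       # 2 basic numerical-cognition scores
domain_general = rng.normal(size=(n, 8))  # 8 domain-general abilities
noise = rng.normal(scale=0.5, size=n)
# Simulated WP development loading on both predictor blocks (assumed weights)
wp_change = numerical @ [0.6, 0.4] + domain_general[:, 0] * 0.5 + noise

def r_squared(X, y):
    """Variance explained by an OLS fit with intercept."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

r2_numerical = r_squared(numerical, wp_change)
r2_full = r_squared(np.hstack([numerical, domain_general]), wp_change)
added_variance = r2_full - r2_numerical  # explanatory value of the 2nd block
```

A formal version of the study's test would put a significance test (e.g., a partial F-test) on `added_variance` rather than just reporting it.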
2015-06-12
economy has grown through the export of timber, iron ore, and rubber. Reforming corruption inherited from the previous government and former warlords… …on WPS, the LNAP has a post-conflict perspective. Liberia's plan focuses attention on healing the trauma citizens suffered during the twenty-year
ERIC Educational Resources Information Center
Fuchs, Lynn S.; Gilbert, Jennifer K.; Fuchs, Douglas; Seethaler, Pamela M.; N. Martin, BrittanyLee
2018-01-01
This study was designed to deepen insights on whether word-problem (WP) solving is a form of text comprehension (TC) and on the role of language in WPs. A sample of 325 second graders, representing high, average, and low reading and math performance, was assessed on (a) start-of-year TC, WP skill, language, nonlinguistic reasoning, working memory,…
Scale Up Considerations for Sediment Microbial Fuel Cells
2013-01-01
density calculations were made once WPs stabilized for each system. Linear sweep voltammetry was then used on these systems to generate polarization and power density curves. The systems were allowed to equilibrate under open circuit conditions (about 12 h) before a potential sweep was performed with a … reference. The potential sweep was set to begin at the anode potential under open circuit conditions (−0.4 V vs. Ag/AgCl) and was raised to the …
Quantitative Study of Longitudinal Relaxation (T1) Contrast Mechanisms in Brain MRI
NASA Astrophysics Data System (ADS)
Jiang, Xu
Longitudinal relaxation (T1) contrast in MRI is important for studying brain morphology and is widely used in clinical applications. Although MRI only detects signals from water hydrogen (1H) protons (WPs), T1 contrast is known to be influenced by other species of 1H protons, including those in macromolecules (MPs), such as lipids and proteins, through magnetization transfer (MT) between WPs and MPs. This complicates the use and quantification of T1 contrast for studying the underlying tissue composition and the physiology of the brain. MT contributes to T1 contrast to an extent that is generally dependent on MT kinetics, as well as the concentration and NMR spectral properties of MPs. However, the MP spectral properties and MT kinetics are both difficult to measure directly, as the signal from MPs is generally invisible to MRI. Therefore, to investigate MT kinetics and further quantify T1 contrast, we first developed a reliable way to indirectly measure the MP fraction and their exchange rate with WPs, with minimal dependence on the spectral properties of MPs. For this purpose, we used brief, high-power radiofrequency (RF) NMR excitation pulses to almost completely saturate the magnetization of MPs. Based on this, both MT kinetics and the contribution of MPs to T1 contrast through MT were studied. The thus obtained knowledge allowed us to subsequently infer the spectral properties of MPs by applying low-power, frequency-selective off-resonance RF pulses and measuring the offset-frequency dependent effect of MPs on the WP MRI signal. A two-pool exchange model was used in both cases to account for direct effects of the RF pulse on WP magnetization. 
Consistent with earlier works using MRI at low field and post-mortem analysis of brain tissue, our novel measurement approach found that MPs constitute up to a 27% fraction of the total 1H protons in human brain white matter, and their spectrum follows a super-Lorentzian line with a T2 of 9.6 ± 0.6 μs and a resonance frequency centered at −2.58 ± 0.05 ppm at 7 T. T1 contrast was found to be dominated by the MP fraction, with iron contributing only modestly, even in the iron-rich regions of the brain.
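The two-pool exchange model used in these measurements is commonly written as coupled Bloch-McConnell equations for the longitudinal magnetizations of the water (w) and macromolecular (m) pools. The following is a standard textbook form of that model, not necessarily the exact parameterization used in this work:

```latex
\begin{aligned}
\frac{dM_z^{w}}{dt} &= \frac{M_0^{w} - M_z^{w}}{T_1^{w}} - k_{wm}\,M_z^{w} + k_{mw}\,M_z^{m},\\
\frac{dM_z^{m}}{dt} &= \frac{M_0^{m} - M_z^{m}}{T_1^{m}} + k_{wm}\,M_z^{w} - k_{mw}\,M_z^{m} - W(\Delta)\,M_z^{m},
\end{aligned}
\qquad k_{wm}\,M_0^{w} = k_{mw}\,M_0^{m},
```

where the last relation is the detailed-balance constraint on the exchange rates, and W(Δ) is the RF saturation rate of the macromolecular pool at offset frequency Δ, governed by its absorption lineshape (the super-Lorentzian line reported above).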
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wodtke, C.H.; Frizzell, D.R.; Plunkett, W.A.
1986-06-01
Procedure WPS-1002 is qualified under Section IX of the ASME Boiler and Pressure Vessel Code for gas tungsten arc welding of aluminum alloys 3004, 5052, 5154, and 5454 (P-22), in thickness range 0.062 to 0.5 inches; filler metal is ER4043 (F-23) for 3004, and ER5356 (F-22) for other alloys; shielding gas is argon.
Prioritizing Strategic Interests in South Asia
2010-06-01
rolled out “Aghaz-e-Haqooq Balochistan”—its … by far the most serious fallout from the conflict in Afghanistan is the increasing radicalization of … Foreign Policy, August 2006, available at <www.foreignpolicy.com/story/cms.php?story_id=3578>. 17 “Aghaz-e-Haqooq Balochistan Package,” Dawn.com, November 16, 2009, available at <www.dawn.com/wps/wcm/connect/dawn-content-library/dawn/news/pakistan/13+aghaz-e-haqooq+balochistan+package-za-05 …
NASA Astrophysics Data System (ADS)
Choi, H. J.; Lee, S. B.; Lee, H. G.; Back, S. Y.; Kim, S. H.; Kang, H. S.
2017-07-01
Several parts that comprise a large scientific device must be installed and operated at accurate three-dimensional location coordinates (X, Y, and Z), for which they are subjected to survey and alignment. The location of the aligned parts should not change, in order to ensure that the electron beam parameters of PAL-XFEL (the X-ray Free Electron Laser of the Pohang Accelerator Laboratory: energy 10 GeV, charge 200 pC, bunch length 60 fs, emittance X/Y 0.481 μm/0.256 μm) remain stable and the machine can be operated without any problems. As time goes by, however, the ground undergoes uplift and subsidence, which deforms building floors. The deformation of the ground and buildings changes the location of several devices, including magnets and RF accelerator tubes, which eventually leads to alignment errors (ΔX, ΔY, and ΔZ). Once alignment errors occur in these parts, the electron beam deviates from its course and the beam parameters change accordingly. PAL-XFEL has installed the Hydrostatic Leveling System (HLS) to measure and record the vertical change of buildings and ground consistently and systematically, and the Wire Position System (WPS) to measure the two-dimensional changes of girders. This paper introduces the operating principle and design concept of the WPS and discusses the current situation regarding its installation and operation.
Pseudo Landau levels and quantum oscillations in strained Weyl semimetals
NASA Astrophysics Data System (ADS)
Alisultanov, Z. Z.
2018-05-01
The crystal lattice deformation in Weyl materials, where the two chiralities are separated in momentum space, leads to the appearance of gauge pseudo-fields. We investigated pseudo-magnetic-field-induced quantum oscillations in strained Weyl semimetals (WSMs). In contrast to previous works on this problem, we use a more general tilted Hamiltonian, which seems more suitable for strained WSMs. We show that the pseudo-magnetic-field-induced magnetization of a strained WSM is nonzero because an electric field (the gradient of the deformation potential) is induced simultaneously with the pseudo-magnetic field. This is related to the fact that the pseudo Landau levels (LLs) in a strained WSM differ in the vicinities of different Weyl points (WPs) due to the tilt in the spectrum. Such a violation of the equivalence between WPs leads to a modulation of the quantum oscillations. We also show that the magnetization magnitude can be changed by applying an external electric field; in particular, it can be reduced to zero. The possibility of controlling the magnetization with an electric field is interesting both from a fundamental point of view (a new type of magneto-electric effect) and from an applications point of view (an additional means of controlling the diamagnetism of deformed WSMs). Finally, a coexistence of type-I and type-II Weyl fermions is possible in the system under investigation; such a phase is new to the physics of topological systems.
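A generic tilted Weyl Hamiltonian of the kind referred to above can be written as follows; this is a standard form from the literature, and the paper's exact conventions may differ:

```latex
H_{\chi}(\mathbf{k}) \;=\; \hbar\,\mathbf{t}_{\chi}\cdot\mathbf{k}\,\sigma_{0} \;+\; \chi\,\hbar v_{F}\,\boldsymbol{\sigma}\cdot\mathbf{k},
```

where χ = ±1 labels the chirality of the Weyl point, σ₀ is the identity matrix, and the tilt vector t_χ may differ between WPs (|t_χ| < v_F for type-I, |t_χ| > v_F for type-II fermions). Because the Landau-level structure in a (pseudo-)magnetic field depends on the local tilt, levels at different WPs become inequivalent, which is the origin of the oscillation modulation described above.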
Nash, David J; Coulson, Sheila; Staurset, Sigrid; Ullyott, J Stewart; Babutsi, Mosarwa; Hopkinson, Laurence; Smith, Martin P
2013-04-01
Lithic artifacts from the African Middle Stone Age (MSA) offer an avenue to explore a range of human behaviors, including mobility, raw material acquisition, trade and exchange. However, to date, in southern Africa it has not been possible to provenance the locations from which commonly used stone materials were acquired prior to transport to archaeological sites. Here we present results of the first investigation to geochemically fingerprint silcrete, a material widely used for tool manufacture across the subcontinent. The study focuses on the provenancing of silcrete artifacts from the MSA of White Paintings Shelter (WPS), Tsodilo Hills, in the Kalahari Desert of northwest Botswana. Our results suggest that: (i) despite having access to local quartz and quartzite at Tsodilo Hills, MSA peoples chose to transport silcrete over 220 km to WPS from sites south of the Okavango Delta; (ii) these sites were preferred to silcrete sources much closer to Tsodilo Hills; (iii) the same source areas were repeatedly used for silcrete supply throughout the 3 m MSA sequence; (iv) during periods of colder, wetter climate, silcrete may have been sourced from unknown, more distant, sites. Our results offer a new provenancing approach for exploring prehistoric behavior at other sites where silcrete is present in the archaeological record. Copyright © 2013 Elsevier Ltd. All rights reserved.
Diagnosis of Group A Streptococcal Infections Directly From Throat Gargle.
1981-06-01
Keywords: streptococcal infection, strep throat, latex agglutination. Diagnosis of Group A Streptococcal Infections Directly from Throat Gargle. E. A. Edwards, I. A. Phillips & W. C. Suiter, Report No. 81-20, P.O. Box 8022, San Diego, California 92138; Naval Medical Research and Development Command, Bethesda, Maryland.
Weich, Scott; Fenton, Sarah-Jane Hannah; Bhui, Kamaldeep; Staniszewska, Sophie; Madan, Jason; Larkin, Michael; Newton, Elizabeth; Crepaz-Keay, David; Canaway, Alastair; Croft, Charlotte; Griffiths, Frances
2018-06-14
Inpatient mental healthcare continues to be an area of high risk and one where patients report negative experiences. To ensure the patient voice is heard, National Health Service (NHS) Trusts are required to collect feedback from patients routinely. We do not know what kinds of feedback are most important or what management processes are needed to translate this into effective action plans. Further, we do not know if this makes any difference to the patients themselves. This study seeks to explore which of the many different approaches to collecting and using patient experience data are the most useful for supporting improvements in inpatient mental healthcare. The overarching aim of the study is to arrive at recommendations for best practice in the collection and use of patient experience data in NHS England adult inpatient mental health settings. We present the protocol for the Realist Evaluation of the Use of Patient Experience Data to Improve the Quality of Inpatient Mental Health Care (EURIPIDES) study. The study is composed of five work packages (WPs): a systematic review of patient experiences (WP1); a telephone survey to assist the selection of case sites (WP2); six in-depth case studies involving interviews with service users, carers and staff to enable a realist evaluation of the use of patient experience to improve quality in adult inpatient mental health services (WP3); an economic evaluation of patient experience feedback activity (WP5); and a consensus conference (WP4). We discuss the methodological rationale for the five WPs. This study has received approval from the West Midlands/South Birmingham NHS Research Ethics Committee. The outcome of the consensus conference meeting (WP4) will form the basis of the outputs to be disseminated to NHS providers. Dissemination will also take place through publications and presentations at relevant conferences. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018.
Zhong, Lingyun; Niu, Bei; Tang, Lin; Chen, Fang; Zhao, Gang; Zhao, Jianglin
2016-11-25
The purpose of this study was to evaluate the effects of four different fungal polysaccharides, namely water-extracted mycelia polysaccharide (WPS), sodium hydroxide-extracted mycelia polysaccharide (SPS), hydrochloric acid-extracted mycelia polysaccharide (APS), and exo-polysaccharide (EPS), obtained from the endophytic Fusarium oxysporum Fat9, on the sprout growth, flavonoid accumulation, and antioxidant capacity of tartary buckwheat. Without visible changes in the appearance of the sprouts, the exogenous polysaccharide elicitors strongly stimulated sprout growth and flavonoid production, and the stimulation effect was closely related to the polysaccharide (PS) species and its treatment dosage. With application of 200 mg/L of EPS, 200 mg/L of APS, 150 mg/L of WPS, or 100 mg/L of SPS, the total rutin and quercetin yields of buckwheat sprouts were significantly increased to 41.70, 41.52, 35.88, and 32.95 mg/(100 sprouts), respectively, about 1.11- to 1.40-fold that of the control culture at 31.40 mg/(100 sprouts). Moreover, the antioxidant capacity of tartary buckwheat sprouts was also enhanced after treatment with the four PS elicitors. Furthermore, the present study revealed that polysaccharide elicitation caused the accumulation of functional flavonoids by stimulating the phenylpropanoid pathway. The application of beneficial fungal polysaccharide elicitors may be an effective approach to improving the nutritional and functional characteristics of tartary buckwheat sprouts.
Chapotin, Saharah Moon; Razanameharizaka, Juvet H; Holbrook, N Michele
2006-06-01
Baobab trees are often cited in the literature as water-storing trees, yet few studies have examined this assumption. We assessed the role of stored water in buffering daily water deficits in two species of baobabs (Adansonia rubrostipa Jum. & H. Perrier and Adansonia za Baill.) in a tropical dry forest in Madagascar. We found no lag in the daily onset of sap flow between the base and the crown of the tree. Some night-time sap flow occurred, but this was more consistent with a pattern of seasonal stem water replenishment than with diurnal usage. Intrinsic capacitance of both leaf and stem tissue (0.07–0.08 and 1.1–1.43 MPa⁻¹, respectively) was high, yet the amount of water that could be withdrawn before turgor loss was small because midday leaf and stem water potentials (WPs) were near the turgor-loss points. Stomatal conductance was high in the daytime but then declined rapidly, suggesting an embolism-avoidance strategy. Although the xylem of distal branches was relatively vulnerable to cavitation (P50: 1.1–1.7 MPa), tight stomatal control and minimum WPs near −1.0 MPa maintained native embolism levels at 30–65%. Stem morphology and anatomy restrict water movement between storage tissues and the conductive pathway, making stored-water usage more appropriate to longer-term water deficits than as a buffer against daily water deficits.
Building asynchronous geospatial processing workflows with web services
NASA Astrophysics Data System (ADS)
Zhao, Peisheng; Di, Liping; Yu, Genong
2012-02-01
Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to and architecture of workflow code for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
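The submit-then-poll asynchrony pattern described above can be sketched in a few lines. The sketch below uses a toy in-memory service rather than a real OGC WPS endpoint; the class and method names are illustrative assumptions, not part of any standard API.

```python
import time
import uuid

class AsyncProcessingService:
    """Toy stand-in for an asynchronous processing endpoint (hypothetical API)."""

    def __init__(self):
        self._jobs = {}

    def execute(self, process, inputs):
        # A real WPS would return a status URL; here we return a job id and
        # pretend the job finishes ~50 ms after submission.
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {"status": "Accepted", "result": None,
                              "done_at": time.monotonic() + 0.05,
                              "process": process, "inputs": inputs}
        return job_id

    def status(self, job_id):
        job = self._jobs[job_id]
        if job["result"] is None and time.monotonic() >= job["done_at"]:
            job["status"] = "Succeeded"
            job["result"] = sum(job["inputs"])  # placeholder computation
        return job["status"]

    def result(self, job_id):
        return self._jobs[job_id]["result"]

def run_async(service, process, inputs, poll_interval=0.01, timeout=5.0):
    """The asynchrony pattern: submit, resume other work (or poll), fetch result."""
    job_id = service.execute(process, inputs)
    deadline = time.monotonic() + timeout
    while service.status(job_id) != "Succeeded":
        if time.monotonic() > deadline:
            raise TimeoutError(job_id)
        time.sleep(poll_interval)  # the client is free to do other work here
    return service.result(job_id)
```

In a real chain, a WS-BPEL engine or custom orchestrator would play the role of `run_async`, correlating status callbacks or polling each step's status URL before wiring its output into the next step.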
77 FR 38033 - Notice of Establishment of a Commodity Import Approval Process Web Site
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-26
... Process Web Site AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice. SUMMARY: We are announcing the creation of a new Plant Protection and Quarantine Web site that will provide stakeholders with... comment on draft risk assessments. This Web site will make the commodity import approval process more...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wodtke, C.H.; Frizzell, D.R.; Plunkett, W.A.
1986-06-01
Procedure WPS-2202 is qualified under Section IX of the ASME Boiler and Pressure Vessel Code for gas tungsten arc welding of aluminum alloys 1060, 1100, and 3003 (P-21) to 3004, 5052, 5154, and 5454 (P-22), in thickness range 0.062 to 0.5 inch; filler metal is ER5356 (F-22); shielding gas is argon.
Experimental evaluation of the impact of packet capturing tools for web services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choe, Yung Ryn; Mohapatra, Prasant; Chuah, Chen-Nee
Network measurement is a discipline that provides the techniques to collect data that are fundamental to many branches of computer science. While many capturing tools and comparisons have been made available in the literature and elsewhere, the impact of these packet capturing tools on existing processes has not been thoroughly studied. While not a concern for collection methods in which dedicated servers are used, many usage scenarios of packet capturing now require the packet capturing tool to run concurrently with operational processes. In this work we perform experimental evaluations of the performance impact that packet capturing processes have on web-based services; in particular, we observe the impact on web servers. We find that packet capturing processes indeed impact the performance of web servers, but on a multi-core system the impact varies depending on whether the packet capturing and web hosting processes are co-located or not. In addition, the architecture and behavior of the web server and process scheduling are coupled with the behavior of the packet capturing process, which in turn also affects the web server's performance.
Depth-of-processing effects as college students use academic advising Web sites.
Boatright-Horowitz, Su L; Langley, Michelle; Gunnip, Matthew
2009-06-01
This research examined students' cognitive and affective responses to an academic advising Web site. Specifically, we investigated whether exposure to our Web site increased student reports that they would access university Web sites to obtain various types of advising information. A depth-of-processing (DOP) manipulation revealed this effect as students engaged in semantic processing of Web content but not when they engaged in superficial examination of the physical appearance of the same Web site. Students appeared to scan online academic advising materials for information of immediate importance without noticing other information or hyperlinks (e.g., regarding internships and careers). Suggestions are presented for increasing the effectiveness of academic advising Web sites.
Availability of the OGC geoprocessing standard: March 2011 reality check
NASA Astrophysics Data System (ADS)
Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier
2012-10-01
This paper presents an investigation about the servers available in March 2011 conforming to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification gives support to standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test if the advances in the use of search engines and focused crawlers for finding Web services can be applied for finding geoscience processing systems. Research results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. These results also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
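A focused crawler of the kind described above must, at minimum, fetch each candidate server's GetCapabilities document and extract the offered process identifiers. A minimal stdlib sketch, assuming a WPS 1.0.0 response; the embedded XML is a fabricated, truncated example, and the process names are invented:

```python
import xml.etree.ElementTree as ET

# Truncated, fabricated example of a WPS 1.0.0 GetCapabilities response.
CAPS = """<wps:Capabilities xmlns:wps="http://www.opengis.net/wps/1.0.0"
                  xmlns:ows="http://www.opengis.net/ows/1.1"
                  service="WPS" version="1.0.0">
  <wps:ProcessOfferings>
    <wps:Process wps:processVersion="1">
      <ows:Identifier>gs:Aggregate</ows:Identifier>
      <ows:Title>Aggregate coverage values</ows:Title>
    </wps:Process>
    <wps:Process wps:processVersion="1">
      <ows:Identifier>gs:Reproject</ows:Identifier>
      <ows:Title>Reproject a dataset</ows:Title>
    </wps:Process>
  </wps:ProcessOfferings>
</wps:Capabilities>"""

# Official namespace URIs for WPS 1.0.0 and OWS Common 1.1.
NS = {"wps": "http://www.opengis.net/wps/1.0.0",
      "ows": "http://www.opengis.net/ows/1.1"}

def list_processes(capabilities_xml):
    """Return the ows:Identifier of every advertised process, in document order."""
    root = ET.fromstring(capabilities_xml)
    return [el.text for el in root.findall(".//wps:Process/ows:Identifier", NS)]
```

A crawler would apply this to each response whose root element and `service="WPS"` attribute identify it as a WPS endpoint, then issue DescribeProcess requests to probe features such as `storeSupported` and `statusSupported` (the flags behind long-running processes and Web-accessible outputs).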
NASA Astrophysics Data System (ADS)
Friberg, P. A.; Luis, R. S.; Quintiliani, M.; Lisowski, S.; Hunter, S.
2014-12-01
Recently, a novel set of modules has been included in the Open Source Earthworm seismic data processing system, supporting the use of web applications. These include the Mole sub-system, for storing relevant event data in a MySQL database (see M. Quintiliani and S. Pintore, SRL, 2013), and an embedded webserver, Moleserv, for serving such data to web clients in QuakeML format. These modules have enabled, for the first time using Earthworm, the use of web applications for seismic data processing. These can greatly simplify the operation and maintenance of seismic data processing centers by having one or more servers provide the relevant data, as well as the data processing applications themselves, to client machines running arbitrary operating systems. Web applications with secure online web access allow operators to work anywhere, without the often cumbersome and bandwidth-hungry use of secure shell or virtual private networks. Furthermore, web applications can seamlessly access third-party data repositories to acquire additional information, such as maps. Finally, the usage of HTML email has brought the possibility of specialized web applications to be used in email clients. This is the case of EWHTMLEmail, which produces event notification emails that are in fact simple web applications for plotting relevant seismic data. Providing web services as part of Earthworm has enabled a number of other tools as well. One is ISTI's EZ Earthworm, a web-based command and control system for an otherwise command-line driven system; another is a waveform web service. The waveform web service serves Earthworm data to additional web clients for plotting, picking, and other web-based processing tools. 
The current Earthworm waveform web service hosts an advanced plotting capability for providing views of event-based waveforms from a Mole database served by Moleserv. The current trend towards the usage of cloud services supported by web applications is driving improvements in JavaScript, CSS and HTML, as well as faster and more efficient web browsers, including mobile ones. It is foreseeable that in the near future, web applications will be as powerful and efficient as native applications. Hence, the work described here has been a first step towards bringing the Open Source Earthworm seismic data processing system to this new paradigm.
Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs
NASA Astrophysics Data System (ADS)
O'Connor, Rory V.
This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at the development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies, with the focus on SMEs that develop web applications as Management Information Systems, and not E-Commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.
LePrevost, Catherine E; Storm, Julia F; Asuaje, Cesar R; Arellano, Consuelo; Cope, W Gregory
2014-01-01
Among agricultural workers, migrant and seasonal farmworkers have been recognized as a special risk population because these laborers encounter cultural challenges and linguistic barriers while attempting to maintain their safety and health within their working environments. The crop-specific Pesticides and Farmworker Health Toolkit (Toolkit) is a pesticide safety and health curriculum designed to communicate to farmworkers pesticide hazards commonly found in their working environments and to address Worker Protection Standard (WPS) pesticide training criteria for agricultural workers. The goal of this preliminary study was to test evaluation items for measuring knowledge increases among farmworkers and to assess the effectiveness of the Toolkit in improving farmworkers' knowledge of key WPS and risk communication concepts when the Toolkit lesson was delivered by trained trainers in the field. After receiving training on the curriculum, four participating trainers provided lessons using the Toolkit as part of their regular training responsibilities and orally administered a pre- and post-lesson evaluation instrument to 20 farmworker volunteers who were generally representative of the national farmworker population. Farmworker knowledge of pesticide safety messages significantly (P<.05) increased after participation in the lesson. Further, items with visual alternatives were found to be most useful in discriminating between more and less knowledgeable farmworkers. The pilot study suggests that the Pesticides and Farmworker Health Toolkit is an effective, research-based pesticide safety and health intervention for the at-risk farmworker population and identifies a testing format appropriate for evaluating the Toolkit and other similar interventions for farmworkers in the field.
Maley, Matthew J; Minett, Geoffrey M; Bach, Aaron J E; Zietek, Stephanie A; Stewart, Kelly L; Stewart, Ian B
2018-01-01
The present study aimed to compare a range of cooling methods possibly utilised by occupational workers, focusing on their effect on body temperature, perception and manual dexterity. Ten male participants completed eight trials involving 30 min of seated rest followed by 30 min of cooling or control of no cooling (CON) (34°C, 58% relative humidity). The cooling methods utilised were: ice cooling vest (CV0), phase change cooling vest melting at 14°C (CV14), evaporative cooling vest (CVEV), arm immersion in 10°C water (AI), portable water-perfused suit (WPS), heliox inhalation (HE) and ice slushy ingestion (SL). Immediately before and after cooling, participants were assessed for fine (Purdue pegboard task) and gross (grip and pinch strength) manual dexterity. Rectal and skin temperature, as well as thermal sensation and comfort, were monitored throughout. Compared with CON, SL was the only method to reduce rectal temperature (P = 0.012). All externally applied cooling methods reduced skin temperature (P<0.05), though CV0 resulted in the lowest skin temperature versus other cooling methods. Participants felt cooler with CV0, CV14, WPS, AI and SL (P<0.05). AI significantly impaired Purdue pegboard performance (P = 0.001), but did not affect grip or pinch strength (P>0.05). The present study observed that ice ingestion or ice applied to the skin produced the greatest effect on rectal and skin temperature, respectively. AI should not be utilised if workers require subsequent fine manual dexterity. These results will help inform future studies investigating appropriate pre-cooling methods for the occupational worker.
Li, Zhan-Chao; Zhou, Xi-Bin; Dai, Zong; Zou, Xiao-Yong
2009-07-01
A prior knowledge of protein structural classes can provide useful information about a protein's overall structure, so quick and accurate computational determination of protein structural class is important in protein science. One key to such methods is an accurate representation of protein samples. Here, based on the concept of Chou's pseudo-amino acid composition (AAC; Chou, Proteins: Structure, Function, and Genetics, 43:246-255, 2001), a novel feature-extraction method that combined continuous wavelet transform (CWT) with principal component analysis (PCA) was introduced for the prediction of protein structural classes. First, a digital signal was obtained by mapping each amino acid according to various physicochemical properties. Second, CWT was utilized to extract a new feature vector based on the wavelet power spectrum (WPS), which contains more abundant sequence-order information in the frequency and time domains, and PCA was then used to reorganize the feature vector to decrease information redundancy and computational complexity. Finally, a pseudo-amino acid composition feature vector was formed to represent the primary sequence by coupling the AAC vector with the new WPS feature vector in an orthogonal space by PCA. As a showcase, the rigorous jackknife cross-validation test was performed on the working datasets. The results indicated that prediction quality was improved, and the current approach to protein representation may serve as a useful complementary vehicle in classifying other attributes of proteins, such as enzyme family class, subcellular localization, membrane protein type, and protein secondary structure.
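The first two steps, mapping amino acids to a physicochemical signal and extracting wavelet power features, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the Kyte-Doolittle hydrophobicity scale, the Ricker mother wavelet, and the choice of scales are all assumptions made for the sketch.

```python
import math

# Kyte-Doolittle hydrophobicity scale (one common physicochemical property).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def ricker(points, a):
    """Ricker ('Mexican hat') mother wavelet, discretized on `points` samples."""
    out = []
    norm = 2.0 / (math.sqrt(3.0 * a) * math.pi ** 0.25)
    for i in range(points):
        t = i - (points - 1) / 2.0
        out.append(norm * (1.0 - (t / a) ** 2) * math.exp(-t * t / (2.0 * a * a)))
    return out

def convolve_same(signal, kernel):
    """Discrete convolution truncated to len(signal) ('same' mode)."""
    n, m = len(signal), len(kernel)
    full = [0.0] * (n + m - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            full[i + j] += s * k
    start = (m - 1) // 2
    return full[start:start + n]

def wavelet_power_features(sequence, scales=(1, 2, 4, 8)):
    """Map a sequence to a hydrophobicity signal, then return the mean wavelet
    power at each scale: one simple WPS-style feature vector."""
    signal = [KD[aa] for aa in sequence]
    feats = []
    for a in scales:
        kernel = ricker(min(10 * a, len(signal)), a)
        coeffs = convolve_same(signal, kernel)  # CWT at a single scale
        feats.append(sum(c * c for c in coeffs) / len(coeffs))
    return feats
```

In the paper's pipeline, vectors like these (over many properties and scales) would then be decorrelated and compressed by PCA before being coupled with the AAC vector.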
Evaluating Web accessibility at different processing phases
NASA Astrophysics Data System (ADS)
Fernandes, N.; Lopes, R.; Carriço, L.
2012-09-01
Modern Web sites use several techniques (e.g. DOM manipulation) that allow for the injection of new content into their Web pages (e.g. AJAX), as well as manipulation of the HTML DOM tree. This has the consequence that the Web pages that are presented to users (i.e. after browser processing) are different from the original structure and content that is transmitted through HTTP communication (i.e. before browser processing). This poses a series of challenges for Web accessibility evaluation, especially for automated evaluation software. This article details an experimental study designed to understand the differences posed by accessibility evaluation after Web browser processing. We implemented a Javascript-based evaluator, QualWeb, that can perform WCAG 2.0 based accessibility evaluations in the two phases of browser processing. Our study shows that, in fact, there are considerable differences between the HTML DOM trees in both phases, which have the consequence of producing distinct evaluation results. We discuss the impact of these results in the light of the potential problems that these differences can pose to designers and developers who use accessibility evaluators that function before browser processing.
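The gap between the two processing phases can be illustrated by comparing tag inventories of a page as transmitted over HTTP and after client-side scripts have injected content. A minimal stdlib sketch (the example HTML strings are fabricated; QualWeb itself evaluates WCAG 2.0 techniques, which this does not attempt):

```python
from collections import Counter
from html.parser import HTMLParser

class TagCounter(HTMLParser):
    """Counts element tags: a crude proxy for comparing DOM trees."""

    def __init__(self):
        super().__init__()
        self.tags = Counter()

    def handle_starttag(self, tag, attrs):
        self.tags[tag] += 1

def tag_histogram(html):
    parser = TagCounter()
    parser.feed(html)
    return parser.tags

# Page as transmitted over HTTP (before browser processing)...
BEFORE = "<html><body><div id='app'></div></body></html>"
# ...and after scripts injected content (after browser processing).
AFTER = ("<html><body><div id='app'><ul><li>item</li><li>item</li></ul>"
         "<img src='chart.png'></div></body></html>")

def injected_tags(before, after):
    """Tags present only after client-side processing, i.e. invisible to an
    evaluator that only inspects the raw HTTP response."""
    return tag_histogram(after) - tag_histogram(before)
```

Here the `img` element (which WCAG would test, e.g. for a text alternative) never appears in the pre-processing phase, so an evaluator working on the raw response would miss it entirely.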
Adding Processing Functionality to the Sensor Web
NASA Astrophysics Data System (ADS)
Stasch, Christoph; Pross, Benjamin; Jirka, Simon; Gräler, Benedikt
2017-04-01
The Sensor Web allows discovering, accessing and tasking different kinds of environmental sensors in the Web, ranging from simple in-situ sensors to remote sensing systems. However, (geo-)processing functionality needs to be applied to integrate data from different sensor sources and to generate higher level information products. Yet, a common standardized approach for processing sensor data in the Sensor Web is still missing and the integration differs from application to application. Standardizing not only the provision of sensor data, but also the processing facilitates sharing and re-use of processing modules, enables reproducibility of processing results, and provides a common way to integrate external scalable processing facilities or legacy software. In this presentation, we provide an overview on on-going research projects that develop concepts for coupling standardized geoprocessing technologies with Sensor Web technologies. At first, different architectures for coupling sensor data services with geoprocessing services are presented. Afterwards, profiles for linear regression and spatio-temporal interpolation of the OGC Web Processing Services that allow consuming sensor data coming from and uploading predictions to Sensor Observation Services are introduced. The profiles are implemented in processing services for the hydrological domain. Finally, we illustrate how the R software can be coupled with existing OGC Sensor Web and Geoprocessing Services and present an example, how a Web app can be built that allows exploring the results of environmental models in an interactive way using the R Shiny framework. All of the software presented is available as Open Source Software.
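The linear-regression profile mentioned above essentially wraps an ordinary least-squares fit over time-stamped observations. A minimal sketch of that core computation (the function names are illustrative, not the profile's actual interface):

```python
def linear_fit(times, values):
    """Ordinary least squares for y = a + b*t: the kind of computation a
    linear-regression WPS profile might expose over sensor observations."""
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(values) / n
    sxy = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, values))
    sxx = sum((t - mean_t) ** 2 for t in times)
    b = sxy / sxx           # slope (trend per unit time)
    a = mean_y - b * mean_t  # intercept
    return a, b

def predict(a, b, t):
    """Evaluate the fitted trend at time t, e.g. to upload a prediction."""
    return a + b * t
```

In the architectures described above, the inputs would arrive from a Sensor Observation Service (GetObservation) and the fitted predictions would be uploaded back (InsertObservation), with the WPS process mediating between the two.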
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.
2011-01-01
Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
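The kind of spatial statistic such a processing service computes, for example an area-weighted mean of gridded data over a polygon, can be sketched as follows. This is a hypothetical fragment, not the Geo Data Portal API; the function name and inputs are illustrative:

```python
def area_weighted_mean(cells):
    """Area-weighted mean of grid-cell values.

    cells: list of (value, area_fraction) pairs, where area_fraction
    is how much of each grid cell the polygon of interest covers.
    """
    total_area = sum(a for _, a in cells)
    if total_area == 0:
        raise ValueError("polygon intersects no cells")
    return sum(v * a for v, a in cells) / total_area

# Two cells fully inside the polygon, one half-covered
print(area_weighted_mean([(10.0, 1.0), (20.0, 1.0), (30.0, 0.5)]))  # → 18.0
```

Weighting by intersected area rather than simple cell averaging is what makes the derived statistics consistent with GIS-computed zonal statistics.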
Process property studies of melt blown thermoplastic polyurethane polymers
NASA Astrophysics Data System (ADS)
Lee, Youn Eung
The primary goal of this research was to determine optimum processing conditions to produce commercially acceptable melt blown (MB) thermoplastic polyurethane (TPU) webs. The 6-inch MB line and the 20-inch wide Accurate Products MB pilot line at the Textiles and Nonwovens Development Center (TANDEC), The University of Tennessee, Knoxville, were utilized for this study. The MB TPU trials were performed in four phases: Phase 1 focused on the envelope of MB operating conditions for different TPU polymers; Phase 2 focused on the production of commercially acceptable MB TPU webs; Phase 3 focused on the optimization of the processing conditions of MB TPU webs and the determination of significant relationships between processing parameters and web properties using statistical analyses. Based on the first three phases, a more extensive study of fiber and web formation in the MB TPU process was made, and a multiple linear regression model of the MB TPU process versus properties was developed in Phase 4. In conclusion, the basic MB process was fundamentally valid for TPU; however, the process was more complicated for TPU than for PP, because the web structures and properties of MB TPUs are very sensitive to MB process conditions. Furthermore, different TPU grades responded very differently to MB processing and exhibited different web structures and properties. In Phase 3 and Phase 4, fiber diameters of less than 5 μm were produced from TPU237, TPU245 and TPU280 pellets, and the mechanical strengths of the MB TPU webs, including tensile strength, tear strength, abrasion resistance and tensile elongation, were notably good. In addition, the statistical model showed useful interaction trends between processing parameters and the properties of MB TPU webs. Die and air temperature showed multicollinearity problems, and fiber diameter was notably affected by air flow rate, throughput and die/air temperature. 
It was also shown that most of the MB TPU web properties including mechanical strength, air permeability and fiber diameters were affected by air velocity and die temperature.
Fuchs, Lynn S.; Gilbert, Jennifer K.; Fuchs, Douglas; Seethaler, Pamela M.; Martin, BrittanyLee N.
2018-01-01
This study was designed to deepen insights on whether word-problem (WP) solving is a form of text comprehension (TC) and on the role of language in WPs. A sample of 325 second graders, representing high, average, and low reading and math performance, was assessed on (a) start-of-year TC, WP skill, language, nonlinguistic reasoning, working memory, and foundational skill (word identification, arithmetic) and (b) year-end WP solving, WP-language processing (understanding WP statements, without calculation demands), and calculations. Multivariate, multilevel path analysis, accounting for classroom and school effects, indicated that TC was a significant and comparably strong predictor of all outcomes. Start-of-year language was a significantly stronger predictor of both year-end WP outcomes than of calculations, whereas start-of-year arithmetic was a significantly stronger predictor of calculations than of either WP measure. Implications are discussed in terms of WP solving as a form of TC and a theoretically coordinated approach, focused on language, for addressing TC and WP-solving instruction. PMID:29643723
Optimizing Crawler4j using MapReduce Programming Model
NASA Astrophysics Data System (ADS)
Siddesh, G. M.; Suresh, Kavya; Madhuri, K. Y.; Nijagal, Madhushree; Rakshitha, B. R.; Srinivasa, K. G.
2017-06-01
The World Wide Web is a decentralized system consisting of a repository of information in the form of web pages. These web pages act as a source of information or data in the present analytics world. Web crawlers are used to extract useful information from web pages for different purposes. First, they are used in web search engines, where web pages are indexed to form a corpus of information that users can query. Second, they are used for web archiving, where web pages are stored for later analysis. Third, they can be used for web mining, where web pages are monitored for copyright purposes. The amount of information processed by a web crawler can be increased by exploiting the capabilities of modern parallel processing technologies. To address the parallelism and throughput of crawling, this work proposes to optimize Crawler4j using the Hadoop MapReduce programming model by parallelizing the processing of large input data. Crawler4j is a web crawler that retrieves useful information about the pages it visits. Coupling Crawler4j with the data and computational parallelism of the Hadoop MapReduce programming model improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements in performance and throughput. Hence the proposed approach carves out a new methodology for optimizing web crawling by achieving significant performance gains.
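The map/reduce decomposition that underlies this parallelization can be sketched in plain Python; the real system would use the Hadoop Java API, and this toy merely counts tokens across crawled pages:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc_id, text):
    """Map step: emit (token, 1) pairs for every token on a crawled page."""
    return [(token.lower(), 1) for token in text.split()]

def reduce_phase(pairs):
    """Reduce step: sum counts per token, as Hadoop does after shuffling."""
    counts = defaultdict(int)
    for token, n in pairs:
        counts[token] += n
    return dict(counts)

pages = {"p1": "web crawler indexes the web", "p2": "crawler throughput"}
mapped = chain.from_iterable(map_phase(doc, text) for doc, text in pages.items())
print(reduce_phase(mapped)["web"])  # → 2
```

In Hadoop the shuffle between the two phases is what distributes the work across nodes; here it is simulated by flattening all mapper outputs into one stream.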
Building Student-Centered Web Sites in the K12 Classroom.
ERIC Educational Resources Information Center
Hall, Alison; Basile, Brigitte
This paper examines the process of constructing a student-centered World Wide Web site and provides recommendations for improving this process. In the project, preservice teachers instructed the fifth grade students about how to design and develop a Web site on weather. The topics of the sessions included Internet ethics, using the Web,…
An Exploratory Study of User Searching of the World Wide Web: A Holistic Approach.
ERIC Educational Resources Information Center
Wang, Peiling; Tenopir, Carol; Laymman, Elizabeth; Penniman, David; Collins, Shawn
1998-01-01
Examines Web users' behaviors and needs and tests a methodology for studying users' interaction with the Web. A process-tracing technique, together with tests of cognitive style, anxiety levels, and self-report computer experience, provided data on how users interact with the Web in the process of finding factual information. (Author/AEF)
The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment
ERIC Educational Resources Information Center
Saat, Rohaida Mohd
2004-01-01
Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…
Silicon web process development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.
1981-01-01
The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.
The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data
NASA Technical Reports Server (NTRS)
Tesoriero, Roseanne; Zelkowitz, Marvin
1997-01-01
Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, in order to improve, interpreting the data is just as important as the collection of data. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes being distributed among several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data from this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. The WebME system will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.
Kavanaugh, Arthur; Smolen, Josef S; Emery, Paul; Purcaru, Oana; Keystone, Edward; Richard, Lance; Strand, Vibeke; van Vollenhoven, Ronald F
2009-11-15
To assess the impact of certolizumab pegol (CZP), a novel PEGylated anti-tumor necrosis factor, in combination with methotrexate (MTX) on productivity outside and within the home, and on participation in family, social, and leisure activities in adult patients with rheumatoid arthritis (RA). The efficacy and safety of CZP (200 mg and 400 mg) plus MTX were assessed in 2 phase III, multicenter, double-blind, placebo-controlled trials (Rheumatoid Arthritis Prevention of Structural Damage [RAPID] 1 and RAPID 2). The novel, validated, RA-specific Work Productivity Survey (WPS-RA) was used to assess workplace and home productivity. WPS-RA responses were collected at baseline and every 4 weeks until withdrawal/study completion. At baseline, 41.6% and 39.8% of subjects were employed outside the home in RAPID 1 and RAPID 2, respectively. Compared with placebo plus MTX, CZP plus MTX significantly reduced work absenteeism and presenteeism among patients working outside the home. Significant reductions in the number of household days lost, household days with productivity reduced by ≥50%, and days lost due to RA for participation in family, social, and leisure activities were reported by patients in active treatment relative to placebo plus MTX. Improvements in all measures were observed with CZP plus MTX as early as week 4, and maintained until the study end (12 months in RAPID 1, 6 months in RAPID 2). Findings were consistent with clinical improvements with CZP plus MTX in both trials. CZP plus MTX improved productivity outside and within the home and resulted in more participation in social activities compared with placebo plus MTX. These observations suggest that considerable indirect cost gains might be achieved with this therapeutic agent in RA.
Minett, Geoffrey M.; Bach, Aaron J. E.; Zietek, Stephanie A.; Stewart, Kelly L.; Stewart, Ian B.
2018-01-01
Objective The present study aimed to compare a range of cooling methods possibly utilised by occupational workers, focusing on their effect on body temperature, perception and manual dexterity. Methods Ten male participants completed eight trials involving 30 min of seated rest followed by 30 min of cooling or a no-cooling control (CON) (34°C, 58% relative humidity). The cooling methods utilised were: ice cooling vest (CV0), phase change cooling vest melting at 14°C (CV14), evaporative cooling vest (CVEV), arm immersion in 10°C water (AI), portable water-perfused suit (WPS), heliox inhalation (HE) and ice slushy ingestion (SL). Immediately before and after cooling, participants were assessed for fine (Purdue pegboard task) and gross (grip and pinch strength) manual dexterity. Rectal and skin temperature, as well as thermal sensation and comfort, were monitored throughout. Results Compared with CON, SL was the only method to reduce rectal temperature (P = 0.012). All externally applied cooling methods reduced skin temperature (P<0.05), with CV0 producing the lowest skin temperature of all cooling methods. Participants felt cooler with CV0, CV14, WPS, AI and SL (P<0.05). AI significantly impaired Purdue pegboard performance (P = 0.001) but did not affect grip or pinch strength (P>0.05). Conclusion The present study observed that ice ingestion or ice applied to the skin produced the greatest effect on rectal and skin temperature, respectively. AI should not be utilised if workers require subsequent fine manual dexterity. These results will help inform future studies investigating appropriate pre-cooling methods for the occupational worker. PMID:29357373
Regional thermal comfort zone in males and females.
Ciuha, Ursa; Mekjavic, Igor B
2016-07-01
Skin regions differ in their sensitivity to temperature stimuli. The present study examined whether such regional differences were also evident in the perception of thermal comfort. Regional thermal comfort was assessed in males (N=8) and females (N=8) by having them regulate the temperature of the water delivered to a water-perfused suit (WPS) within a temperature range considered thermally comfortable. In separate trials, subjects regulated the temperature of the WPS, or of specific regions of the suit covering different skin areas (arms, legs, front torso and back torso). In the absence of subjective temperature regulation (TR), the temperature changed in a sinusoidal manner from 10°C to 50°C; by depressing a switch and reversing the direction of the temperature change at the limits of the thermal comfort zone (TCZ), each subject defined the TCZ for each body region investigated. The range of regulated temperatures did not differ between genders or skin regions. Local Tsk at the lower and upper limits of the TCZ was similar for both genders. A higher (p<0.05) local Tsk was preferred for the arms (35.4±2.1°C) compared to the other regions (legs: 34.4±5.4°C, front torso: 34.6±2.8°C, back torso: 34.3±6.6°C), irrespective of gender. In thermally comfortable conditions, the well-established regional differences in thermosensitivity are not reflected in the TCZ, with similar temperature preferences in both genders. Thermal comfort of different skin regions and of the overall body is not achieved at a single skin temperature, but over a range of temperatures, defined as the TCZ. Copyright © 2016 Elsevier Inc. All rights reserved.
Lamberti, Monica; Ratti, Gennaro; Gerardi, Donato; Capogrosso, Cristina; Ricciardi, Gianfranco; Fulgione, Cosimo; Latte, Salvatore; Tammaro, Paolo; Covino, Gregorio; Nienhaus, Albert; Grazillo, Elpidio Maria; Mallardo, Mario; Capogrosso, Paolo
2016-01-01
Coronary heart disease is frequent in the working-age population. Traditional outcomes, such as mortality and hospital readmission, are useful for evaluating prognosis. Fit-for-work is an emerging outcome with clinical as well as socioeconomic significance. We describe the possible benefit of a cardiac rehabilitation (CR) program for return to work (RTW) after acute coronary syndrome (ACS). We evaluated 204 patients with recent ACS. They were divided into 4 groups on the basis of their occupational work load: very light (VL), light (L), moderate (M), and heavy (H). Work-related outcomes were assessed with the Work Performance Scale (WPS) of the Functional Status Questionnaire and as "days missed from work" (DMW) in the previous 4 weeks. The variables considered for outcomes were percent ejection fraction, functional capacity expressed in metabolic equivalents (METs), and participation or non-participation in the CR program (CR+ and CR-). One hundred thirty (66%) patients took part in the CR program. Total WPS scores for CR+ and CR- subgroups were VL group: 18±4 vs. 14±4 (p < 0.001), L group: 18±3 vs. 14±3 (p < 0.0001), M group: 19±3 vs. 16±3 (p < 0.003), and H group: 20±4 vs. 17±3 (p < 0.006). Fewer DMW were reported by the CR+ group. Non-participation in CR was a consistent cause of poorer work-related outcomes. Our findings indicate that CR and occupational counseling play a very important role in worker recovery and subsequent reintegration in the workplace, in particular among clerical workers. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, X; Liu, L; Xing, L
Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and plan evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: a web server, an image server and a computation server. The independent servers communicate with each other through HTTP requests. The web server is the key component: it provides visualizations and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open-source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send and receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, and can process data directly or via a C++ program DLL. Results: This software platform runs on a 32-core CPU server that virtually hosts the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize the images and RT structures belonging to that patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation servers. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. 
This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and has exhibited potential for future cloud-based radiotherapy.
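The three-server split can be sketched with a toy computation server: an HTTP handler that accepts a JSON payload and applies a trivial image operation. The endpoint, payload shape, and inversion task are illustrative assumptions, not the WIPPEP protocol:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def process(payload):
    """Toy computation-server task: invert 8-bit pixel values."""
    return {"pixels": [255 - p for p in payload["pixels"]]}

class ComputeHandler(BaseHTTPRequestHandler):
    """Accepts a JSON POST from the web server, runs the computation,
    and returns the result as JSON."""

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        body = json.dumps(process(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("127.0.0.1", 8081), ComputeHandler).serve_forever()
print(process({"pixels": [0, 128, 255]}))  # → {'pixels': [255, 127, 0]}
```

Because each server only speaks HTTP, the computation backend can be swapped (Delphi, Python, PHP in the paper) without touching the web or image servers.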
NMRPro: an integrated web component for interactive processing and visualization of NMR spectra.
Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi
2016-07-01
The popularity of NMR spectroscopy in metabolomics and natural products has driven the development of an array of NMR spectral analysis tools and databases. In particular, web applications have become widely used because they are platform-independent and easy to extend through reusable web components. Currently available web applications provide analysis of NMR spectra, but they still lack the necessary processing and interactive visualization functionalities. To overcome these limitations, we present NMRPro, a web component that can be easily incorporated into current web applications, enabling easy-to-use online interactive processing and visualization. NMRPro integrates server-side processing with client-side interactive visualization through three parts: a Python package to efficiently process large NMR datasets on the server side, a Django app managing server-client interaction, and SpecdrawJS for client-side interactive visualization. Demo and installation instructions are available at http://mamitsukalab.org/tools/nmrpro/ mohamed@kuicr.kyoto-u.ac.jp Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
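Server-side spectral processing of the kind the Python package performs can be illustrated with a toy smoothing routine. This is a hypothetical sketch, not the NMRPro API; the function name and window scheme are illustrative:

```python
def smooth_spectrum(intensities, window=3):
    """Centered moving-average smoothing of a 1-D spectral trace,
    shrinking the window at the edges so no points are lost."""
    half = window // 2
    out = []
    for i in range(len(intensities)):
        lo, hi = max(0, i - half), min(len(intensities), i + half + 1)
        seg = intensities[lo:hi]
        out.append(sum(seg) / len(seg))
    return out

print(smooth_spectrum([0.0, 3.0, 0.0, 0.0]))  # → [1.5, 1.0, 1.0, 0.0]
```

In the NMRPro architecture a routine like this would run server-side over the full dataset, with only the processed trace shipped to the browser for interactive display.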
Supporting Reflective Activities in Information Seeking on the Web
NASA Astrophysics Data System (ADS)
Saito, Hitomi; Miwa, Kazuhisa
Recently, many opportunities have emerged to use the Internet in daily life and in classrooms. However, with the growth of the World Wide Web (Web), it is becoming increasingly difficult to find target information on the Internet. In this study, we explore a method for developing users' information-seeking abilities on the Web and construct a search process feedback system that supports reflective activities during information seeking on the Web. Reflection is defined as a cognitive activity for monitoring, evaluating, and modifying one's thinking and processes. In the field of the learning sciences, many researchers have investigated reflective activities that facilitate learners' problem solving and deep understanding. The characteristics of this system are: (1) to show learners' search processes on the Web, described on the basis of a cognitive schema, and (2) to prompt learners to reflect on their search processes. We expect that users of this system can reflect on their search processes by receiving information about their own search processes from the system, and that these reflective activities help them deepen their understanding of information-seeking activities. We conducted an experiment to investigate the effects of our system. The experimental results confirmed that (1) the system actually facilitated the learners' reflective activities by providing process visualization and prompts, and (2) the learners who reflected on their search processes more actively understood their own search processes more deeply.
Lee, Eunjoo; Noh, Hyun Kyung
2016-01-01
To examine the effects of a web-based nursing process documentation system on the stress and anxiety of nursing students during their clinical practice. A quasi-experimental design was employed. The experimental group (n = 110) used a web-based nursing process documentation program for their case reports as part of the assignments for a clinical practicum, whereas the control group (n = 106) used traditional paper-based case reports. Stress and anxiety levels were measured with a numeric rating scale before, 2 weeks after, and 4 weeks after using the web-based nursing process documentation program during the clinical practicum. The data were analyzed using descriptive statistics, t tests, chi-square tests, and repeated-measures analyses of variance. Nursing students who used the web-based nursing process documentation program showed significantly lower levels of stress and anxiety than the control group. A web-based nursing process documentation program could be used to reduce the stress and anxiety of nursing students during a clinical practicum, which ultimately would benefit nursing students by increasing the satisfaction with and effectiveness of the clinical practicum. © 2015 NANDA International, Inc.
VisSearch: A Collaborative Web Searching Environment
ERIC Educational Resources Information Center
Lee, Young-Jin
2005-01-01
VisSearch is a collaborative Web searching environment intended for sharing Web search results among people with similar interests, such as college students taking the same course. It facilitates students' Web searches by visualizing various Web searching processes. It also collects the visualized Web search results and applies an association rule…
Web site development: applying aesthetics to promote breast health education and awareness.
Thomas, Barbara; Goldsmith, Susan B; Forrest, Anne; Marshall, Renée
2002-01-01
This article describes the process of establishing a Web site as part of a collaborative project using visual art to promote breast health education. The need for a more "user-friendly," comprehensive breast health Web site that is aesthetically rewarding was identified after an analysis of current Web sites available through the World Wide Web. Two predetermined sets of criteria, accountability and aesthetics, were used to analyze these sites and to generate ideas for creating a breast health education Web site using visual art. Results of the analyses are included, as well as factors to consider when incorporating them into a Web site. The process specified is thorough and can be applied to establish a Web site that is aesthetically rewarding and informative for a variety of educational purposes.
On-demand server-side image processing for web-based DICOM image display
NASA Astrophysics Data System (ADS)
Sakusabe, Takaya; Kimura, Michio; Onogi, Yuzo
2000-04-01
Low-cost image delivery is needed in modern networked hospitals. If a hospital has hundreds of clients, the cost of client systems is a big problem. Naturally, a Web-based system is the most effective solution. But a Web browser alone cannot display medical images with certain image processing applied, such as a lookup table transformation. We developed a Web-based medical image display system using a Web browser and on-demand server-side image processing. All images displayed on a Web page are generated from DICOM files on a server and delivered on demand. User interaction on the Web page is handled by a client-side scripting technology such as JavaScript. This combination provides the look and feel of an imaging workstation, not only in functionality but also in speed. Real-time update of images while tracing mouse motion is achieved in the Web browser without any client-side image processing, which might otherwise require client-side plug-in technology such as Java applets or ActiveX. We tested the performance of the system in three cases: a single client, a small number of clients on a fast network, and a large number of clients on a normal-speed network. The results show that the communication overhead is very slight and that the system scales well with the number of clients.
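The lookup-table transformation mentioned above, mapping stored pixel values to display values through a linear window/level function, can be sketched as follows. The parameter values are illustrative; a real server would derive them from the DICOM Window Center/Width attributes:

```python
def window_level_lut(stored, center, width):
    """Map stored pixel values (e.g. 12-bit) to 8-bit display values
    with a linear window/level lookup-table transformation."""
    lo = center - width / 2.0
    lut = []
    for value in stored:
        display = (value - lo) / width * 255.0
        lut.append(int(min(255, max(0, round(display)))))
    return lut

# Window centered at 1024 with width 2048 (illustrative values)
print(window_level_lut([0, 1024, 4095], center=1024, width=2048))  # → [0, 128, 255]
```

Because the transform runs server-side, the browser only ever receives finished 8-bit images, which is what keeps the client as thin as a plain Web page.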
Circadian Patterns of Wikipedia Editorial Activity: A Demographic Analysis
Yasseri, Taha; Sumi, Robert; Kertész, János
2012-01-01
Wikipedia (WP), as a collaborative, dynamical system of humans, is an appropriate subject of social studies. Each single action of the members of this society, i.e., the editors, is well recorded and accessible. Using the cumulative data of 34 Wikipedias in different languages, we try to characterize and find the universalities and differences in the temporal activity patterns of editors. Based on these data, we estimate the geographical distribution of editors for each WP across the globe. Furthermore, we also clarify the differences among different groups of WPs, which originate in the variance of cultural and social features of the communities of editors. PMID:22272279
Reilly, Niamh
2018-05-01
The recent unprecedented focus on ending impunity for conflict-related sexual violence (CRSV) is positive in many respects. However, it has narrowed the scope of Security Council Resolution 1325 and the women, peace, and security (WPS) agenda it established in 2000. Through a critical discursive genealogy of the interrelation of two UN agendas, the protection of civilians in armed conflict and women, peace, and security, the author traces how CRSV emerged as the defining issue of the latter, while the transformative imperative of making women's participation central to every UN endeavor for peace and security has failed to gain traction.
Federal standards and procedures for the National Watershed Boundary Dataset (WBD)
2009-03-11
Terminology, definitions, and procedural information are provided to ensure uniformity in hydrologic unit boundaries, names, and numerical codes. Detailed standards and specifications for data are included. The document also includes discussion of objectives, communications required for revising the data resolution in the United States and the Caribbean, as well as final review and data-quality criteria. Instances of unusual landforms or artificial features that affect the hydrologic units are described with metadata standards. Up-to-date information and availability of the hydrologic units are listed at http://www.nrcs.usda.gov/wps/portal/nrcs/detail/national/technical/nra/dma/?&cid=nrcs143_021630/.
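Hydrologic unit codes are hierarchical, with each two-digit pair refining its parent unit. A sketch of splitting a 12-digit code into its nested parents follows; the level names reflect common WBD usage and are given here for illustration:

```python
# Two digits are appended at each level of the hierarchy; the level
# names below follow common Watershed Boundary Dataset usage.
LEVELS = ["region", "subregion", "basin", "subbasin",
          "watershed", "subwatershed"]

def huc_hierarchy(huc12):
    """Split a 12-digit hydrologic unit code into its nested parent codes."""
    if len(huc12) != 12 or not huc12.isdigit():
        raise ValueError("expected a 12-digit numeric HUC")
    return {name: huc12[: 2 * (i + 1)] for i, name in enumerate(LEVELS)}

print(huc_hierarchy("180400080101")["subbasin"])  # → 18040008
```

This prefix structure is what makes the numerical codes self-describing: any unit's code contains the codes of every unit that encloses it.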
Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A
2011-11-29
Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making in public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. The data used in this study are based on tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and the geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process for health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities for handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. 
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
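The spatial density mapping at the core of these geoprocessing services can be reduced to binning geocoded case locations into grid cells. The sketch below is a minimal stand-in; the coordinates and cell size are illustrative, not the study's actual projection or kernel method:

```python
from collections import Counter

def density_grid(points, cell_size):
    """Bin geocoded case locations into grid-cell counts, the raw
    ingredient of a spatial density map."""
    counts = Counter()
    for x, y in points:
        counts[(int(x // cell_size), int(y // cell_size))] += 1
    return counts

# Three geocoded cases; two fall in the same grid cell
cases = [(0.2, 0.3), (0.4, 0.1), (1.5, 0.2)]
print(density_grid(cases, cell_size=1.0)[(0, 0)])  # → 2
```

A production service would smooth these raw counts (e.g. with a kernel density estimate) and render them as a map overlay for the client viewer.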
2011-01-01
Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used on this study is based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application to the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective to identify the most affected areas and its spatial impact. 
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392
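The density surfaces that such REST geoprocessing services return can be sketched as a kernel density estimate over the geocoded case locations. The grid size, bandwidth, and coordinates below are illustrative assumptions, not the paper's implementation:

```python
import math

def density_grid(points, nx, ny, bandwidth):
    """Gaussian kernel density surface over geocoded case locations.

    points    -- list of (x, y) coordinates, here on a unit square
    nx, ny    -- grid resolution
    bandwidth -- kernel bandwidth, in the same units as the coordinates
    """
    grid = [[0.0] * nx for _ in range(ny)]
    for j in range(ny):
        for i in range(nx):
            # evaluate the kernel sum at each cell center
            cx, cy = (i + 0.5) / nx, (j + 0.5) / ny
            for (px, py) in points:
                d2 = (cx - px) ** 2 + (cy - py) ** 2
                grid[j][i] += math.exp(-d2 / (2 * bandwidth ** 2))
    return grid

# Hypothetical geocoded TB cases: a small cluster plus one outlier
cases = [(0.2, 0.3), (0.25, 0.35), (0.8, 0.7)]
surface = density_grid(cases, 10, 10, 0.1)
```

A REST wrapper would expose this as, e.g., a GET request whose query parameters select the bandwidth and extent, returning the grid for map rendering.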
The Four Levels of Web Site Development Expertise.
ERIC Educational Resources Information Center
Ingram, Albert L.
2000-01-01
Discusses the design of Web pages and sites and proposes a four-level model of Web development expertise that can serve as a curriculum overview or as a plan for an individual's professional development. Highlights include page design, media use, client-side processing, server-side processing, and site structure. (LRW)
ERIC Educational Resources Information Center
Kuiper, Els; Volman, Monique; Terwel, Jan
2005-01-01
The use of the Web in K-12 education has increased substantially in recent years. The Web, however, does not support the learning processes of students as a matter of course. In this review, the authors analyze what research says about the demands that the use of the Web as an information resource in education makes on the support and supervision…
[A solution for display and processing of DICOM images in web PACS].
Xue, Wei-jing; Lu, Wen; Wang, Hai-yang; Meng, Jian
2009-03-01
Java Applets are used to support DICOM images in an ordinary Web browser, thereby extending the medical image processing functions available in a Web PACS. The DICOM file format is first analyzed and a class is designed to extract the pixel data; two Applet classes are then designed, one to process the DICOM image and the other to display the image processed by the first. Both are embedded in the view page and communicate through the AppletContext object. The method described in this paper lets users display and process DICOM images directly in an ordinary Web browser, giving a Web PACS the advantages of the B/S model as well as those of the C/S model. Java Applets are the key to extending the browser's functionality in a Web PACS and provide a guideline for sharing medical images.
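The pixel-acquisition step starts from the DICOM file format itself. As a minimal sketch in Python (the paper uses Java Applets), with a synthetic byte string standing in for a real file, the Part 10 header check and a first data-element read look like this:

```python
import struct

def is_dicom_part10(data: bytes) -> bool:
    # A DICOM Part 10 file begins with a 128-byte preamble followed by "DICM"
    return len(data) >= 132 and data[128:132] == b"DICM"

def read_first_tag(data: bytes):
    # After "DICM", the file meta elements use explicit VR little endian:
    # 2-byte group number, 2-byte element number, then a 2-character VR code
    group, elem = struct.unpack("<HH", data[132:136])
    vr = data[136:138].decode("ascii")
    return (group, elem, vr)

# Synthetic example: preamble + magic + a (0002,0000) element with VR "UL"
blob = bytes(128) + b"DICM" + bytes.fromhex("02000000") + b"UL"
```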
NASA Technical Reports Server (NTRS)
1981-01-01
Liquid diffusion masks and liquid applied dopants to replace the CVD Silox masking and gaseous diffusion operations specified for forming junctions in the Westinghouse baseline process sequence for producing solar cells from dendritic web silicon were investigated. The baseline diffusion masking and drive processes were compared with those involving direct liquid applications to the dendritic web silicon strips. Attempts were made to control the number of variables by subjecting dendritic web strips cut from a single web crystal to both types of operations. Data generated reinforced earlier conclusions that efficiency levels at least as high as those achieved with the baseline back junction formation process can be achieved using liquid diffusion masks and liquid dopants. The deliveries of dendritic web sheet material and solar cells specified by the current contract were made as scheduled.
Analysis and Development of a Web-Enabled Planning and Scheduling Database Application
2013-09-01
Establishes an entity-relationship diagram for the desired process, constructs an operable database using MySQL, and provides a web-enabled interface for the population of the database. Keywords: development, design, process re-engineering, MySQL, structured query language (SQL), phpMyAdmin.
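The entity-relationship design can be sketched in SQL. The table and column names below are hypothetical, and sqlite3 stands in for the MySQL backend the thesis targets:

```python
import sqlite3

# Hypothetical entities for a planning and scheduling database; sqlite3 is
# used here only so the sketch is self-contained.
schema = """
CREATE TABLE task (
    task_id   INTEGER PRIMARY KEY,
    name      TEXT NOT NULL,
    duration  INTEGER NOT NULL        -- days
);
CREATE TABLE schedule (
    schedule_id INTEGER PRIMARY KEY,
    task_id     INTEGER NOT NULL REFERENCES task(task_id),
    start_date  TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
conn.execute("INSERT INTO task VALUES (1, 'site survey', 3)")
conn.execute("INSERT INTO schedule VALUES (1, 1, '2013-09-02')")
# The web-enabled interface would issue queries like this join
row = conn.execute(
    "SELECT t.name, s.start_date FROM task t JOIN schedule s USING (task_id)"
).fetchone()
```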
NASA Astrophysics Data System (ADS)
Paulraj, D.; Swamynathan, S.; Madhaiyan, M.
2012-11-01
Web Service composition has become indispensable as a single web service cannot satisfy complex functional requirements. Composition of services has received much interest to support business-to-business (B2B) or enterprise application integration. An important component of service composition is the discovery of relevant services. In Semantic Web Services (SWS), service discovery is generally achieved by using the service profile of the Ontology Web Language for Services (OWL-S). The profile of the service is a derived and concise description but not a functional part of the service. The information contained in the service profile is sufficient for atomic service discovery, but it is not sufficient for the discovery of composite semantic web services (CSWS). The purpose of this article is two-fold: first, to show that the process model is a better choice than the service profile for service discovery; and second, to facilitate the composition of inter-organisational CSWS by proposing a new composition method that uses process ontology. The proposed service composition approach uses an algorithm which performs a fine-grained match at the level of the atomic process rather than at the level of the entire service in a composite semantic web service. Many works carried out in this area have proposed solutions only for the composition of atomic services, and this article proposes a solution for the composition of composite semantic web services.
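A fine-grained match at the atomic-process level can be sketched as input/output coverage between a request and each advertised atomic process. The concept names and the dictionary representation are illustrative assumptions; a real OWL-S matcher would reason over ontology subsumption rather than exact set membership:

```python
def matches(request, process):
    # An atomic process matches when the requester can supply all of its
    # inputs and it produces everything the requester wants.
    return (set(process["inputs"]) <= set(request["available"])
            and set(request["wanted"]) <= set(process["outputs"]))

# Hypothetical atomic processes advertised by composite services
processes = [
    {"name": "GeocodeAddress", "inputs": {"Address"}, "outputs": {"Coordinates"}},
    {"name": "RouteFinder", "inputs": {"Coordinates"}, "outputs": {"Route"}},
]
request = {"available": {"Address"}, "wanted": {"Coordinates"}}
hits = [p["name"] for p in processes if matches(request, p)]
```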
NASA Astrophysics Data System (ADS)
Giraud, Francois
1999-10-01
This dissertation investigates the application of neural network theory to the analysis of a 4-kW Utility-interactive Wind-Photovoltaic System (WPS) with battery storage. The hybrid system comprises a 2.5-kW photovoltaic generator and a 1.5-kW wind turbine. The wind power generator produces power at variable speed and variable frequency (VSVF). The wind energy is converted into dc power by a controlled, three-phase, full-wave bridge rectifier. The PV power is maximized by a Maximum Power Point Tracker (MPPT), a dc-to-dc chopper switching at a frequency of 45 kHz. The whole dc power of both subsystems is stored in the battery bank or conditioned by a single-phase self-commutated inverter to be sold to the utility at a predetermined amount. First, the PV is modeled using an Artificial Neural Network (ANN). To reduce model uncertainty, the open-circuit voltage VOC and the short-circuit current ISC of the PV are chosen as model input variables of the ANN. These input variables have the advantage of incorporating the effects of the quantifiable and non-quantifiable environmental variants affecting the PV power. Then, a simplified way to predict accurately the dynamic responses of the grid-linked WPS to gusty winds using a Recurrent Neural Network (RNN) is investigated. The RNN is a single-output feedforward backpropagation network with external feedback, which allows past responses to be fed back to the network input. In the third step, a Radial Basis Functions (RBF) Network is used to analyze the effects of clouds on the Utility-Interactive WPS. Using the irradiance as input signal, the network models the effects of random cloud movement on the output current, the output voltage, and the output power of the PV system, as well as the electrical output variables of the grid-linked inverter. Fourthly, using the RNN, the combined effects of a random cloud and a wind gust on the system are analyzed. 
For short period intervals, the wind speed and the solar radiation are considered the sole sources of power, whose variations influence the system variables. Since both subsystems have different dynamics, their respective responses are expected to impact the whole system behavior differently. The dispatchability of the battery-supported system, as well as its stability and reliability during gusts and/or cloud passage, is also discussed. In the fifth step, the goal is to determine to what extent the overall power quality of the grid would be affected by a proliferation of Utility-interactive hybrid systems and whether recourse to bulk or individual filtering and voltage control is necessary. The final stage of the research includes a steady-state analysis of two years of operation (May 96--Apr 98) of the system, with a discussion of system reliability, of any loss-of-supply probability, and of the effects of the randomness in the wind and solar radiation upon the system design optimization.
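The RNN with external feedback described in the second step can be sketched as a single-output unit whose previous output is appended to the current input. The weights and the step-change "gust" signal below are illustrative, not fitted values from the dissertation:

```python
import math

def rnn_step(x_t, y_prev, w_x, w_y, b):
    """One step of a single-output network with external feedback:
    the previous output y_prev is fed back as an extra input."""
    return math.tanh(w_x * x_t + w_y * y_prev + b)

# Illustrative dynamic response to a 'gust': a step change in the input
y, trace = 0.0, []
for x in [0.0, 0.0, 1.0, 1.0, 1.0]:
    y = rnn_step(x, y, w_x=0.8, w_y=0.5, b=0.0)
    trace.append(y)
# trace rises gradually toward a steady state after the step, which is the
# kind of transient the RNN is trained to reproduce
```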
Randomized evaluation of a web based interview process for urology resident selection.
Shah, Satyan K; Arora, Sanjeev; Skipper, Betty; Kalishman, Summers; Timm, T Craig; Smith, Anthony Y
2012-04-01
We determined whether a web based interview process for resident selection could effectively replace the traditional on-site interview. For the 2010 to 2011 match cycle, applicants to the University of New Mexico urology residency program were randomized to participate in a web based interview process via Skype or a traditional on-site interview process. Both methods included interviews with the faculty, a tour of facilities and the opportunity to ask current residents any questions. To maintain fairness the applicants were then reinterviewed via the opposite process several weeks later. We assessed comparative effectiveness, cost, convenience and satisfaction using anonymous surveys largely scored on a 5-point Likert scale. Of 39 total participants (33 applicants and 6 faculty) 95% completed the surveys. The web based interview was less costly to applicants (mean $171 vs $364, p=0.05) and required less time away from school (10% missing 1 or more days vs 30%, p=0.04) compared to traditional on-site interview. However, applicants perceived the web based interview process as less effective than traditional on-site interview, with a mean 6-item summative effectiveness score of 21.3 vs 25.6 (p=0.003). Applicants and faculty favored continuing the web based interview process in the future as an adjunct to on-site interviews. Residency interviews can be successfully conducted via the Internet. The web based interview process reduced costs and improved convenience. The findings of this study support the use of videoconferencing as an adjunct to traditional interview methods rather than as a replacement. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Interactivity, Information Processing, and Learning on the World Wide Web.
ERIC Educational Resources Information Center
Tremayne, Mark; Dunwoody, Sharon
2001-01-01
Examines the role of interactivity in the presentation of science news on the World Wide Web. Proposes and tests a model of interactive information processing that suggests that characteristics of users and Web sites influence interactivity, which influences knowledge acquisition. Describes use of a think-aloud method to study participants' mental…
Client-Side Event Processing for Personalized Web Advertisement
NASA Astrophysics Data System (ADS)
Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad
The market for Web advertisement is continuously growing and, correspondingly, the number of approaches that can be used for realizing Web advertisement is increasing. However, current approaches fail to generate truly personalized ads for the current Web user visiting a particular piece of Web content. They mainly try to develop a profile based on the content of that Web page or on a long-term user profile, without taking the user's current preferences into account. We argue that by discovering a user's interest from his current Web behavior we can support the process of ad generation, especially the relevance of an ad for the user. In this paper we present the conceptual architecture and implementation of such an approach. The approach is based on the extraction of simple events from the user's interaction with a Web page and their combination in order to discover the user's interests. We use semantic technologies in order to build such an interpretation out of many simple events. We present results from preliminary evaluation studies. The main contribution of the paper is a very efficient, semantic-based client-side architecture for generating and combining Web events. The architecture ensures the agility of the whole advertisement system by processing complex events on the client. In general, this work contributes to the realization of new, event-driven applications for the (Semantic) Web.
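Combining simple interaction events into a composite interest event can be sketched as a threshold rule over an event stream. The event types, topics, and the rule itself are illustrative assumptions; the paper's approach additionally uses semantic technologies to interpret the events:

```python
def detect_interest(events, topic, threshold=2):
    """Raise a composite 'interest' signal when enough simple events
    (hover, click, scroll) touch content about a single topic."""
    score = sum(1 for e in events
                if e["topic"] == topic
                and e["type"] in {"hover", "click", "scroll"})
    return score >= threshold

# Hypothetical stream of simple events extracted from one page visit
stream = [
    {"type": "hover", "topic": "cars"},
    {"type": "click", "topic": "cars"},
    {"type": "hover", "topic": "travel"},
]
```

An ad server consulted from the client would then be asked for ads matching the detected interest rather than only the page content.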
Silicon web process development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.
1977-01-01
Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for a N(+) -P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.
An Educational Tool for Browsing the Semantic Web
ERIC Educational Resources Information Center
Yoo, Sujin; Kim, Younghwan; Park, Seongbin
2013-01-01
The Semantic Web is an extension of the current Web where information is represented in a machine processable way. It is not separate from the current Web and one of the confusions that novice users might have is where the Semantic Web is. In fact, users can easily encounter RDF documents that are components of the Semantic Web while they navigate…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denton, Mark A.
Under Task Order 22 of the industry Advisory and Assistance Services (A&AS) Contract to the Department of Energy (DOE) DE-NE0000291, AREVA has been tasked with providing assistance with engineering, analysis, cost estimating, and design support of a system for disposal of radioactive wastes in deep boreholes (without the use of radioactive waste). As part of this task order, AREVA was requested, through a letter of technical direction, to evaluate Sandia National Laboratory’s (SNL’s) waste package borehole emplacement system concept recommendation using input from DOE and SNL. This summary review report (SRR) documents this evaluation, with its focus on the primary input document titled: “Deep Borehole Field Test Specifications/M2FT-15SN0817091” Rev. 1 [1], hereafter referred to as the “M2 report.” The M2 report focuses on the conceptual design development for the Deep Borehole Field Test (DBFT), mainly the test waste packages (WPs) and the system for demonstrating emplacement and retrieval of those packages in the Field Test Borehole (FTB). This SRR follows the same outline as the M2 report, which allows for easy correlation between AREVA’s review comments, discussion, potential proposed alternatives, and path forward with information established in the M2 report. AREVA’s assessment focused on three primary elements of the M2 report: the conceptual design of the WPs proposed for deep borehole disposal (DBD), the mode of emplacement of the WP into DBD, and the conceptual design of the DBFT. AREVA concurs with the M2 report’s selection of the wireline emplacement mode specifically over the drill-string emplacement mode and generically over alternative emplacement modes. Table 5-1 of this SRR compares the pros and cons of each emplacement mode considered viable for DBD. 
The primary positive characteristics of the wireline emplacement mode include: (1) considered a mature technology; (2) operations are relatively simple; (3) the probability of a radiological release due to off-normal events is relatively low; (4) costs are relatively low; and (5) maintenance activities are relatively simple. The primary drawback associated with the wireline emplacement mode for DBD is the number of emplacement trips into the borehole, which results in a relatively higher probability for a drop event. Fortunately, the WPs can be engineered with impact limiters that will minimize the likelihood of a breach of the WP due to a drop. The WP designs presented in the M2 report appear to be focused on compatibility with the drill-string emplacement mode (e.g., the threaded connections). With the recommendation that the wireline emplacement mode be utilized for the DBFT, some changes may be warranted to these WPs. For example, the development of a WP release connection that is more reliable than the currently credited connection, which is considered to have a high failure probability, and the integration of an impact limiter into its design. The M2 report states the engineering demonstration of the DBFT will occur in the FTB over a 4-year period. AREVA recommends development and testing of the WP emplacement handling equipment occur separately (but concurrently, if not earlier) from the FTB at a mock-up facility. The separation of this activity would prevent schedule interference between the science and engineering thrusts of the project. Performing tests in a mock-up facility would allow additional control and observation compared to the FTB. The mock-up facility could also be utilized as a training facility for future operations. Terminal velocity and impact limiter testing would require the FTB, since these areas would be difficult to reproduce in a limited-depth mock-up. 
Although only at the end of the conceptual stage of design development, DBD appears to be a viable solution for some waste forms produced by the nuclear industry. However, regulatory requirements have yet to be established for pre- and post-closure performance of DBD and should be established as soon as possible. Some of the main areas of focus from a regulatory perspective include: (1) establishing acceptable performance requirements for the long-term behavior of DBD; (2) determining acceptable borehole abandonment criteria; (3) establishing retrievability requirements; (4) developing a consensus on the factor of safety (FoS) for the emplacement mode and WP; and (5) establishing safety and safeguards performance requirements for DBD. Although conservative requirements have been utilized to provide the foundation for the conceptual design of DBD, regulatory requirements and feedback are necessary to confirm recommendations made herein and to ensure the long-term performance of DBD is acceptable. The combination of the M2 report and this SRR is intended to facilitate the completion of the conceptual design for DBD for the Cs and Sr capsules and calcined waste forms. Using the conceptual design, preliminary design activities (the second stage of a three-stage process described in the M2 report) can proceed and the DBFT utilized to support, demonstrate, and confirm engineering elements of this design.
No Longer Conveyor but Creator: Developing an Epistemology of the World Wide Web.
ERIC Educational Resources Information Center
Trombley, Laura E. Skandera; Flanagan, William G.
2001-01-01
Discusses the impact of the World Wide Web in terms of epistemology. Topics include technological innovations, including new dimensions of virtuality; the accessibility of information; tracking Web use via cookies; how the Web transforms the process of learning and knowing; linking information sources; and the Web as an information delivery…
ERIC Educational Resources Information Center
Chen, Gwo-Dong; Liu, Chen-Chung; Ou, Kuo-Liang; Liu, Baw-Jhiune
2000-01-01
Discusses the use of Web logs to record student behavior that can assist teachers in assessing performance and making curriculum decisions for distance learning students who are using Web-based learning systems. Adopts decision tree and data cube information processing methodologies for developing more effective pedagogical strategies. (LRW)
ERIC Educational Resources Information Center
Moallem, Mahnaz
2001-01-01
Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…
Web-based interactive 2D/3D medical image processing and visualization software.
Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid
2010-05-01
There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access the medical image processing functionalities wherever the internet is available. In this paper, we introduce a pure web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To match the extendibility of current local medical image processing software, each layer is highly independent of the other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web-user-interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources on the client side. The user-interface is designed such that the users can select appropriate parameters for practical research and clinical studies. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
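The algorithm layer's independence from the other layers means a server-side routine sees only image data and parameters. As an illustrative stand-in (the paper implements its methods with open-source imaging libraries), a threshold segmentation over a 2D slice might look like:

```python
def segment(slice2d, threshold):
    """Binary threshold segmentation: 1 where the voxel value meets the
    threshold, 0 elsewhere. slice2d is a list of rows of intensities."""
    return [[1 if v >= threshold else 0 for v in row] for row in slice2d]

# Hypothetical 2x3 slice of intensity values, as the algorithm layer would
# receive it from the server communication layer
slice2d = [[0, 10, 200], [30, 180, 90]]
mask = segment(slice2d, 100)
```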
NASA Astrophysics Data System (ADS)
Dragone, G. N.; Bologna, M.; Gimenez, M. E.; Alvarez, O.; Lince Klinger, F. G.; Correa-Otto, S.; Ussami, N.
2017-12-01
The Paraná Magmatic Province (PMP) together with the Etendeka Province (EP) in Africa is one of the Earth's largest igneous provinces, originated prior to the Western Gondwanaland break-up and the inception of the South Atlantic Ocean in the Lower Cretaceous. Geochemical data of PMP-EP basalts collected since the late 1980s indicate the origin of PMP-EP by melting of a heterogeneous and enriched subcontinental lithospheric mantle with a fast rate of eruption (< 3 My). The geodynamical cause of this magmatism is still a matter of debate (deep mantle plume vs. plate model). New isotopic geochemical data from Re-Os systematics (Rocha-Jr et al., 2012, EPSL) of PMP basalts indicate a metasomatized asthenospheric mantle component probably generated at the mantle wedge between the PMP-EP lithosphere and the subducting oceanic plate. A combined seismic velocity and density model of PMP by Chaves et al. (2016, G3) indicates high velocity and a density increase of the PMP ancient lithosphere, interpreted as due to a long-term mantle refertilization process. To investigate the role of the subduction zones in the development of both the Paraná basin subsidence and the magmatic province, we present the results of regional-scale broad-band magnetotelluric (MT) soundings across the western and southern borders of the PMP, the Western Paraná suture zone (WPS in Fig. 1). We discuss the electrical properties of the lithosphere along three MT profiles across the WPS. Profile MT-A (Padilha et al., 2015, JGR) extends from the Rio Apa craton towards the center of the PMP (high-TiO2 basalts). Profile MT-B extends from the Tebicuary craton towards the center of the PMP (low-TiO2) and profile MT-C extends from the Rio de la Plata craton towards the southern PMP (low- and high-TiO2). All profiles show a resistive (~10^4 ohm·m) and thick (> 150 km) lithosphere in the cratonic areas, whereas the electrical lithosphere is thinner (<100 km) with alternating high and low resistivities within the PMP. 
Vertically elongated, high electrical conductivity anomalies (~10 ohm·m) centered at 40 km depth occur along the -30 mGal contour line in the three profiles, and are interpreted as the location of the suture and former subduction zone. We will discuss the correlation between geochemical and petrological characteristics of the basalts and the electrical properties of the lithospheric mantle underneath.
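The resistivity contrast between craton and anomaly translates directly into sounding depth through the standard magnetotelluric skin-depth approximation δ ≈ 503·√(ρ/f) (δ in meters, ρ in ohm·m, f in Hz). The frequency chosen below is illustrative:

```python
import math

def skin_depth_m(resistivity_ohm_m, frequency_hz):
    # Standard MT approximation: delta ~ 503 * sqrt(rho / f), in meters
    return 503.0 * math.sqrt(resistivity_ohm_m / frequency_hz)

# At a 100 s period (f = 0.01 Hz), a ~10 ohm·m conductor is sampled at
# roughly 16 km depth, while a ~10^4 ohm·m craton is penetrated far deeper
delta_anomaly = skin_depth_m(10, 0.01)
delta_craton = skin_depth_m(1e4, 0.01)
```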
Judging nursing information on the WWW: a theoretical understanding.
Cader, Raffik; Campbell, Steve; Watson, Don
2009-09-01
This paper is a report of a study of the judgement processes nurses use when evaluating World Wide Web information related to nursing practice. The World Wide Web has increased the global accessibility of online health information. However, the variable nature of the quality of World Wide Web information and its perceived level of reliability may lead to misinformation. This makes demands on healthcare professionals, and on nurses in particular, to ensure that health information of reliable quality is selected for use in practice. A grounded theory approach was adopted. Semi-structured interviews and focus groups were used to collect data, between 2004 and 2005, from 20 nurses undertaking a postqualification graduate course at a university and 13 nurses from a local hospital in the United Kingdom. A theoretical framework emerged that gave insight into the judgement process nurses use when evaluating World Wide Web information. Participants broke the judgement process down into specific tasks. In addition, they used tacit, process and propositional knowledge and intuition, quasi-rational cognition and analysis to undertake these tasks. World Wide Web information cues, time available and nurses' critical skills were influencing factors in their judgement process. Addressing the issue of quality and reliability associated with World Wide Web information is a global challenge. This theoretical framework could contribute towards meeting this challenge.
An Architecture for Autonomic Web Service Process Planning
NASA Astrophysics Data System (ADS)
Moore, Colm; Xue Wang, Ming; Pahl, Claus
Web service composition is a technology that has received considerable attention in recent years. Languages and tools to aid in the process of creating composite Web services have received specific attention. Web service composition is the process of linking single Web services together in order to accomplish more complex tasks. One area of Web service composition that has not received as much attention is dynamic error handling and re-planning, enabling autonomic composition. Given a repository of service descriptions and a task to complete, it is possible for AI planners to automatically create a plan that will achieve this goal. If, however, a service in the plan is unavailable or erroneous, the plan will fail. Motivated by this problem, this paper suggests autonomous re-planning as a means to overcome dynamic problems. Our solution involves automatically recovering from faults and creating a context-dependent alternate plan. We present an architecture that serves as a basis for the central activities of autonomous composition, monitoring, and fault handling.
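The fault-handling and re-planning loop can be sketched as substitution from a service repository keyed by capability. The service and capability names are hypothetical; a full implementation would invoke an AI planner rather than pick the first alternative:

```python
# Hypothetical repository: capability -> services able to provide it
repository = {
    "geocode": ["GeoA", "GeoB"],
    "route":   ["RouteA"],
}

def execute(plan, broken):
    """Run a plan (list of (capability, service) steps); on a faulty
    service, re-plan by substituting an alternative for that capability."""
    done = []
    for capability, service in plan:
        if service in broken:                      # fault detected at runtime
            alternatives = [s for s in repository[capability]
                            if s not in broken and s != service]
            if not alternatives:
                raise RuntimeError("no alternate plan for " + capability)
            service = alternatives[0]              # context-dependent re-plan
        done.append(service)
    return done

plan = [("geocode", "GeoA"), ("route", "RouteA")]
result = execute(plan, broken={"GeoA"})
```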
Service-based analysis of biological pathways
Zheng, George; Bouguettaya, Athman
2009-01-01
Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using our simulation algorithm, also described in this paper. PMID:19796403
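Pathway identification over service models reduces to path search in a graph. The toy network below is an assumption for illustration; PathExplorer's own algorithms add hint-based subgraph highlighting and simulation on top of this kind of traversal:

```python
from collections import deque

# Assumed representation: a pathway network as an adjacency list linking
# process service models (node names are illustrative)
network = {
    "glucose": ["glycolysis"],
    "glycolysis": ["pyruvate"],
    "pyruvate": ["citric_acid_cycle"],
    "citric_acid_cycle": [],
}

def find_pathway(graph, start, goal):
    """Breadth-first search for a pathway linking two process models."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```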
He, Longjun; Xu, Lang; Ming, Xing; Liu, Qian
2015-02-01
Three-dimensional post-processing operations on the volume data generated by a series of CT or MR images are of great significance for image reading and diagnosis. As a part of the DICOM standard, the WADO service defines how to access DICOM objects on the Web, but it does not cover three-dimensional post-processing operations on the series images. This paper analyzes the technical features of three-dimensional post-processing operations on volume data, and then designs and implements a web service system for three-dimensional post-processing of medical images based on the WADO protocol. In order to improve the scalability of the proposed system, the business tasks and calculation operations were separated into two modules. As a result, it was shown that the proposed system could support three-dimensional post-processing of medical images for multiple clients at the same time, which meets the demand for accessing three-dimensional post-processing operations on volume data on the web.
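WADO-URI requests, which the proposed system builds on, address a DICOM object by study, series, and object UID. The server address and UIDs below are placeholders:

```python
from urllib.parse import urlencode

def wado_uri(base, study_uid, series_uid, object_uid,
             content_type="application/dicom"):
    """Build a WADO-URI retrieval request (DICOM PS3.18)."""
    params = {
        "requestType": "WADO",     # fixed value required by WADO-URI
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,
    }
    return base + "?" + urlencode(params)

# Placeholder server and UIDs, for illustration only
url = wado_uri("http://pacs.example.org/wado", "1.2.3", "1.2.3.4", "1.2.3.4.5")
```

A 3D post-processing service would fetch each instance of the series this way before handing the assembled volume to its calculation module.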
NASA Astrophysics Data System (ADS)
Buszko, Marian L.; Buszko, Dominik; Wang, Daniel C.
1998-04-01
A custom-written Common Gateway Interface (CGI) program for remote control of an NMR spectrometer using a World Wide Web browser has been described. The program, running on a UNIX workstation, uses multiple processes to handle concurrent tasks of interacting with the user and with the spectrometer. The program's parent process communicates with the browser and sends out commands to the spectrometer; the child process is mainly responsible for data acquisition. Communication between the processes is via the shared memory mechanism. The WWW pages that have been developed for the system make use of the frames feature of web browsers. The CGI program provides an intuitive user interface to the NMR spectrometer, making, in effect, a complex system an easy-to-use Web appliance.
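The parent/child split with shared memory can be sketched with Python's multiprocessing module (the original is a CGI program on a UNIX workstation; the fork start method below assumes a Unix host, and the five-point acquisition is illustrative):

```python
import multiprocessing

ctx = multiprocessing.get_context("fork")  # Unix fork, as in the CGI program
progress = ctx.Value("i", 0)               # shared-memory counter

def acquire(points_done):
    # Child process: simulate acquiring five data points, reporting
    # progress through the memory shared with the parent
    for i in range(5):
        with points_done.get_lock():
            points_done.value = i + 1

# Parent process: would relay spectrometer commands and browser status
# while the child acquires data
child = ctx.Process(target=acquire, args=(progress,))
child.start()
child.join()
```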
Soil food web properties explain ecosystem services across European land use systems.
de Vries, Franciska T; Thébault, Elisa; Liiri, Mira; Birkhofer, Klaus; Tsiafouli, Maria A; Bjørnlund, Lisa; Bracht Jørgensen, Helene; Brady, Mark Vincent; Christensen, Søren; de Ruiter, Peter C; d'Hertefeldt, Tina; Frouz, Jan; Hedlund, Katarina; Hemerik, Lia; Hol, W H Gera; Hotes, Stefan; Mortimer, Simon R; Setälä, Heikki; Sgardelis, Stefanos P; Uteseny, Karoline; van der Putten, Wim H; Wolters, Volkmar; Bardgett, Richard D
2013-08-27
Intensive land use reduces the diversity and abundance of many soil biota, with consequences for the processes that they govern and the ecosystem services that these processes underpin. Relationships between soil biota and ecosystem processes have mostly been found in laboratory experiments and rarely are found in the field. Here, we quantified, across four countries of contrasting climatic and soil conditions in Europe, how differences in soil food web composition resulting from land use systems (intensive wheat rotation, extensive rotation, and permanent grassland) influence the functioning of soils and the ecosystem services that they deliver. Intensive wheat rotation consistently reduced the biomass of all components of the soil food web across all countries. Soil food web properties strongly and consistently predicted processes of C and N cycling across land use systems and geographic locations, and they were a better predictor of these processes than land use. Processes of carbon loss increased with soil food web properties that correlated with soil C content, such as earthworm biomass and fungal/bacterial energy channel ratio, and were greatest in permanent grassland. In contrast, processes of N cycling were explained by soil food web properties independent of land use, such as arbuscular mycorrhizal fungi and bacterial channel biomass. Our quantification of the contribution of soil organisms to processes of C and N cycling across land use systems and geographic locations shows that soil biota need to be included in C and N cycling models and highlights the need to map and conserve soil biodiversity across the world.
ERIC Educational Resources Information Center
Liu, Chen-Chung; Don, Ping-Hsing; Chung, Chen-Wei; Lin, Shao-Jun; Chen, Gwo-Dong; Liu, Baw-Jhiune
2010-01-01
While Web discovery is usually undertaken as a solitary activity, Web co-discovery may transform Web learning activities from the isolated individual search process into interactive and collaborative knowledge exploration. Recent studies have proposed Web co-search environments on a single computer, supported by multiple one-to-one technologies.…
Consumer trophic diversity as a fundamental mechanism linking predation and ecosystem functioning.
Hines, Jes; Gessner, Mark O
2012-11-01
1. Primary production and decomposition, two fundamental processes determining the functioning of ecosystems, may be sensitive to changes in biodiversity and food web interactions. 2. The impacts of food web interactions on ecosystem functioning are generally quantified by experimentally decoupling these linked processes and examining either primary production-based (green) or decomposition-based (brown) food webs in isolation. This decoupling may strongly limit our ability to assess the importance of food web interactions on ecosystem processes. 3. To evaluate how consumer trophic diversity mediates predator effects on ecosystem functioning, we conducted a mesocosm experiment and a field study using an assemblage of invertebrates that naturally co-occur on North Atlantic coastal saltmarshes. We measured the indirect impact of predation on primary production and leaf decomposition as a result of prey communities composed of herbivores alone, detritivores alone or both prey in combination. 4. We find that primary consumers can influence ecosystem process rates not only within, but also across green and brown sub-webs. Moreover, by feeding on a functionally diverse consumer assemblage comprised of both herbivores and detritivores, generalist predators can diffuse consumer effects on decomposition, primary production and feedbacks between the two processes. 5. These results indicate that maintaining functional diversity among primary consumers can alter the consequences of traditional trophic cascades, and they emphasize the role of the detritus-based sub-web when seeking key biotic drivers of plant production. Clearly, traditional compartmentalization of empirical food webs can limit our ability to predict the influence of food web interactions on ecosystem functioning. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.
Katzman, G L; Morris, D; Lauman, J; Cochella, C; Goede, P; Harnsberger, H R
2001-06-01
To foster a community-supported evaluation process for open-source digital teaching file (DTF) development and maintenance. The mechanisms used to support this process will include standard web browsers, web servers, forum software, and custom additions to the forum software to potentially enable a mediated voting protocol. The web server will also serve as a focal point for beta and release software distribution, which is the desired end goal of this process. We foresee that www.mdtf.org will provide for widespread distribution of open-source DTF software that will include function and interface design decisions from community participation on the website forums.
WebGL and web audio software lightweight components for multimedia education
NASA Astrophysics Data System (ADS)
Chang, Xin; Yuksel, Kivanc; Skarbek, Władysław
2017-08-01
The paper presents the results of our recent work on the development of the contemporary computing platform DC2 for multimedia education using WebGL and Web Audio, the W3C standards. Using the literate programming paradigm, the WEBSA educational tools were developed. They offer the user (student) access to an expandable collection of WebGL shaders and Web Audio scripts. The unique feature of DC2 is the option of literate programming, offered to both the author and the reader in order to improve the interactivity of lightweight WebGL and Web Audio components. For instance, users can define: source audio nodes, including synthetic sources; destination audio nodes; and nodes for audio processing such as sound wave shaping, spectral band filtering, and convolution-based modification. In the case of WebGL, besides classic graphics effects based on mesh and fractal definitions, novel image processing and analysis by shaders is offered, such as nonlinear filtering, histograms of gradients, and Bayesian classifiers.
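Web Audio itself is a browser-side W3C API, but the node-graph idea the abstract describes (a source node feeding processing nodes such as a wave shaper) can be sketched in a language-neutral way. The Python sketch below is illustrative only; all function names are invented and are not part of DC2 or WEBSA.

```python
import math

# Hypothetical sketch of a Web Audio-style node chain:
# a synthetic source node feeds a wave-shaping (nonlinear) node.

def sine_source(freq_hz, sample_rate, n_samples):
    """Synthetic source node: a pure sine tone."""
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n_samples)]

def waveshaper(samples, curve=math.tanh):
    """Wave-shaping node: push samples through a nonlinear transfer curve."""
    return [curve(3.0 * s) for s in samples]

def chain(samples, *nodes):
    """Destination side: run samples through each node in order."""
    for node in nodes:
        samples = node(samples)
    return samples

tone = sine_source(440.0, 44100, 1024)
shaped = chain(tone, waveshaper)  # soft-clipped version of the tone
```

The same chaining pattern extends to spectral filtering or convolution nodes by adding further functions to the chain.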
NASA Astrophysics Data System (ADS)
Yoon, Dai Geon; Chin, Byung Doo; Bail, Robert
2017-03-01
A convenient process for fabricating a transparent conducting electrode on a flexible substrate is essential for numerous low-cost optoelectronic devices, including organic solar cells (OSCs), touch sensors, and free-form lighting applications. Solution-processed metal-nanowire arrays are attractive due to their low sheet resistance and optical clarity. However, the limited conductance at wire junctions and the rough surface topology still need improvement. Here, we present a facile process of electrohydrodynamic spinning using a silver (Ag)-polymer composite paste with high viscosity. Unlike the metal-nanofiber web formed by conventional electrospinning, a relatively thick, but still invisible to the naked eye, Ag-web random pattern was formed on a glass substrate. The process parameters, such as the nozzle diameter, voltage, flow rate, standoff height, and nozzle-scanning speed, were systematically engineered. The formed random-texture Ag webs were embedded in a flexible substrate by in-situ photo-polymerization, release from the glass substrate, and post-annealing. OSCs with a donor-acceptor polymeric heterojunction photoactive layer were prepared on the Ag-web-embedded flexible films with various Ag-web densities. The short-circuit current and the power conversion efficiency of an OSC with a Ag-web-embedded electrode were not as high as those of the control sample with an indium-tin-oxide electrode. However, the Ag-web textures embedded in the OSC served well as electrodes when bent (6-mm radius), showing a power conversion efficiency of 2.06% (2.72% for the flat OSC), and the electrical stability of the Ag-web-textured patterns was maintained for up to 1,000 cycles of bending.
Web-based data collection: detailed methods of a questionnaire and data gathering tool
Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R
2006-01-01
There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and to describe the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes data instrument design, data entry and management, and data tables needed to store the results that attempt to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556
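The crosstab reports the abstract mentions can be illustrated with a minimal sketch. The field names and records below are invented, not taken from the Starr County questionnaire.

```python
from collections import Counter

# Hypothetical sketch: build a simple crosstab report from
# collected questionnaire records (variable names are invented).

records = [
    {"works": "yes", "health": "good"},
    {"works": "yes", "health": "fair"},
    {"works": "no",  "health": "good"},
    {"works": "yes", "health": "good"},
]

def crosstab(rows, var_a, var_b):
    """Count co-occurrences of two categorical variables."""
    return Counter((r[var_a], r[var_b]) for r in rows)

table = crosstab(records, "works", "health")
# e.g. table[("yes", "good")] == 2
```

A real system would render this counter as an HTML table and also emit the coding manual and statistical-package input templates from the same record definitions.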
ERIC Educational Resources Information Center
Snider, Jean; Martin, Florence
2012-01-01
Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…
Stable Isotope Tracers of Process in Great Lakes Food Webs
Stable isotope analyses of biota are now commonly used to discern trophic pathways between consumers and their foods. However, those same isotope data also hold information about processes that influence the physicochemical setting of food webs as well as biological processes ope...
Teachers' Attitudes Toward WebQuests as a Method of Teaching
ERIC Educational Resources Information Center
Perkins, Robert; McKnight, Margaret L.
2005-01-01
One of the latest uses of technology gaining popular status in education is the WebQuest, a process that involves students using the World Wide Web to solve a problem. The goals of this project are to: (a) determine if teachers are using WebQuests in their classrooms; (b) ascertain whether teachers feel WebQuests are effective for teaching…
An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling.
Devi, R Suganya; Manjula, D; Siddharth, R K
2015-01-01
Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web which has to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of Depth First Search Algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and its metadata such as title, keywords, and description are extracted. This content is very essential for any type of analyser work to be carried on the Big Data obtained as a result of Web Crawling.
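The two steps the abstract names (a modified depth-first scan of hyperlinks, then extraction of title, keywords, and description from the page source) can be sketched as follows. To stay self-contained, the sketch crawls a hypothetical in-memory site instead of issuing live HTTP requests; the miniature page set and helper names are invented, not the authors' implementation.

```python
from html.parser import HTMLParser

# Hypothetical miniature web: URL -> (outgoing links, page source).
PAGES = {
    "a": (["b", "c"], "<title>Page A</title>"),
    "b": (["c"], '<title>Page B</title><meta name="keywords" content="big data">'),
    "c": ([], '<meta name="description" content="leaf page">'),
}

def dfs_crawl(start):
    """Depth-first traversal of hyperlinks, avoiding revisits."""
    seen, stack, order = set(), [start], []
    while stack:
        url = stack.pop()
        if url in seen:
            continue
        seen.add(url)
        order.append(url)
        links, _ = PAGES[url]
        stack.extend(reversed(links))  # preserve left-to-right DFS order
    return order

class MetaExtractor(HTMLParser):
    """Pull title, keywords, and description out of the page source."""
    def __init__(self):
        super().__init__()
        self.meta, self._in_title = {}, False
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name") in ("keywords", "description"):
            self.meta[a["name"]] = a.get("content", "")
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
    def handle_data(self, data):
        if self._in_title:
            self.meta["title"] = data

def crawl_with_metadata(start):
    """Crawl links, then attach each page's extracted metadata."""
    out = {}
    for url in dfs_crawl(start):
        parser = MetaExtractor()
        parser.feed(PAGES[url][1])
        out[url] = parser.meta
    return out

index = crawl_with_metadata("a")
```

Swapping the in-memory lookup for a real fetch (and link extraction from anchor tags) turns the same skeleton into a working crawler.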
Going, going, still there: using the WebCite service to permanently archive cited web pages.
Eysenbach, Gunther; Trudel, Mathieu
2005-12-30
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information including the WebCite link on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. Finally, WebCite can process publisher submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. 
The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics.
A web-based solution for 3D medical image visualization
NASA Astrophysics Data System (ADS)
Hou, Xiaoshuai; Sun, Jianyong; Zhang, Jianguo
2015-03-01
In this presentation, we present a web-based 3D medical image visualization solution which enables interactive large medical image data processing and visualization over the web platform. To improve the efficiency of our solution, we adopt GPU accelerated techniques to process images on the server side while rapidly transferring images to the HTML5 supported web browser on the client side. Compared to traditional local visualization solution, our solution doesn't require the users to install extra software or download the whole volume dataset from PACS server. By designing this web-based solution, it is feasible for users to access the 3D medical image visualization service wherever the internet is available.
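The server-side idea (process the volume close to the data and ship only a small 2-D image to the browser) can be illustrated with a minimal sketch. The maximum-intensity projection below is a generic stand-in for the paper's GPU-accelerated rendering, and the tiny volume is invented.

```python
# Sketch: render on the server, send a small 2-D image to the client,
# instead of transferring the whole volume dataset.

def max_intensity_projection(volume):
    """Collapse a z,y,x volume into a y,x image by taking the max over z."""
    depth, height, width = len(volume), len(volume[0]), len(volume[0][0])
    return [[max(volume[z][y][x] for z in range(depth))
             for x in range(width)] for y in range(height)]

volume = [[[0, 1], [2, 3]],
          [[4, 0], [1, 5]]]                # two 2x2 slices
image = max_intensity_projection(volume)   # -> [[4, 1], [2, 5]]
```

Only `image` (a few kilobytes for realistic sizes) would cross the network to the HTML5 client, while the volume stays on the server.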
2017-02-01
Algorithms are made into client applications that can be accessed from an image processing web service developed following Representational State Transfer (REST) standards by a mobile app, laptop PC, and other devices. Similarly, weather tweets can be accessed via the Weather Digest Web Service.
ERIC Educational Resources Information Center
Chou, Pao-Nan; Chang, Chi-Cheng
2011-01-01
This study examines the effects of reflection category and reflection quality on learning outcomes during Web-based portfolio assessment process. Experimental subjects consist of forty-five eight-grade students in a "Computer Application" course. Through the Web-based portfolio assessment system, these students write reflection, and join…
Buszko; Buszko; Wang
1998-04-01
A custom-written Common Gateway Interface (CGI) program for remote control of an NMR spectrometer using a World Wide Web browser has been described. The program, running on a UNIX workstation, uses multiple processes to handle concurrent tasks of interacting with the user and with the spectrometer. The program's parent process communicates with the browser and sends out commands to the spectrometer; the child process is mainly responsible for data acquisition. Communication between the processes is via the shared memory mechanism. The WWW pages that have been developed for the system make use of the frames feature of web browsers. The CGI program provides an intuitive user interface to the NMR spectrometer, making, in effect, a complex system an easy-to-use Web appliance. Copyright 1998 Academic Press.
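The parent/child division of labor described above can be sketched with Python's multiprocessing in place of a forked CGI program using System V shared memory. This is an analogy, not the authors' code; it assumes a Unix host so the "fork" start method is available, and the data values are invented.

```python
import multiprocessing as mp

# Sketch: parent handles the browser/spectrometer while a child
# process deposits acquired data into a shared-memory buffer.
# Assumes Unix (the "fork" start method).

ctx = mp.get_context("fork")

def acquire(shared, n):
    """Child process: fill the shared buffer with 'acquired' points."""
    for i in range(n):
        shared[i] = float(i * i)   # stand-in for digitized NMR data

buf = ctx.Array("d", 8)            # shared-memory array, zero-initialized
child = ctx.Process(target=acquire, args=(buf, 8))
child.start()                      # child acquires data...
# ...while the parent would stay free to serve browser requests here.
child.join()
data = list(buf)                   # parent reads the results back out
```

The shared array plays the role of the CGI program's shared memory segment: both processes see the same buffer without copying data between them.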
A study of an adaptive replication framework for orchestrated composite web services.
Mohamed, Marwa F; Elyamany, Hany F; Nassar, Hamed M
2013-01-01
Replication is considered one of the most important techniques to improve the Quality of Services (QoS) of published Web Services. It has achieved impressive success in managing resource sharing and usage in order to moderate the energy consumed in IT environments. For a robust and successful replication process, attention should be paid to suitable time as well as the constraints and capabilities in which the process runs. The replication process is time-consuming since outsourcing some new replicas into other hosts is lengthy. Furthermore, nowadays, most of the business processes that might be implemented over the Web are composed of multiple Web services working together in two main styles: Orchestration and Choreography. Accomplishing a replication over such business processes is another challenge due to the complexity and flexibility involved. In this paper, we present an adaptive replication framework for regular and orchestrated composite Web services. The suggested framework includes a number of components for detecting unexpected and unhappy events that might occur when consuming the original published web services including failure or overloading. It also includes a specific replication controller to manage the replication process and select the best host that would encapsulate a new replica. In addition, it includes a component for predicting the incoming load in order to decrease the time needed for outsourcing new replicas, enhancing the performance greatly. A simulation environment has been created to measure the performance of the suggested framework. The results indicate that adaptive replication with prediction scenario is the best option for enhancing the performance of the replication process in an online business environment.
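The controller's core decision (predict the incoming load, then provision replicas before the service overloads) can be sketched as follows. The moving-average predictor, the capacity threshold, and all names are invented stand-ins, since the abstract does not specify the framework's prediction model.

```python
# Hypothetical sketch of a replication controller's decision step:
# predict load from recent samples and add replicas ahead of time.

def predict_load(history, window=3):
    """Naive predictor: moving average of the most recent samples."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def replicas_needed(history, capacity_per_replica, current=1):
    """How many replicas should be running to absorb the predicted load."""
    predicted = predict_load(history)
    # Ceiling division: one replica per full capacity unit of load.
    return max(current, -(-int(predicted) // capacity_per_replica))

# Rising request rates -> the controller provisions a second replica
# before the original service overloads.
load = [40, 55, 70, 90, 120]
n = replicas_needed(load, capacity_per_replica=50)   # -> 2
```

Because the new replica is requested while load is still climbing, the lengthy outsourcing step overlaps with normal operation instead of starting after a failure or overload is observed.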
Work of the Web Weavers: Web Development in Academic Libraries
ERIC Educational Resources Information Center
Bundza, Maira; Vander Meer, Patricia Fravel; Perez-Stable, Maria A.
2009-01-01
Although the library's Web site has become a standard tool for seeking information and conducting research in academic institutions, there are a variety of ways libraries approach the often challenging--and sometimes daunting--process of Web site development and maintenance. Three librarians at Western Michigan University explored issues related…
Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective
ERIC Educational Resources Information Center
Hadjerrouit, Said
2005-01-01
In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-16
...: Exchange Programs Alumni Web Site Registration ACTION: Notice of request for public comment and submission... Information Collection: Exchange Programs Alumni Web site Registration. OMB Control Number: 1405-0192. Type of... proposed collection: The International Exchange Alumni Web site requires information to process users...
The New Web-Based Hera Data Processing System at the HEASARC
NASA Technical Reports Server (NTRS)
Pence, W.
2011-01-01
The HEASARC at NASA/GSFC has provided an on-line astronomical data processing system called Hera for several years. Hera provides a complete data processing environment, including installed software packages, local data storage, and the CPU resources needed to process the user's data. The original design of Hera, however, has two requirements that have limited its usefulness for some users, namely: 1) the user must download and install a small helper program on their own computer before using Hera, and 2) Hera requires that several computer ports/sockets be allowed to communicate through any local firewalls on the user's machine. Both of these restrictions can be problematic for some users; therefore, we are now migrating Hera into a purely Web-based environment which only requires a standard Web browser. The first release of Web Hera is now publicly available at http://heasarc.gsfc.nasa.gov/webheara/. It currently provides a standard graphical interface for running hundreds of different data processing programs that are available in the HEASARC's ftools software package. Over the next year we plan to add more features to Web Hera, including an interactive command line interface and more display capabilities.
Kavanaugh, A; Gladman, D; van der Heijde, D; Purcaru, O; Mease, P
2015-01-01
Objectives To evaluate the effect of certolizumab pegol (CZP) on productivity outside and within the home, and on participation in family, social and leisure activities in adult patients with psoriatic arthritis (PsA). Methods RAPID-PsA (NCT01087788) is a phase 3, double-blind, placebo-controlled trial. 409 patients with active PsA were randomised 1:1:1 to placebo, CZP 200 mg every 2 weeks (Q2W) or CZP 400 mg every 4 weeks (Q4W). The arthritis-specific Work Productivity Survey (WPS) assessed the impact of PsA on paid work and household productivity, and participation in social activities during the preceding month. WPS responses were compared between treatment arms using a non-parametric bootstrap-t method. Results At baseline, 56.6%, 60.1% and 61.5% of placebo, CZP 200 mg Q2W and CZP 400 mg Q4W patients were employed. By week 24, employed CZP patients reported an average of 1.0–1.8 and 3.0–3.9 fewer days of absenteeism and presenteeism, respectively, per month compared with 1.0 and 0.3 fewer days for placebo patients (p<0.05). Within the home, by week 24, CZP patients reported an average of 3.0–3.5 household work days gained per month versus 1.0 day for placebo (p<0.05). CZP patients also reported fewer days with reduced household productivity or days lost for participation in family, social and leisure activities. Improvements with CZP were seen as early as week 4 and continued to week 24. Conclusions CZP treatment significantly improved productivity at paid work and within the home, and resulted in greater participation in social activities for PsA patients. Trial registration number NCT01087788. PMID:24942382
A critical role for the regulation of Syk from agglutination to aggregation in human platelets.
Shih, Chun-Ho; Chiang, Tin-Bin; Wang, Wen-Jeng
2014-01-10
Agglucetin, a tetrameric glycoprotein (GP) Ibα agonist from Formosan Agkistrodon acutus venom, has been characterized as an agglutination inducer in human washed platelets (WPs). In platelet-rich plasma (PRP), agglucetin dramatically elicits a biphasic response of agglutination and subsequent aggregation. For clarifying the intracellular signaling events from agglutination to aggregation in human platelets, we examined the essential signaling molecules involved through the detection of protein tyrosine phosphorylation (PTP). In WPs, an anti-GPIbα monoclonal antibody (mAb) AP1, but not a Src kinase inhibitor PP1, completely inhibited agglucetin-induced agglutination. However, PP1 but not AP1 had a potent suppression on platelet aggregation by a GPVI activator convulxin. The PTP analyses showed agglucetin alone can cause a weak pattern involving sequential phosphorylation of Lyn/Fyn, Syk, SLP-76 and phospholipase Cγ2 (PLCγ2). Furthermore, a Syk-selective kinase inhibitor, piceatannol, significantly suppressed the aggregating response in agglucetin-activated PRP. Analyzed by flow cytometry, the binding capacity of fluorophore-conjugated PAC-1, a mAb recognizing activated integrin αIIbβ3, was shown to increase in agglucetin-stimulated platelets. Again, piceatannol but not PP1 had a concentration-dependent suppression on agglucetin-induced αIIbβ3 exposure. Moreover, the formation of signalosome, including Syk, SLP-76, VAV, adhesion and degranulation promoting adapter protein (ADAP) and PLCγ2, are required for platelet aggregation in agglucetin/fibrinogen-activated platelets. In addition, GPIbα-ligation via agglucetin can substantially promote the interactions between αIIbβ3 and fibrinogen. Therefore, the signal pathway of Lyn/Fyn/Syk/SLP-76/ADAP/VAV/PLCγ2/PKC is sufficient to trigger platelet aggregation in agglucetin/fibrinogen-pretreated platelets. 
Importantly, Syk may function as a major regulator for the response from GPIbα-initiated agglutination to integrin αIIbβ3-dependent aggregation in human platelets. Copyright © 2013 Elsevier Inc. All rights reserved.
Warren, David W.
1997-01-01
A process and an apparatus for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquified eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the use of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air to allow heat recovery at significantly higher temperature levels than attainable in evaporative dryers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna
The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, which is an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates those events to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility-generic functional requirements for eWP systems. This set of requirements will support each utility in its process of identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members, of which the largest group consists of 19 commercial U.S. nuclear utilities and eleven of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper will describe the development process as well as a summary of the requirements.
Processing biological literature with customizable Web services supporting interoperable formats.
Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia
2014-01-01
Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
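The 2-fold customization the abstract describes (pluggable analytics and pluggable input/output serializations) can be sketched with a toy pipeline. The internal document model, the reader/writer functions, and the "chemical" annotator below are invented stand-ins; they are not Argo's API, and the XML/JSON snippets only gesture at formats such as BioC.

```python
import json
import xml.etree.ElementTree as ET

# Toy pipeline: a reader parses the input format into a neutral
# document model, analytics transform it, and a writer serializes
# the result in a possibly different format.

def read_xml(text):
    """Invented XML reader -> neutral {text, annotations} model."""
    root = ET.fromstring(text)
    return {"text": root.findtext("text"),
            "annotations": [a.get("type") for a in root.iter("annotation")]}

def write_json(doc):
    """Invented JSON writer for the same neutral model."""
    return json.dumps(doc, sort_keys=True)

def annotate_chemicals(doc):
    """Stand-in analytic: tag the document when it mentions 'glucose'."""
    if "glucose" in doc["text"]:
        doc["annotations"].append("chemical")
    return doc

def pipeline(payload, reader, analytics, writer):
    """Run a configurable reader -> analytics -> writer chain."""
    doc = reader(payload)
    for step in analytics:
        doc = step(doc)
    return writer(doc)

xml_in = "<document><text>glucose uptake</text></document>"
out = pipeline(xml_in, read_xml, [annotate_chemicals], write_json)
```

Changing the interchange format then means swapping `read_xml` or `write_json` for another pair of functions, with the analytics untouched.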
A web-based tree crown condition training and evaluation tool for urban and community forestry
Matthew F. Winn; Neil A. Clark; Philip A. Araman; Sang-Mook Lee
2007-01-01
Training personnel for natural resource related field work can be a costly and time-consuming process. For that reason, web-based training is considered by many to be a more attractive alternative to on-site training. The U.S. Forest Service Southern Research Station unit, with Virginia Tech cooperators in Blacksburg, Va., is in the process of constructing a web site...
ERIC Educational Resources Information Center
Acat, M. Bahaddin; Demiral, Hilmi; Kaya, Mehmet Fatih
2016-01-01
The main purpose of this study is to measure the listening comprehension skills of 5th grade students with the help of a web-based system. This study was conducted on 5th grade students studying at the primary schools of Eskisehir. The scale used in the process of the study is the "Web Based Listening Scale". In the process of the study,…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-28
...: Exchange Programs Alumni Web Site Registration, DS-7006 ACTION: Notice of request for public comment and... Collection The Exchange Programs Alumni Web site requires information to process users' voluntary requests for participation in the Web site. Other than contact information, which is required for website...
ERIC Educational Resources Information Center
Keng, Tan Chin; Ching, Yeoh Kah
2015-01-01
The use of web applications has become a trend in many disciplines including education. In view of the influence of web application in education, this study examines web application technologies that could enhance undergraduates' learning experiences, with focus on Quantity Surveying (QS) and Information Technology (IT) undergraduates. The…
20 CFR 656.17 - Basic labor certification process.
Code of Federal Regulations, 2010 CFR
2010-04-01
... participant in the job fair. (B) Employer's Web site. The use of the employer's Web site as a recruitment... involved in the application. (C) Job search Web site other than the employer's. The use of a job search Web...) The Department of Labor may issue or require the use of certain identifying information, including...
The Adoption and Diffusion of Web Technologies into Mainstream Teaching.
ERIC Educational Resources Information Center
Hansen, Steve; Salter, Graeme
2001-01-01
Discusses various adoption and diffusion frameworks and methodologies to enhance the use of Web technologies by teaching staff. Explains the use of adopter-based models for product development; discusses the innovation-decision process; and describes PlatformWeb, a Web information system that was developed to help integrate a universities'…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
2008-05-04
This work presents the ScalaBLAST Web Application (SWA), a web based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
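The backend process mentioned above essentially translates a web-form submission into the files a cluster scheduler needs. A minimal sketch of that pattern follows; the file layout, scheduler directives, and `scalablast` command line are assumptions for illustration, not SWA's actual format:

```python
# Sketch of an SWA-like back-end step: turn a web submission into the
# files needed to queue a parallel BLAST job on a cluster.
# Paths, PBS directives, and the command name are hypothetical.
from pathlib import Path

def write_job_files(job_id, query_fasta, db_name, workdir, nodes=4):
    work = Path(workdir) / job_id
    work.mkdir(parents=True, exist_ok=True)
    (work / "query.fasta").write_text(query_fasta)
    script = (
        "#!/bin/sh\n"
        f"#PBS -l nodes={nodes}\n"
        f"scalablast -i {work / 'query.fasta'} -d {db_name} "
        f"-o {work / 'results.out'}\n"
    )
    (work / "job.sh").write_text(script)
    return work / "job.sh"
```

The web tier only ever writes files like these; the cluster side picks them up independently, which keeps the user-facing interface decoupled from the HPC resource.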
Health and medication information resources on the World Wide Web.
Grossman, Sara; Zerilli, Tina
2013-04-01
Health care practitioners have increasingly used the Internet to obtain health and medication information. The vast number of Internet Web sites providing such information and concerns with their reliability make it essential for users to carefully select and evaluate Web sites prior to use. To this end, this article reviews the general principles to consider in this process. Moreover, as cost may limit access to subscription-based health and medication information resources with established reputability, freely accessible online resources that may serve as an invaluable addition to one's reference collection are highlighted. These include government- and organization-sponsored resources (eg, US Food and Drug Administration Web site and the American Society of Health-System Pharmacists' Drug Shortage Resource Center Web site, respectively) as well as commercial Web sites (eg, Medscape, Google Scholar). Familiarity with such online resources can assist health care professionals in their ability to efficiently navigate the Web and may potentially expedite the information gathering and decision-making process, thereby improving patient care.
QoS measurement of workflow-based web service compositions using Colored Petri net.
Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra
2014-01-01
Workflow-based web service compositions (WB-WSCs) constitute one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the WB-WSC category. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide a composed web service whose quality meets their customers' requirements. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure, so that the quality degree of a given web service composition can be determined. This paper seeks a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
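The routing constructs the paper analyzes determine how per-service QoS values aggregate. As a hedged sketch of the standard aggregation rules such analyses build on (not the paper's CPN method itself): in a sequential composition, response times add and reliabilities multiply; in a parallel (AND) split, response time is that of the slowest branch while reliabilities still multiply.

```python
# Standard QoS aggregation rules for two common routing constructs.
# A simplification of what a full CPN analysis computes, for illustration.
from math import prod

def qos_sequence(services):
    """services: list of (response_time, reliability) pairs."""
    return (sum(t for t, _ in services), prod(r for _, r in services))

def qos_parallel(services):
    """AND-split/AND-join: wait for the slowest branch."""
    return (max(t for t, _ in services), prod(r for _, r in services))

print(qos_sequence([(120, 0.99), (80, 0.95)]))  # total time 200, reliability ~0.94
print(qos_parallel([(120, 0.99), (80, 0.95)]))  # total time 120, reliability ~0.94
```

Conditional (XOR) routing would instead take probability-weighted averages, which is where the paper's explicit routing constructs and probability theory come in.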
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Service has been developed. XML Web Service is a new distributed processing system built on standard Internet technologies. Through the seamless remote method invocation of XML Web Service, users can get the latest disease-code master information from their rich desktop applications or Internet web sites that refer to this service. PMID:14728364
BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.
Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel
2015-06-02
Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools behind an easy interface for input and output. Web services, due to their universal nature and widely known interface, are a very good option for achieving this goal. Bioinformatics open web services (BOWS) is a system based on generic web services that provides programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results back to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Applications registered with Bioinformatics open web services can be accessed from virtually any programming language through web services, or by using the standard Java clients. The back-end can run on HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
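The back-end interaction described above is a polling pattern: the HPC-side worker repeatedly asks the front end for new jobs, runs the registered tool, and posts results back. A minimal sketch of that loop, with the transport abstracted into callables (the endpoint names in the comments are hypothetical, not BOWS's actual API):

```python
# Minimal sketch of a BOWS-like back-end worker loop. The fetch/run/post
# callables stand in for the real web service calls.
import time

def poll_loop(fetch_job, run_tool, post_result, idle_sleep=5, max_iter=None):
    n = 0
    while max_iter is None or n < max_iter:
        n += 1
        job = fetch_job()               # e.g. GET .../backend/next-job
        if job is None:
            time.sleep(idle_sleep)      # no work: back off, then retry
            continue
        result = run_tool(job["tool"], job["params"])
        post_result(job["id"], result)  # e.g. POST .../backend/results
```

Because the worker initiates every request, the cluster needs only outbound connectivity, which is typically easier to allow through an HPC site's firewall than inbound service ports.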
A novel architecture for information retrieval system based on semantic web
NASA Astrophysics Data System (ADS)
Zhang, Hui
2011-12-01
Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so the web faces a new challenge of information overload. The challenge now before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats suitable for presentation, but machines cannot understand the meaning of the documents. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, which opens new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that when such a retrieval system does not have enough knowledge, it returns a large number of meaningless results to users. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
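The routing role of the inference engine can be illustrated with a toy dispatcher: send the query to the semantic engine only when enough of its terms map onto known ontology concepts, otherwise fall back to keyword search. The coverage threshold and the flat term/ontology model below are stand-ins for illustration, not the paper's actual engine:

```python
# Toy query router: dispatch to the semantic engine only when most query
# terms are covered by the ontology. Threshold and data model are assumptions.

def route_query(query, ontology_terms):
    terms = set(query.lower().split())
    covered = terms & ontology_terms
    # Majority coverage -> the semantic engine has enough knowledge.
    if len(covered) >= len(terms) / 2:
        return "semantic"
    return "keyword"

onto = {"protein", "gene", "disease"}
print(route_query("gene disease interaction", onto))  # semantic
print(route_query("cheap flight tickets", onto))      # keyword
```

A real inference engine would reason over an ontology graph rather than a flat term set, but the dispatch decision has the same shape.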
Accredited hand surgery fellowship Web sites: analysis of content and accessibility.
Trehan, Samir K; Morrell, Nathan T; Akelman, Edward
2015-04-01
To assess the accessibility and content of accredited hand surgery fellowship Web sites. A list of all accredited hand surgery fellowships was obtained from the online database of the American Society for Surgery of the Hand (ASSH). Fellowship program information on the ASSH Web site was recorded. All fellowship program Web sites were located via Google search. Fellowship program Web sites were analyzed for accessibility and content in 3 domains: program overview, application information/recruitment, and education. At the time of this study, there were 81 accredited hand surgery fellowships with 169 available positions. Thirty of 81 programs (37%) had a functional link on the ASSH online hand surgery fellowship directory; however, Google search identified 78 Web sites. Three programs did not have a Web site. Analysis of content revealed that most Web sites contained contact information, whereas information regarding the anticipated clinical, research, and educational experiences during fellowship was less often present. Furthermore, information regarding past and present fellows, salary, application process/requirements, call responsibilities, and case volume was frequently lacking. Overall, 52 of 81 programs (64%) had the minimal online information required for residents to independently complete the fellowship application process. Hand fellowship program Web sites could be accessed either via the ASSH online directory or Google search, except for 3 programs that did not have Web sites. Although most fellowship program Web sites contained contact information, other content such as application information/recruitment and education, was less frequently present. This study provides comparative data regarding the clinical and educational experiences outlined on hand fellowship program Web sites that are of relevance to residents, fellows, and academic hand surgeons. 
This study also draws attention to various ways in which the hand surgery fellowship application process can be made more user-friendly and efficient. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
E-Government Goes Semantic Web: How Administrations Can Transform Their Information Processes
NASA Astrophysics Data System (ADS)
Klischewski, Ralf; Ukena, Stefan
E-government applications and services are built mainly on access to, retrieval of, integration of, and delivery of relevant information to citizens, businesses, and administrative users. In order to perform such information processing automatically through the Semantic Web, machine-readable enhancements of web resources are needed, based on an understanding of the content and context of the information in focus. While these enhancements are far from trivial to produce, administrations in their role of information and service providers so far find little guidance on how to migrate their web resources and enable a new quality of information processing; even research is still seeking best practices. Therefore, the underlying research question of this chapter is: what are the appropriate approaches to guide administrations in transforming their information processes toward the Semantic Web? In search of answers, this chapter analyzes the challenges and possible solutions from the perspective of administrations: (a) the reconstruction of information processing in e-government, in terms of how semantic technologies must be employed to support information provision and consumption through the Semantic Web; (b) the required contribution to the transformation, compared to the capabilities and expectations of administrations; and (c) available experience with the steps of transformation, reviewed and discussed as to what extent they can be expected to successfully drive e-government to the Semantic Web. This research builds on the case of Schleswig-Holstein, Germany, where semantic technologies have been used within the frame of the Access-eGov project in order to semantically enhance electronic service interfaces, with the aim of providing a new way of accessing and combining e-government services.
Four-dimensional characterization of a sheet-forming web
Sari-Sarraf, Hamed; Goddard, James S.
2003-04-22
A method and apparatus are provided by which a sheet-forming web may be characterized in four dimensions. Light images of the web are recorded at a point adjacent the initial stage of the web, for example, near the headbox in a paperforming operation. The images are digitized, and the resulting data is processed by novel algorithms to provide a four-dimensional measurement of the web. The measurements include two-dimensional spatial information, the intensity profile of the web, and the depth profile of the web. These measurements can be used to characterize the web, predict its properties and monitor production events, and to analyze and quantify headbox flow dynamics.
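Of the four measurement dimensions listed above, the intensity profile is the most direct to derive from the digitized images: averaging each cross-direction column of a grayscale frame yields a basis-weight-like profile across the web. The sketch below illustrates only this step, with plain nested lists standing in for the digitized frame; the depth profile and flow-dynamics analysis rely on the patent's own algorithms and are not reproduced here:

```python
# Illustrative step toward the patent's measurements: a cross-direction
# intensity profile from one digitized frame (rows = machine direction).
# The depth-profile and headbox-flow algorithms are not shown.

def intensity_profile(image):
    """image: list of rows, each a list of pixel intensities."""
    n_rows = len(image)
    return [sum(row[c] for row in image) / n_rows
            for c in range(len(image[0]))]

frame = [[10, 20, 30],
         [14, 22, 26]]
print(intensity_profile(frame))  # [12.0, 21.0, 28.0]
```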
Designing a Web Site to Share Information with Parents
ERIC Educational Resources Information Center
Englund, Lillian White
2009-01-01
This article discusses the development and use of an on-line portfolio process. It presents a background rationale for the need and effectiveness of a communication tool that supports the use of the portfolio process throughout the education of a child with identified disabilities. The process for developing the individualized Web page is…
Going, Going, Still There: Using the WebCite Service to Permanently Archive Cited Web Pages
Trudel, Mathieu
2005-01-01
Scholars are increasingly citing electronic “web references” which are not preserved in libraries or full-text archives. WebCite is a new standard for citing web references. To “webcite” a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its “instructions for authors” accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) “prospectively” before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. In addition, WebCite can process publisher-submitted “citing articles” (submitted, for example, as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focused crawler, caching retrospectively the references of already published articles. Copyright issues are addressed by honouring the respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations.
The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics. PMID:16403724
Tsukamoto, Takafumi; Yasunaga, Takuo
2014-11-01
Eos (Extensible object-oriented system) is a powerful application suite for image processing of electron micrographs. Eos ordinarily works with only character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly: users of Eos need to be expert at image processing of electron micrographs and also need some knowledge of computer science. However, not everyone who needs Eos is an expert with a CUI. We therefore extended Eos into an OS-independent web system with graphical user interfaces (GUI) by integrating a web browser. The advantage of using a web browser is not only that it gives Eos a GUI, but also that it lets Eos work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interfaces through an interface definition file, "OptionControlFile", written in CSV (Comma-Separated Value) format; each command has an "OptionControlFile" that carries the information needed to generate its interface. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic actions of the client-side system were implemented properly and support auto-generation of web forms, with functions for execution, image preview, and file upload to a web server. Thus the system can execute Eos commands with the options unique to each command and perform image analysis.
Two problems remained: the image file format for visualization and the workspace for analysis. The image file format information is useful for checking whether an input/output file is correct, and a common workspace for analysis is needed because the client is physically separated from the server. We solved the file format problem by extending the rules of Eos's OptionControlFile. To solve the workspace problem, we developed two types of systems. The first uses only the local environment: the user runs a web server provided by Eos, accesses a web client through a web browser, and manipulates local files with the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), our platform for heterogeneous distributed environments. Users can put their resources, such as microscope images and text files, into the server-side environment supported by PIONE, and experts can write PIONE rule definitions, each of which defines a workflow of image processing. PIONE runs each image-processing step on a suitable computer, following the defined rules. PIONE supports interactive manipulation, so a user can try a command with various setting values. Here, too, we contribute auto-generation of the GUI for a PIONE workflow. As an advanced function, we developed a module that logs user actions. The logs include information such as the setting values used in image processing and the sequence of commands executed. Used effectively, these logs offer many advantages: for example, when an expert discovers some know-how of image processing, other users can share the logs containing that know-how, and by analyzing the logs we may derive recommended workflows for image analysis. To implement a social platform of image processing for electron microscopists, we have developed the system infrastructure as well. © The Author 2014. 
Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
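The auto-generation mechanism described above, which reads a per-command CSV definition and emits a web form, can be sketched as follows. The three-column layout (flag, label, field type) and the form action URL are simplified assumptions for illustration, not Eos's actual OptionControlFile schema:

```python
# Sketch of CSV-driven form generation in the spirit of Zephyr.
# The column layout (flag,label,type) is a simplified assumption.
import csv, io

def form_from_options(csv_text, command):
    fields = []
    for flag, label, ftype in csv.reader(io.StringIO(csv_text)):
        input_type = "file" if ftype == "infile" else "text"
        fields.append(f'<label>{label}</label>'
                      f'<input type="{input_type}" name="{flag}">')
    body = "\n".join(fields)
    return f'<form action="/run/{command}" method="post">\n{body}\n</form>'

opts = "-i,Input image,infile\n-o,Output file,text"
print(form_from_options(opts, "mrcImageInfo"))
```

Because every command carries its own definition file, adding a new command to the web interface requires no hand-written form code, which is the point of the mechanism.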
Knowlden, Adam P; Sharma, Manoj
2014-09-01
Family-and-home-based interventions are an important vehicle for preventing childhood obesity. Systematic process evaluations have not been routinely conducted in assessment of these interventions. The purpose of this study was to plan and conduct a process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial. The trial was composed of two web-based, mother-centered interventions for prevention of obesity in children between 4 and 6 years of age. Process evaluation used the components of program fidelity, dose delivered, dose received, context, reach, and recruitment. Categorical process evaluation data (program fidelity, dose delivered, dose exposure, and context) were assessed using Program Implementation Index (PII) values. Continuous process evaluation variables (dose satisfaction and recruitment) were assessed using ANOVA tests to evaluate mean differences between groups (experimental and control) and sessions (sessions 1 through 5). Process evaluation results found that both groups (experimental and control) were equivalent, and interventions were administered as planned. Analysis of web-based intervention process objectives requires tailoring of process evaluation models for online delivery. Dissemination of process evaluation results can advance best practices for implementing effective online health promotion programs. © 2014 Society for Public Health Education.
Warren, D.W.
1997-04-15
A process and an apparatus are disclosed for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; it is thereby differentiated from the evaporative drying techniques of the prior industrial art, which depend on the use of steam-heated cylinders to supply heat to the paper web surface and ambient air to carry away moisture evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air, allowing heat recovery at significantly higher temperature levels than attainable in evaporative dryers. 6 figs.
The Role of Virtual Reference in Library Web Site Design: A Qualitative Source for Usage Data
ERIC Educational Resources Information Center
Powers, Amanda Clay; Shedd, Julie; Hill, Clay
2011-01-01
Gathering qualitative information about usage behavior of library Web sites is a time-consuming process requiring the active participation of patron communities. Libraries that collect virtual reference transcripts, however, hold valuable data regarding how the library Web site is used that could benefit Web designers. An analysis of virtual…
Use of Web Technology to Access and Update College Plans
ERIC Educational Resources Information Center
Valeau, Edward J.; Luan, Jing
2007-01-01
In this study, the process and outcome of a web-based planning application, called Ports of Call, are discussed. The application allows college management to create, edit, and report out activities relating to college plans, all through a web browser. Its design was based on best practices in modern web technology and the application can be easily…
Users' Interaction with World Wide Web Resources: An Exploratory Study Using a Holistic Approach.
ERIC Educational Resources Information Center
Wang, Peiling; Hawk, William B.; Tenopir, Carol
2000-01-01
Presents results of a study that explores factors of user-Web interaction in finding factual information, develops a conceptual framework for studying user-Web interaction, and applies a process-tracing method for conducting holistic user-Web studies. Describes measurement techniques and proposes a model consisting of the user, interface, and the…
ERIC Educational Resources Information Center
Wood, Pamela L.; Quitadamo, Ian J.; DePaepe, James L.; Loverro, Ian
2007-01-01
The WebQuest is a four-step process integrated at appropriate points in the Animal Studies unit. Through the WebQuest, students create a series of habitat maps that build on the knowledge gained from conducting the various activities of the unit. The quest concludes with an evaluation using the WebQuest rubric and an oral presentation of a final…
ERIC Educational Resources Information Center
Karagiannis, P.; Markelis, I.; Paparrizos, K.; Samaras, N.; Sifaleras, A.
2006-01-01
This paper presents new web-based educational software (webNetPro) for "Linear Network Programming." It includes many algorithms for "Network Optimization" problems, such as shortest path problems, minimum spanning tree problems, maximum flow problems and other search algorithms. Therefore, webNetPro can assist the teaching process of courses such…
A Web Browser Interface to Manage the Searching and Organizing of Information on the Web by Learners
ERIC Educational Resources Information Center
Li, Liang-Yi; Chen, Gwo-Dong
2010-01-01
Information Gathering is a knowledge construction process. Web learners make a plan for their Information Gathering task based on their prior knowledge. The plan is evolved with new information encountered and their mental model is constructed through continuously assimilating and accommodating new information gathered from different Web pages. In…
Environmental Response Laboratory Network (ERLN) WebEDR Quick Reference Guide
The Web Electronic Data Review (WebEDR) is a web-based system that performs automated data processing on laboratory-submitted Electronic Data Deliverables (EDDs). It enables users to perform technical audits on the data and to screen it against Measurement Quality Objectives (MQOs).
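The core of such automated screening is comparing each EDD record against the MQO limit for its analyte. A minimal sketch of that check follows; the field names and limits are hypothetical, not WebEDR's actual EDD schema:

```python
# Illustrative MQO screening of EDD records. Field names ("analyte",
# "rpd" for relative percent difference) and limits are hypothetical.

def check_mqos(records, mqos):
    """records: dicts with 'analyte' and 'rpd' keys.
    mqos: {analyte: max_allowed_rpd}. Returns a list of failures."""
    failures = []
    for rec in records:
        limit = mqos.get(rec["analyte"])
        if limit is not None and rec["rpd"] > limit:
            failures.append((rec["analyte"], rec["rpd"], limit))
    return failures

edd = [{"analyte": "lead", "rpd": 12.0}, {"analyte": "arsenic", "rpd": 35.0}]
print(check_mqos(edd, {"lead": 20.0, "arsenic": 25.0}))
```

Records flagged here would be the ones surfaced to reviewers during a technical audit, rather than rejected outright.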
Clinical software development for the Web: lessons learned from the BOADICEA project
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.
Low cost silicon solar array project large area silicon sheet task: Silicon web process development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.
1977-01-01
Growth configurations were developed which produced crystals having low residual stress levels. The properties of a 106 mm diameter round crucible were evaluated and it was found that this design had greatly enhanced temperature fluctuations arising from convection in the melt. Thermal modeling efforts were directed to developing finite element models of the 106 mm round crucible and an elongated susceptor/crucible configuration. Also, the thermal model for the heat loss modes from the dendritic web was examined for guidance in reducing the thermal stress in the web. An economic analysis was prepared to evaluate the silicon web process in relation to price goals.
TreeVector: scalable, interactive, phylogenetic trees for the web.
Pethica, Ralph; Barker, Gary; Kovacs, Tim; Gough, Julian
2010-01-28
Phylogenetic trees are complex data forms that need to be graphically displayed to be human-readable. Traditional techniques of plotting phylogenetic trees focus on rendering a single static image, but increases in the production of biological data and large-scale analyses demand scalable, browsable, and interactive trees. We introduce TreeVector, a Scalable Vector Graphics-and Java-based method that allows trees to be integrated and viewed seamlessly in standard web browsers with no extra software required, and can be modified and linked using standard web technologies. There are now many bioinformatics servers and databases with a range of dynamic processes and updates to cope with the increasing volume of data. TreeVector is designed as a framework to integrate with these processes and produce user-customized phylogenies automatically. We also address the strengths of phylogenetic trees as part of a linked-in browsing process rather than an end graphic for print. TreeVector is fast and easy to use and is available to download precompiled, but is also open source. It can also be run from the web server listed below or the user's own web server. It has already been deployed on two recognized and widely used database Web sites.
Youpi: A Web-based Astronomical Image Processing Pipeline
NASA Astrophysics Data System (ADS)
Monnerville, M.; Sémah, G.
2010-12-01
Youpi stands for “YOUpi is your processing PIpeline”. It is a portable, easy-to-use web application providing high-level functionalities to perform data reduction on scientific FITS images. It is built on top of open source processing tools that are released to the community by Terapix, in order to organize your data on a computer cluster, to manage your processing jobs in real time and to facilitate teamwork by allowing fine-grained sharing of results and data. On the server side, Youpi is written in the Python programming language and uses the Django web framework. On the client side, Ajax techniques are used along with the Prototype and script.aculo.us JavaScript libraries.
ERIC Educational Resources Information Center
Rátiva Velandia, Marlén; Pedreros Torres, Andrés Leonardo; Núñez Alí, Mónica
2012-01-01
It is considered valuable to take advantage of web activities to improve and qualify the English teaching and learning processes, especially in the promotion of reading comprehension. In this article we share the process and results of a study that focused on some activities based on web materials that were designed and used with 10th grade…
Macroscopic characterisations of Web accessibility
NASA Astrophysics Data System (ADS)
Lopes, Rui; Carriço, Luis
2010-12-01
The Web Science framework poses fundamental questions on the analysis of the Web, by focusing on how microscopic properties (e.g. at the level of a Web page or Web site) emerge into macroscopic properties and phenomena. One research topic on the analysis of the Web is Web accessibility evaluation, which centres on understanding how accessible a Web page is for people with disabilities. However, when framing Web accessibility evaluation on Web Science, we have found that existing research stays at the microscopic level. This article presents an experimental study on framing Web accessibility evaluation into Web Science's goals. This study resulted in novel accessibility properties of the Web not found at microscopic levels, as well as of Web accessibility evaluation processes themselves. We observed at large scale some of the empirical knowledge on how accessibility is perceived by designers and developers, such as the disparity of interpretations of accessibility evaluation tools warnings. We also found a direct relation between accessibility quality and Web page complexity. We provide a set of guidelines for designing Web pages, education on Web accessibility, as well as on the computational limits of large-scale Web accessibility evaluations.
Connection Development: Web Lessons from Westchester.
ERIC Educational Resources Information Center
Freedman, Maurice J.
1996-01-01
Committed to utilizing information technology, the Westchester Library System (New York) made the World Wide Web publicly accessible. Describes the planning, implementation, and management process; obstacles involving financing; establishing Internet connectivity; and vendor negotiations. Westchester hired a Web manager, created Internet use…
Solution Kinetics Database on the Web
National Institute of Standards and Technology Data Gateway
SRD 40 NDRL/NIST Solution Kinetics Database on the Web (Web, free access) Data for free radical processes involving primary radicals from water, inorganic radicals and carbon-centered radicals in solution, and singlet oxygen and organic peroxyl radicals in various solvents.
A Virtual Tour of the Radio Astronomy Process
NASA Astrophysics Data System (ADS)
Conrad, S. B.; Finley, D. G.; Claussen, M. J.; Ulvestad, J. S.
2000-12-01
In the summer of 2000, two teachers working on a Master of Science Teaching degree at New Mexico Tech and participating in the Research Experience for Teachers (RET) program sponsored by the National Science Foundation spent eight weeks as interns researching and working on projects at the National Radio Astronomy Observatory (NRAO) which will directly benefit students in their classrooms and also impact other science educators. One of the products of the internships is a set of web pages for the educational section of NRAO's web site. The purpose of these web pages is to familiarize students, teachers, and others with the process that a radio astronomer goes through to do radio astronomy science. A virtual web tour was created of this process. This required interviewing radio astronomers and other professionals involved with this process at the NRAO (e.g. engineers, data analysts, and operations people), and synthesizing the interviews into a descriptive, visual-based set of web pages. These pages meet both the National and the New Mexico Standards and Benchmarks for Science Education. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc. The NSF's RET program is gratefully acknowledged.
Towards Web-based representation and processing of health information
Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J
2009-01-01
Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and the method used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format that accommodates all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, data source description, the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. 
It brought a new solution in better health data representation and initial exploration of the Web-based processing of health information. Conclusion The designed HERXML has been proven to be an appropriate solution in supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data in understanding the trends of diseases, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445
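As an illustration of how such a record might be assembled programmatically, the following Python sketch builds a hypothetical HERXML-like fragment; the element names are invented for illustration and are not the schema published by the authors.

```python
import xml.etree.ElementTree as ET

def herxml_record(activity, source, method, lat, lon):
    """Assemble a hypothetical HERXML-like record combining the semantic
    (activity, data source, method) and geometric parts described in the
    abstract. Element names here are illustrative only."""
    rec = ET.Element("healthRecord")
    sem = ET.SubElement(rec, "semantic")
    ET.SubElement(sem, "activity").text = activity
    ET.SubElement(sem, "dataSource").text = source
    ET.SubElement(sem, "method").text = method
    geo = ET.SubElement(rec, "geometric")
    pt = ET.SubElement(geo, "point")
    pt.set("lat", str(lat))
    pt.set("lon", str(lon))
    return ET.tostring(rec, encoding="unicode")

# Invented example values (a community health program location).
print(herxml_record("asthma screening", "NB Lung Association",
                    "age-standardised rate", 45.96, -66.64))
```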
49 CFR 1560.205 - Redress process.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Transportation Other Regulations Relating to Transportation (Continued) TRANSPORTATION SECURITY ADMINISTRATION... may obtain the forms and information necessary to initiate the redress process on the DHS TRIP Web... will provide the necessary forms and information to individuals through its Web site or by mail. (c...
Reliable Execution Based on CPN and Skyline Optimization for Web Service Composition
Ha, Weitao; Zhang, Guojun
2013-01-01
With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit user requirements. Web services composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which involves transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of a transactional composition Web service (TCWS) is formalized by CPN properties. To identify the services with the best QoS properties from the candidate service sets formed in the TCSW-CPN, we use skyline computation to retrieve dominant Web services. This avoids the significant information loss that occurs when individual scores are reduced to a single overall similarity score. We evaluate our approach experimentally using both real and synthetically generated datasets. PMID:23935431
Reliable execution based on CPN and skyline optimization for Web service composition.
Chen, Liping; Ha, Weitao; Zhang, Guojun
2013-01-01
With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit user requirements. Web services composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which involves transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of a transactional composition Web service (TCWS) is formalized by CPN properties. To identify the services with the best QoS properties from the candidate service sets formed in the TCSW-CPN, we use skyline computation to retrieve dominant Web services. This avoids the significant information loss that occurs when individual scores are reduced to a single overall similarity score. We evaluate our approach experimentally using both real and synthetically generated datasets.
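The skyline step described in the abstract can be sketched in a few lines of Python; the QoS attributes and service names below are hypothetical, and all attributes are assumed to be of the to-minimize kind.

```python
def dominates(a, b):
    """True if QoS vector a dominates b: no worse on every attribute and
    strictly better on at least one. All attributes are to-minimize
    (e.g. response time, cost)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(services):
    """Return the services whose QoS vectors are not dominated by any other
    candidate; these are the 'dominant' services retained for composition."""
    return {
        name: qos
        for name, qos in services.items()
        if not any(dominates(other, qos)
                   for o, other in services.items() if o != name)
    }

# Hypothetical candidates with (response_time_ms, cost) QoS vectors.
candidates = {
    "svcA": (120, 0.05),
    "svcB": (200, 0.02),
    "svcC": (150, 0.06),  # dominated by svcA (slower and costlier)
}
print(skyline(candidates))  # svcC drops out; svcA and svcB remain
```

Because the skyline keeps every non-dominated trade-off instead of collapsing attributes into one weighted score, no candidate that excels on some attribute is lost prematurely, which is the information-loss point the abstract makes.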
Synthesis of a Carbon-activated Microfiber from Spider Webs Silk
NASA Astrophysics Data System (ADS)
Taer, E.; Mustika, W. S.; Taslim, R.
2017-03-01
Carbon fiber from spider web silk has been produced through a simple carbonization process. Spider webs are a source of natural fiber that is strong, flexible and micrometer-sized. Preparation of micro carbon fiber from spider webs consists of carbonization and activation processes. Carbonization was performed in an N2 gas environment with a multi-step heating profile up to a temperature of 400 °C, while the activation process was done chemically with a KOH activating agent. Physical characterization covered the surface morphology, element content and degree of crystallinity. The measurements show that micro carbon fiber from spider webs has a diameter in the range of 0.5-25 micrometers, and that the carbon-activated microfiber is amorphous, with a carbon content of 84%.
The Interface Design and the Usability Testing of a Fossilization Web-Based Learning Environment
ERIC Educational Resources Information Center
Wang, Shiang-Kwei; Yang, Chiachi
2005-01-01
This article describes practical issues related to the design and the development of a Web-Based Learning Environment (Web-LE) for high school students. The purpose of the Fossilization Web-LE was to help students understand the process of fossilization, which is a complex phenomenon and is affected by many factors. The instructional design team…
Classroom Web Pages: A "How-To" Guide for Educators.
ERIC Educational Resources Information Center
Fehling, Eric E.
This manual provides teachers, with very little or no technology experience, with a step-by-step guide for developing the necessary skills for creating a class Web Page. The first part of the manual is devoted to the thought processes preceding the actual creation of the Web Page. These include looking at other Web Pages, deciding what should be…
AIRSAR Automated Web-based Data Processing and Distribution System
NASA Technical Reports Server (NTRS)
Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen
2005-01-01
In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
Design and evaluation of web-based image transmission and display with different protocols
NASA Astrophysics Data System (ADS)
Tan, Bin; Chen, Kuangyi; Zheng, Xichuan; Zhang, Jianguo
2011-03-01
There are many Web-based image access technologies used in the medical imaging area, such as component-based (ActiveX control) thick-client Web display, zero-footprint thin-client Web viewers (also called server-side processing Web viewers), Flash Rich Internet Application (RIA), or HTML5-based Web display. Different Web display methods perform differently in different network environments. In this presentation, we evaluate two Web-based image display systems we developed. The first is used for thin-client Web display. It works between a PACS Web server with a WADO interface and a thin client. The PACS Web server provides JPEG format images to HTML pages. The second is for thick-client Web display. It works between a PACS Web server with a WADO interface and a thick client running in browsers containing an ActiveX control, a Flash RIA program or HTML5 scripts. The PACS Web server provides native DICOM format images or a JPIP stream for these clients.
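A WADO-URI request of the kind such servers answer is a plain HTTP GET. The sketch below builds one in Python; the server address and UIDs are invented for illustration.

```python
from urllib.parse import urlencode

def wado_uri(base, study_uid, series_uid, object_uid, content_type="image/jpeg"):
    """Build a WADO-URI request URL for a single DICOM object.
    contentType image/jpeg suits thin clients rendering into HTML pages;
    application/dicom returns the native DICOM object for thick clients."""
    params = {
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,
    }
    return base + "?" + urlencode(params)

# Hypothetical server and UIDs for illustration only.
print(wado_uri("http://pacs.example.org/wado",
               "1.2.840.1", "1.2.840.1.1", "1.2.840.1.1.1"))
```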
The Importance of Process-Oriented Accessibility Guidelines for Web Developers.
Steen-Hansen, Linn; Fagernes, Siri
2016-01-01
Current accessibility research shows that in web development the process itself may lead to inaccessible web sites and applications. Common practices typically do not allow sufficient testing. The focus is mainly on complying with minimum standards and treating accessibility compliance as a sort of bug-fixing process, missing the user perspective. In addition, there is an alarming lack of knowledge of and experience with accessibility issues. It has also been argued that bringing accessibility into the development process at all stages is the only way to achieve the highest possible level of accessibility. The work presented in this paper is based on a previous project focusing on guidelines for developing accessible rich Internet applications. The guidelines were classified as either process-oriented or technology-oriented. In this paper, we examine the process-oriented guidelines and give a practical perspective on how these guidelines will make the development process more accessibility-friendly.
The role of EMODnet Chemistry in the European challenge for Good Environmental Status
NASA Astrophysics Data System (ADS)
Vinci, Matteo; Giorgetti, Alessandra; Lipizer, Marina
2017-02-01
The European Union has set the ambitious objective of reaching Good Environmental Status by 2020. The Marine Strategy Framework Directive (European Commission, 2008) provides the legislative framework that drives member states' efforts to reach it. The Integrated Maritime Policy supported the need to provide a European knowledge base able to drive sustainable development by launching in 2009 a new European Marine Observation and Data Network (EMODnet). Through a stepwise approach, EMODnet Chemistry aims to provide high-quality marine environmental data and related products at the scale of the regions and sub-regions defined by the Marine Strategy Framework Directive. The chemistry lot takes advantage of and further develops the SeaDataNet pan-European infrastructure and its distributed approach, linking together a network of more than 100 National Oceanographic Data Centres providing data from more than 500 data originators. The close interaction with EEA, RSCs, ICES and the EMODnet-MSFD coordination group facilitated the identification of the most appropriate set of information required for the MSFD process. EMODnet Chemistry provides aggregated and validated regional data collections for nutrients, dissolved gases, chlorophyll and contaminants, visualized with OGC WMS and WPS viewing services. Concentration maps with a 10-year moving window from 1960 to 2014, by season and for selected vertical layers, are computed and made available.
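A WMS viewing service of the kind mentioned above is queried with a GetMap request. The following Python sketch assembles one; the endpoint and layer name are hypothetical.

```python
from urllib.parse import urlencode

def wms_getmap(base, layer, bbox, size=(800, 600)):
    """Build an OGC WMS 1.3.0 GetMap URL for one layer.
    bbox is (min_lat, min_lon, max_lat, max_lon): WMS 1.3.0 with
    CRS=EPSG:4326 mandates latitude-first axis order."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical endpoint and layer name; the bbox roughly covers the Mediterranean.
print(wms_getmap("http://emodnet.example.org/wms",
                 "nutrients_nitrate", (30.0, -10.0, 46.0, 37.0)))
```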
A verification strategy for web services composition using enhanced stacked automata model.
Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali
2015-01-01
Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides a foundational Extensible Markup Language (XML) specification language for defining and implementing business practice workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services: one has to depend on formal verification methods to ensure the correctness of composed services. A few research works have been carried out on the verification of web services for deterministic systems. Moreover, the existing models did not address verification properties such as dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system have been evaluated based on properties such as dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into an ESAM (a combination of a Muller Automaton (MA) and a Push Down Automaton (PDA)) and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. 
The model is verified using the SPIN tool, and the results show better performance in finding dead transitions and deadlocks in contrast to the existing models.
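The verification properties named here (reachability, deadlock, dead transitions) can be illustrated with a minimal explicit-state exploration in Python. This is a toy stand-in for the ESAM/SPIN pipeline, with invented state and action names.

```python
from collections import deque

def explore(initial, transitions, finals=frozenset()):
    """Breadth-first exploration of a finite labeled transition system.
    transitions: dict state -> list of (action, next_state).
    Returns (reachable states, deadlocked states, dead transitions).
    Terminal states listed in finals are legitimate endings, not deadlocks."""
    reachable, frontier, fired = {initial}, deque([initial]), set()
    while frontier:
        s = frontier.popleft()
        for action, t in transitions.get(s, []):
            fired.add((s, action))
            if t not in reachable:
                reachable.add(t)
                frontier.append(t)
    deadlocks = {s for s in reachable
                 if not transitions.get(s) and s not in finals}
    all_trans = {(s, a) for s, outs in transitions.items() for a, _ in outs}
    dead = all_trans - fired  # transitions that can never fire
    return reachable, deadlocks, dead

# Toy composite-service model (state and action names are illustrative).
lts = {
    "init":    [("invoke", "running")],
    "running": [("reply", "done"), ("fault", "failed")],
    "failed":  [],                   # deadlock: fault has no compensation
    "orphan":  [("never", "init")],  # unreachable state -> dead transition
}
reach, dl, dead = explore("init", lts, finals={"done"})
print("deadlocks:", dl, "dead transitions:", dead)
```

Real model checkers such as SPIN explore the state space symbolically and handle the stack behavior that the PDA component of ESAM adds; this sketch only shows the property definitions.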
NASA Technical Reports Server (NTRS)
Falke, Stefan; Husar, Rudolf
2011-01-01
The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions in the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructures were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.
ERIC Educational Resources Information Center
Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.
2011-01-01
The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in RD through 5th grades. This study included a racially and…
Web service discovery among large service pools utilising semantic similarity and clustering
NASA Astrophysics Data System (ADS)
Chen, Fuzan; Li, Minqiang; Wu, Harris; Xie, Lingli
2017-03-01
With the rapid development of electronic business, Web services have attracted much attention in recent years. Enterprises can combine individual Web services to provide new value-added services. An emerging challenge is the timely discovery of close matches to service requests among large service pools. In this study, we first define a new semantic similarity measure combining functional similarity and process similarity. We then present a service discovery mechanism that utilises the new semantic similarity measure for service matching. All the published Web services are pre-grouped into functional clusters prior to the matching process. For a user's service request, the discovery mechanism first identifies matching services clusters and then identifies the best matching Web services within these matching clusters. Experimental results show that the proposed semantic discovery mechanism performs better than a conventional lexical similarity-based mechanism.
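A minimal Python sketch of the cluster-then-match discovery idea described above, using Jaccard overlap of capability keywords as a stand-in for the paper's functional similarity and a sequence ratio for process similarity; the weighting, service names and keywords are all assumptions.

```python
from difflib import SequenceMatcher

def functional_sim(a, b):
    """Jaccard similarity of capability-keyword sets (a crude stand-in
    for ontology-based functional similarity)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def process_sim(a, b):
    """Similarity of operation sequences, standing in for process similarity."""
    return SequenceMatcher(None, a, b).ratio()

def combined_sim(s, r, alpha=0.6):
    """Weighted combination; the weight alpha is an assumption, not from the paper."""
    return (alpha * functional_sim(s["keywords"], r["keywords"])
            + (1 - alpha) * process_sim(s["ops"], r["ops"]))

def discover(request, clusters):
    """Identify the best-matching functional cluster first, then rank the
    services inside it by the combined semantic similarity."""
    best_cluster = max(
        clusters.values(),
        key=lambda c: max(functional_sim(s["keywords"], request["keywords"])
                          for s in c))
    return max(best_cluster, key=lambda s: combined_sim(s, request))

# Invented pre-grouped service pool and request.
clusters = {
    "payment":  [{"name": "payA", "keywords": {"pay", "card"},
                  "ops": ["auth", "charge"]}],
    "shipping": [{"name": "shipA", "keywords": {"ship", "track"},
                  "ops": ["book", "track"]}],
}
req = {"keywords": {"pay", "card", "refund"}, "ops": ["auth", "charge", "refund"]}
print(discover(req, clusters)["name"])  # -> payA
```

Pre-clustering means a request is compared against a few cluster representatives rather than the whole pool, which is what makes discovery tractable for large service pools.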
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformity with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
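The idea of chaining services only when their preconditions, including geodata-quality facts, are satisfied can be sketched as a greedy forward-chaining loop. This is not the authors' conditional planner; the service names, preconditions and quality facts are invented.

```python
def compose(goal, available, start_facts):
    """Greedy forward-chaining sketch: repeatedly pick a service whose
    preconditions (including quality facts such as 'density_ok') hold
    and whose effects add something new, until the goal fact holds."""
    facts, plan = set(start_facts), []
    while goal not in facts:
        step = next((s for s in available
                     if s["pre"] <= facts and s["post"] - facts), None)
        if step is None:
            return None  # no quality-conforming composition found
        facts |= step["post"]
        plan.append(step["name"])
    return plan

# Invented geospatial services with precondition/effect fact sets.
services = [
    {"name": "reproject",   "pre": {"raw", "crs_known"},
     "post": {"projected"}},
    {"name": "interpolate", "pre": {"projected", "density_ok"},
     "post": {"surface"}},
]
print(compose("surface", services, {"raw", "crs_known", "density_ok"}))
```

A conditional planner, as in the paper, goes further: it branches on facts that are only known at execution time, so the plan carries alternative paths for quality nonconformities instead of failing outright.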
A User-centered Model for Web Site Design
Kinzie, Mable B.; Cohn, Wendy F.; Julian, Marti F.; Knaus, William A.
2002-01-01
As the Internet continues to grow as a delivery medium for health information, the design of effective Web sites becomes increasingly important. In this paper, the authors provide an overview of one effective model for Web site design, a user-centered process that includes techniques for needs assessment, goal/task analysis, user interface design, and rapid prototyping. They detail how this approach was employed to design a family health history Web site, Health Heritage
Web-Based Mapping Puts the World at Your Fingertips
NASA Technical Reports Server (NTRS)
2008-01-01
NASA's award-winning Earth Resources Laboratory Applications Software (ELAS) package was developed at Stennis Space Center. Since 1978, ELAS has been used worldwide for processing satellite and airborne sensor imagery data of the Earth's surface into readable and usable information. DATASTAR Inc., of Picayune, Mississippi, has used ELAS software in the DATASTAR Image Processing Exploitation (DIPEx) desktop and Internet image processing, analysis, and manipulation software. The new DIPEx Version III includes significant upgrades and improvements compared to its esteemed predecessor. A true World Wide Web application, this product evolved with worldwide geospatial dimensionality and numerous other improvements that seamlessly support the World Wide Web version.
Spatiotemporal-Thematic Data Processing for the Semantic Web
NASA Astrophysics Data System (ADS)
Hakimpour, Farshad; Aleman-Meza, Boanerges; Perry, Matthew; Sheth, Amit
This chapter presents practical approaches to data processing in the space, time and theme dimensions using existing Semantic Web technologies. It describes how we obtain geographic and event data from Internet sources and also how we integrate them into an RDF store. We briefly introduce a set of functionalities in space, time and semantics. These functionalities are implemented based on our existing technology for main-memory-based RDF data processing developed at the LSDIS Lab. A number of these functionalities are exposed as REST Web services. We present two sample client-side applications that are developed using a combination of our services with Google Maps service.
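The combination of thematic, spatial, and temporal restrictions described above can be sketched with a minimal in-memory triple store; the identifiers and data are illustrative only and do not reflect the LSDIS Lab system or its RDF schema.

```python
# Minimal triple matching in the spirit of spatiotemporal-thematic RDF
# processing. Subjects, predicates, and dates are invented examples.
triples = [
    ("ex:quake1", "ex:type", "ex:Earthquake"),
    ("ex:quake1", "ex:occursIn", "ex:California"),
    ("ex:quake1", "ex:occursAt", "2006-03-14"),
    ("ex:fair1",  "ex:type", "ex:TradeFair"),
    ("ex:fair1",  "ex:occursAt", "2005-01-02"),
]

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Thematic restriction first, then a temporal filter over the matches
# (ISO dates compare correctly as strings).
quakes = {s for s, _, _ in match(p="ex:type", o="ex:Earthquake")}
recent = [s for s, _, d in match(p="ex:occursAt")
          if s in quakes and d >= "2006-01-01"]
print(recent)
```

A production system would of course use a real RDF store and SPARQL; the point here is only how the three dimensions compose as successive restrictions on a triple pattern.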
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Improving the Aircraft Design Process Using Web-based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.
2003-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Silicon Web Process Development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Hopkins, R. H.; Mchugh, J. P.; Hill, F. E.; Heimlich, M. E.; Driggers, J. M.
1978-01-01
Progress in the development of techniques to grow silicon web at a 25 sq cm/min output rate is reported. Feasibility of web growth with simultaneous melt replenishment is discussed. Other factors covered include: (1) tests of aftertrimmers to improve web width; (2) evaluation of growth lid designs to raise speed and output rate; (3) tests of melt replenishment hardware; and (4) investigation of directed gas flow systems to control unwanted oxide deposition in the system and to improve convective cooling of the web. Compatibility with sufficient solar cell performance is emphasized.
Creating Patient and Family Education Web Sites
YADRICH, DONNA MACAN; FITZGERALD, SHARON A.; WERKOWITCH, MARILYN; SMITH, CAROL E.
2013-01-01
This article gives details about the methods and processes used to ensure that usability and accessibility were achieved during development of the Home Parenteral Nutrition Family Caregivers Web site, an evidence-based health education Web site for the family members and caregivers of chronically ill patients. This article addresses comprehensive definitions of usability and accessibility and illustrates Web site development according to Section 508 standards and the national Health and Human Services’ Research-Based Web Design and Usability Guidelines requirements. PMID:22024970
ERIC Educational Resources Information Center
Türker, Fatih Mehmet
2016-01-01
In today's world, where online learning environments have increased their efficiency in education and training, the design of websites prepared for education and training purposes has become an important process. This study examines the teaching process of online learning environments created to teach Turkish in web-based environments, and…
geoknife: Reproducible web-processing of large gridded datasets
Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily K.; Winslow, Luke A.
2016-01-01
Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.
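The remote-processing workflow geoknife wraps boils down to submitting OGC WPS requests to a processing server instead of downloading the grid. The sketch below builds a WPS 1.0.0 Execute request as a key-value-pair URL; the endpoint is a placeholder, the process identifier and input names follow the style of the Geo Data Portal but are assumptions, and a real client would POST an XML Execute document and poll for status.

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint, for illustration only.
ENDPOINT = "https://example.org/wps/WebProcessingService"

def build_execute_request(process_id, inputs):
    """Build an OGC WPS 1.0.0 Execute request as a KVP (GET) URL.
    DataInputs entries are joined as name=value pairs separated by ';'."""
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "DataInputs": data_inputs,
    }
    return ENDPOINT + "?" + urlencode(params)

url = build_execute_request(
    "FeatureWeightedGridStatisticsAlgorithm",   # assumed process name
    {"STATISTICS": "MEAN", "DELIMITER": "COMMA"},
)
print(url)
```

The key design point is that only the small request document and the summarized result cross the network; the gridded data never leaves the server.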
ERIC Educational Resources Information Center
Vinokur, Amiram D.; Merion, Robert M.; Couper, Mick P.; Jones, Eleanor G.; Dong, Yihui
2006-01-01
A sample of 490 high school students from 81 schools in Michigan participated in an experiment in which they were randomly assigned to either a control or an experimental Web site. The experimental Web site provided exposure to educational material about the process of organ donation and organ transplantation. The control Web site provided…
ERIC Educational Resources Information Center
Diacopoulos, Mark M.
2015-01-01
The potential for social studies to embrace instructional technology and Web 2.0 applications has become a growing trend in recent social studies research. As part of an ongoing process of collaborative enquiry between an instructional specialist and social studies teachers in a Professional Learning Community, a table of Web 2.0 applications was…
KnowledgePuzzle: A Browsing Tool to Adapt the Web Navigation Process to the Learner's Mental Model
ERIC Educational Resources Information Center
AlAgha, Iyad
2012-01-01
This article presents KnowledgePuzzle, a browsing tool for knowledge construction from the web. It aims to adapt the structure of web content to the learner's information needs regardless of how the web content is originally delivered. Learners are provided with a meta-cognitive space (e.g., a concept mapping tool) that enables them to plan…
NASA Astrophysics Data System (ADS)
Bos, Nathan Daniel
This dissertation investigates the emerging affordance of the World Wide Web as a place for high school students to become authors and publishers of information. Two empirical studies lay groundwork for student publishing by examining learning issues related to audience adaptation in writing, motivation and engagement with hypermedia, design, problem-solving, and critical evaluation. Two models of student publishing on the World Wide Web were investigated over the course of two 11th-grade project-based science curricula. In the first curricular model, students worked in pairs to design informative hypermedia projects about infectious diseases that were published on the Web. Four case studies were written, drawing on both product- and process-related data sources. Four theoretically important findings are illustrated through these cases: (1) multimedia, especially graphics, seemed to catalyze some students' design processes by affecting the sequence of their design process and by providing a connection between the science content and their personal interest areas, (2) hypermedia design can demand high levels of analysis and synthesis of science content, (3) students can learn to think about science content representation through engagement with challenging design tasks, and (4) students' consideration of an outside audience can be facilitated by teacher-given design principles. The second Web-publishing model examines how students critically evaluate scientific resources on the Web, and how students can contribute to the Web's organization and usability by publishing critical reviews. Students critically evaluated Web resources using a four-part scheme: summarization of content, evaluation of credibility, evaluation of organizational structure, and evaluation of appearance.
Content analyses comparing students' reviews and reviewed Web documents showed that students were proficient at summarizing content of Web documents, identifying their publishing source, and evaluating their organizational features; however, students struggled to identify scientific evidence, bias, or sophisticated use of media in Web pages. Shortcomings were shown to be partly due to deficiencies in the Web pages themselves and partly due to students' inexperience with the medium or lack of critical evaluation skills. Future directions of this idea are discussed, including discussion of how students' reviews have been integrated into a current digital library development project.
EuroGEOSS/GENESIS ``e-Habitat'' AIP-3 Use Scenario
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Dubois, G.; Santoro, M.; Peedell, S.; de Longueville, B.; Nativi, S.; Craglia, M.
2010-12-01
Natural ecosystems are in rapid decline. Major habitats are disappearing at a speed never observed before. The current rate of species extinction is several orders of magnitude higher than the background rate from the fossil record. Protected Areas (PAs) and Protected Area Systems are designed to conserve natural and cultural resources, to maintain biodiversity (ecosystems, species, genes) and ecosystem services. The scientific challenge of understanding how environmental and climatological factors affect ecosystems and habitats requires the use of information from different scientific domains. Thus, multidisciplinary interoperability is a crucial requirement for a framework aiming to support scientists. The Group on Earth Observations (GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS). This emerging public infrastructure is interconnecting a diverse and growing array of instruments and systems for monitoring and forecasting changes in the global environment. This “system of systems” supports multidisciplinary and cross-disciplinary scientific research. The presented GEOSS-based interoperability framework facilitates the discovery and exploitation of datasets and models from heterogeneous scientific domains and Information Technology services (data sources). The GEO Architecture and Data Committee (ADC) launched the Architecture Implementation Pilot (AIP) Initiative to develop and deploy new processes and infrastructure components for the GEOSS Common Infrastructure (GCI) and the broader GEOSS architecture. The current AIP Phase 3 (AIP-3) aims to increase GEOSS capacity to support several strategic Societal Benefit Areas (SBAs) including: Disaster Management, Health/Air Quality, Biodiversity, Energy, Health/Disease and Water. For Biodiversity, the EC-funded EuroGEOSS (http://www.eurogeoss.eu) and GENESIS (http://www.genesis-fp7.eu) projects have developed a use scenario called “e-Habitat”.
This scenario demonstrates how a GEOSS-based interoperability infrastructure can help decision makers assess and possibly forecast the irreplaceability of a given protected area, an essential indicator of the criticality of the threats to which that area is exposed. Building on the AIP Phase 2 experience, the EuroGEOSS and GENESIS projects enhanced the previously validated interoperability infrastructure with: a) a discovery broker service that underpins semantics-enabled queries, the EuroGEOSS/GENESIS Discovery Augmentation Component (DAC); b) environmental modeling components (i.e., OGC WPS instances) implementing algorithms to predict the evolution of PA ecosystems; c) a workflow engine to: i) browse semantic repositories; ii) retrieve concepts of interest; iii) search for resources (i.e., datasets and models) related to such concepts; and iv) execute WPS instances. This presentation introduces the enhanced infrastructure developed by the EuroGEOSS/GENESIS AIP-3 Pilot to implement the “e-Habitat” use scenario. The infrastructure is accessible through the GEO Portal and will be used to demonstrate the “e-Habitat” model at the GEO Ministerial Meeting in Beijing, November 2010.
40 CFR 52.254 - Organic solvent usage.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Air Quality Control Regions (the “Regions”), as described in 40 CFR part 81, dated July 1, 1979... contrivances designed for processing continuous web, strip, or wire that emit organic materials in the course... articles, machines, equipment, or other contrivances designed for processing a continuous web, strip, or...
40 CFR 52.254 - Organic solvent usage.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Air Quality Control Regions (the “Regions”), as described in 40 CFR part 81, dated July 1, 1979... contrivances designed for processing continuous web, strip, or wire that emit organic materials in the course... articles, machines, equipment, or other contrivances designed for processing a continuous web, strip, or...
40 CFR 52.254 - Organic solvent usage.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Air Quality Control Regions (the “Regions”), as described in 40 CFR part 81, dated July 1, 1979... contrivances designed for processing continuous web, strip, or wire that emit organic materials in the course... articles, machines, equipment, or other contrivances designed for processing a continuous web, strip, or...
40 CFR 52.254 - Organic solvent usage.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Air Quality Control Regions (the “Regions”), as described in 40 CFR part 81, dated July 1, 1979... contrivances designed for processing continuous web, strip, or wire that emit organic materials in the course... articles, machines, equipment, or other contrivances designed for processing a continuous web, strip, or...
Strategic Positioning of the Web in a Multi-Channel Market Approach.
ERIC Educational Resources Information Center
Simons, Luuk P. A.; Steinfield, Charles; Bouwman, Harry
2002-01-01
Discusses channel economics in retail activities and trends toward unbundling due to the emergence of the Web channel. Highlights include sales processes and physical distribution processes; transaction costs; hybrid electronic commerce strategies; channel management and customer support; information economics, thing economics, and service…
Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems
NASA Astrophysics Data System (ADS)
Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn
The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider scenarios where the composition process may fail due to incomplete specification of goal service requirements or because the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback helps guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.
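The failure-feedback idea can be sketched in miniature: replay a goal service's event sequence against a component service modeled as a labeled transition system, and report the first event the service cannot match. The state/event encoding below is an assumption for illustration, not the MoSCoE formalism.

```python
# Toy sketch: a service as an LTS, encoded as state -> {event: next_state}.
# Events use a conventional prefix: '?' for inputs, '!' for outputs.

def find_failure(service, trace, start="s0"):
    """Replay a goal trace on a service LTS. Return None if the trace is
    realizable, else (index, event) identifying the cause of failure."""
    state = start
    for i, event in enumerate(trace):
        nxt = service.get(state, {}).get(event)
        if nxt is None:
            return i, event  # first unsupported event: feedback to the user
        state = nxt
    return None

# A simple e-Library-style component: accepts a query, returns results.
library = {
    "s0": {"?query": "s1"},
    "s1": {"!results": "s0"},
}

print(find_failure(library, ["?query", "!results"]))   # None (realizable)
print(find_failure(library, ["?query", "!reserve"]))   # (1, '!reserve')
```

The returned (index, event) pair is exactly the kind of diagnosis that lets a user re-formulate the goal service, e.g., by dropping the unsupported event or adding a component that provides it.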
Web-based Interspecies Correlation Estimation (Web-ICE) for Acute Toxicity: User Manual 3.2
The Web-ICE Endangered Species module simultaneously estimates toxicity to taxa representing threatened or endangered species using up to 25 surrogates. This module batch processes toxicity values for endangered species from all species, genus, and family level models available f...
Computer Mediated Communication: Online Instruction and Interactivity.
ERIC Educational Resources Information Center
Lavooy, Maria J.; Newlin, Michael H.
2003-01-01
Explores the different forms and potential applications of computer mediated communication (CMC) for Web-based and Web-enhanced courses. Based on their experiences with three different Web courses (Research Methods in Psychology, Statistical Methods in Psychology, and Basic Learning Processes) taught repeatedly over the last five years, the…
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large datasets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various visualization and interaction modes.
ERIC Educational Resources Information Center
Bower, Matt
2011-01-01
Based on a three-semester design-based research study examining learning and teaching in a web-conferencing environment, this article identifies types of synchronous collaboration competencies and reveals their influence on learning processes. Four levels of online collaborative competencies were observed--operational, interactional, managerial,…
Montagni, Ilaria; Langlois, Emmanuel; Wittwer, Jérôme; Tzourio, Christophe
2017-02-16
University students aged 18-30 years are a population group reporting low access to health care services, with high rates of avoidance and delay of medical care. This group also reports not having appropriate information about available health care services. However, university students are at risk for several health problems, and regular medical consultations are recommended in this period of life. New digital devices are popular among the young, and Web-apps can be used to facilitate easy access to information regarding health care services. A small number of electronic health (eHealth) tools have been developed with the purpose of displaying real-world health care services, and little is known about how such eHealth tools can improve access to care. This paper describes the processes of co-creating and evaluating the beta version of a Web-app aimed at mapping and describing free or low-cost real-world health care services available in the Bordeaux area of France, which is specifically targeted to university students. The co-creation process involves: (1) exploring the needs of students to know and access real-world health care services; (2) identifying the real-world health care services of interest for students; and (3) deciding on a user interface, and developing the beta version of the Web-app. Finally, the evaluation process involves: (1) testing the beta version of the Web-app with the target audience (university students aged 18-30 years); (2) collecting their feedback via a satisfaction survey; and (3) planning a long-term evaluation. The co-creation process of the beta version of the Web-app was completed in August 2016 and is described in this paper. The evaluation process started on September 7, 2016. The project was completed in December 2016 and implementation of the Web-app is ongoing. Web-apps are an innovative way to increase the health literacy of young people in terms of delivery of and access to health care. 
The creation of Web-apps benefits from the involvement of stakeholders (eg, students and health care providers) to correctly identify the real-world health care services to be displayed. ©Ilaria Montagni, Emmanuel Langlois, Jérôme Wittwer, Christophe Tzourio. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 16.02.2017.
Langlois, Emmanuel; Wittwer, Jérôme; Tzourio, Christophe
2017-01-01
Background University students aged 18-30 years are a population group reporting low access to health care services, with high rates of avoidance and delay of medical care. This group also reports not having appropriate information about available health care services. However, university students are at risk for several health problems, and regular medical consultations are recommended in this period of life. New digital devices are popular among the young, and Web-apps can be used to facilitate easy access to information regarding health care services. A small number of electronic health (eHealth) tools have been developed with the purpose of displaying real-world health care services, and little is known about how such eHealth tools can improve access to care. Objective This paper describes the processes of co-creating and evaluating the beta version of a Web-app aimed at mapping and describing free or low-cost real-world health care services available in the Bordeaux area of France, which is specifically targeted to university students. Methods The co-creation process involves: (1) exploring the needs of students to know and access real-world health care services; (2) identifying the real-world health care services of interest for students; and (3) deciding on a user interface, and developing the beta version of the Web-app. Finally, the evaluation process involves: (1) testing the beta version of the Web-app with the target audience (university students aged 18-30 years); (2) collecting their feedback via a satisfaction survey; and (3) planning a long-term evaluation. Results The co-creation process of the beta version of the Web-app was completed in August 2016 and is described in this paper. The evaluation process started on September 7, 2016. The project was completed in December 2016 and implementation of the Web-app is ongoing. 
Conclusions Web-apps are an innovative way to increase the health literacy of young people in terms of delivery of and access to health care. The creation of Web-apps benefits from the involvement of stakeholders (eg, students and health care providers) to correctly identify the real-world health care services to be displayed. PMID:28209561
Distributed data collection and supervision based on web sensor
NASA Astrophysics Data System (ADS)
He, Pengju; Dai, Guanzhong; Fu, Lei; Li, Xiangjun
2006-11-01
As a node in the Internet/Intranet, the web sensor has been promoted in recent years and widely applied in remote manufacturing, workshop measurement, and control fields. However, the conventional scheme supports only HTTP, and, because of the limited resources of the microprocessor in the sensor, remote users supervise and control the collected data published via the web in a standard browser; moreover, only one data-acquisition node can be supervised and controlled at a time, so the requirements of centralized remote supervision, control, and data processing cannot be satisfied in some fields. In this paper, centralized remote supervision, control, and data processing with web sensors are proposed and implemented on the principle of a device driver program. Extraneous information in each web page collected from a sensor is filtered out, the useful data are transmitted to a real-time database in the workstation, and different filter algorithms are designed for the different sensors, each with its own independent web pages. Every sensor node has its own web filter program, called a "web data collection driver program"; the collection details are hidden, and supervision, control, and configuration software can be implemented by calling the web data collection driver program, just as an I/O driver program is used. The proposed technology can be applied to data acquisition where only relatively low real-time performance is required.
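A "web data collection driver program" as described above essentially strips the markup from a sensor's embedded page and keeps only the measurements. The page format and field names below are invented for this sketch; a real driver would fetch the page over HTTP and carry a device-specific filter pattern.

```python
import re

# Hypothetical page served by an embedded web sensor; the format differs
# per device, so each driver carries its own filter pattern.
PAGE = "<html><body>Temp: <b>23.5</b> C Pressure: <b>101.2</b> kPa</body></html>"

class SensorDriver:
    """Minimal web data collection driver: filters a sensor's web page
    down to named numeric readings."""
    def __init__(self, pattern):
        self.pattern = re.compile(pattern)

    def collect(self, page):
        # findall yields (name, value) pairs from the device's page layout.
        return {name: float(v) for name, v in self.pattern.findall(page)}

driver = SensorDriver(r"(\w+): <b>([\d.]+)</b>")
print(driver.collect(PAGE))  # {'Temp': 23.5, 'Pressure': 101.2}
```

Supervisory software then calls `collect` for each registered driver, which is how many heterogeneous sensor nodes can feed one centralized real-time database without the caller knowing any device's page layout.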
Overlay accuracy on a flexible web with a roll printing process based on a roll-to-roll system.
Chang, Jaehyuk; Lee, Sunggun; Lee, Ki Beom; Lee, Seungjun; Cho, Young Tae; Seo, Jungwoo; Lee, Sukwon; Jo, Gugrae; Lee, Ki-yong; Kong, Hyang-Shik; Kwon, Sin
2015-05-01
For high-quality flexible devices from printing processes based on Roll-to-Roll (R2R) systems, overlay alignment during the patterning of each functional layer poses a major challenge. This is because flexible substrates have a relatively low stiffness compared with rigid substrates and are easily deformed during web handling in the R2R system. To achieve a high overlay accuracy for a flexible substrate, it is important not only to develop web handling modules (such as web guiding, tension control, winding, and unwinding) and a precise printing tool but also to control the synchronization of each unit in the total system. A R2R web handling system and reverse offset printing process were developed in this work, and an overlay between the 1st and 2nd layers of ±5 μm on a 500 mm-wide film was achieved at a σ level of 2.4 and 2.8 (x and y directions, respectively) in a continuous R2R printing process. This paper presents the components and mechanisms used in reverse offset printing based on a R2R system and the printing results, including positioning accuracy and overlay alignment accuracy.
Increasing the value of geospatial informatics with open approaches for Big Data
NASA Astrophysics Data System (ADS)
Percivall, G.; Bermudez, L. E.
2017-12-01
Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following use cases: Collection and Ingest (remote-sensed data processing; data stream processing); Prepare and Structure (SQL and NoSQL databases; data linking; feature identification); Analytics and Visualization (spatial-temporal analytics; machine learning; data exploration); and Modeling and Prediction (integrated environmental models; urban 4D models). Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following. Open Cloud Computing: avoid vendor lock-in through API interoperability and application portability. Open Source Extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial Data Representations: schemas to improve processing and analysis using geospatial concepts (Features, Coverages, DGGS); use geospatial encodings like NetCDF and GeoPackage. Big Linked Geodata: use linked data methods scaled to big geodata. Analysis Ready Data: support "download as last resort" and "analytics as a service"; promote elements common to "datacubes."
Tanikawa, Akio; Shinkai, Akira; Miyashita, Tadashi
2014-11-01
The unique web architectures of spiders of the sub-family Cyrtarachninae, which includes the triangular web weaver, bolas spiders, and a webless spider, are thought to have evolved through reduction of orbicular 'spanning-thread webs' resembling ordinary orb webs. A molecular phylogenetic analysis was conducted to explore this hypothesis using the orbicular web spiders Cyrtarachne, Paraplectana, and Poecilopachys, the triangular web spider Pasilobus, the bolas spiders Ordgarius and Mastophora, and the webless spider Celaenia. The phylogeny inferred from partial sequences of mt-COI, nuclear 18S-rRNA, and 28S-rRNA showed that the common ancestor of these spiders diverged into two clades: a spanning-thread web clade and a bolas or webless clade. This finding suggests that the triangular web evolved by reduction of an orbicular spanning web, but that bolas spiders evolved at an early stage, which does not support the gradual web reduction hypothesis.
The effect of tooling design parameters on web-warping in the flexible roll forming of UHSS
NASA Astrophysics Data System (ADS)
Jiao, Jingsi; Rolfe, Bernard; Mendiguren, Joseba; Galdos, Lander; Weiss, Matthias
2013-12-01
To reduce weight and improve passenger safety there is an increased need in the automotive industry to use Ultra High Strength Steels (UHSS) for structural and crash components. However, the application of UHSS is restricted by their limited formability and the difficulty of forming them in conventional processes. An alternative method of manufacturing structural auto body parts from UHSS is the flexible roll forming process which can accommodate materials with high strength and limited ductility in the production of complex and weight-optimised components. However, one major concern in the flexible roll forming is web-warping, which is the height deviation of the profile web area. This paper investigates, using a numerical model, the effect on web-warping with respect to various forming methods. The results demonstrate that different forming methods lead to different amount of web-warping in terms of forming the product with identical geometry.
Development of processes for the production of low cost silicon dendritic web for solar cells
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Skutch, M. E.; Driggers, J. M.; Hill, F. E.
1980-01-01
High area output rates and continuous, automated growth are two key technical requirements for the growth of low-cost silicon ribbons for solar cells. By means of computer-aided furnace design, silicon dendritic web output rates as high as 27 sq cm/min have been achieved, a value in excess of that projected to meet a $0.50 per peak watt solar array manufacturing cost. The feasibility of simultaneous web growth while the melt is replenished with pelletized silicon has also been demonstrated. This step is an important precursor to the development of an automated growth system. Solar cells made on the replenished material were just as efficient as devices fabricated on typical webs grown without replenishment. Moreover, web cells made on a less-refined, pelletized polycrystalline silicon synthesized by the Battelle process yielded efficiencies up to 13% (AM1).
29 CFR 1926.751 - Definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... process of erection. Steel joist means an open web, secondary load-carrying member of 144 feet (43.9 m) or... structural steel trusses or cold-formed joists. Steel joist girder means an open web, primary load-carrying... structural steel trusses. Steel truss means an open web member designed of structural steel components by the...
29 CFR 1926.751 - Definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... process of erection. Steel joist means an open web, secondary load-carrying member of 144 feet (43.9 m) or... structural steel trusses or cold-formed joists. Steel joist girder means an open web, primary load-carrying... structural steel trusses. Steel truss means an open web member designed of structural steel components by the...
WebQuests as Language-Learning Tools
ERIC Educational Resources Information Center
Aydin, Selami
2016-01-01
This study presents a review of the literature that examines WebQuests as tools for second-language acquisition and foreign language-learning processes to guide teachers in their teaching activities and researchers in further research on the issue. The study first introduces the theoretical background behind WebQuest use in the mentioned…
Simplify Web Development for Faculty and Promote Instructional Design.
ERIC Educational Resources Information Center
Pedersen, David C.
Faculty members are often overwhelmed with the prospect of implementing Web-based instruction. In an effort to simplify the process and incorporate some basic instructional design elements, the Educational Technology Team at Embry Riddle Aeronautical University created a course template for WebCT. Utilizing rapid prototyping, the template…
Children's Search Engines from an Information Search Process Perspective.
ERIC Educational Resources Information Center
Broch, Elana
2000-01-01
Describes cognitive and affective characteristics of children and teenagers that may affect their Web searching behavior. Reviews literature on children's searching in online public access catalogs (OPACs) and using digital libraries. Profiles two Web search engines. Discusses some of the difficulties children have searching the Web, in the…
Discovering Student Web Usage Profiles Using Markov Chains
ERIC Educational Resources Information Center
Marques, Alice; Belo, Orlando
2011-01-01
Nowadays, Web-based platforms are quite common in any university, supporting a very diversified set of applications and services. Ranging from personal management to student evaluation processes, Web-based platforms provide a very flexible way of working, promote student enrolment, and ease access to academic information…
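The first-order Markov model the paper applies to Web usage can be sketched as follows; the session data and page names are invented for illustration:

```python
from collections import defaultdict

def transition_matrix(sessions):
    """Estimate first-order Markov transition probabilities
    from a list of page-visit sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for src, dst in zip(session, session[1:]):
            counts[src][dst] += 1
    probs = {}
    for src, dsts in counts.items():
        total = sum(dsts.values())
        probs[src] = {dst: n / total for dst, n in dsts.items()}
    return probs

# Hypothetical student sessions on a university platform.
sessions = [
    ["login", "courses", "grades"],
    ["login", "courses", "forum"],
    ["login", "grades"],
]
P = transition_matrix(sessions)
# e.g. P["login"]["courses"] == 2/3
```

Usage profiles then fall out of the estimated probabilities: pages with similar outgoing distributions can be clustered into behavioural groups.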
Automatic Semantic Generation and Arabic Translation of Mathematical Expressions on the Web
ERIC Educational Resources Information Center
Doush, Iyad Abu; Al-Bdarneh, Sondos
2013-01-01
Automatic processing of mathematical information on the web imposes some difficulties. This paper presents a novel technique for the automatic generation of semantics and Arabic translations for mathematical expressions on the web. The proposed system facilitates unambiguous representation of mathematical equations by correlating equations to their known…
Social Networking on the Semantic Web
ERIC Educational Resources Information Center
Finin, Tim; Ding, Li; Zhou, Lina; Joshi, Anupam
2005-01-01
Purpose: Aims to investigate the way that the semantic web is being used to represent and process social network information. Design/methodology/approach: The Swoogle semantic web search engine was used to construct several large data sets of Resource Description Framework (RDF) documents with social network information that were encoded using the…
Mining a Web Citation Database for Author Co-Citation Analysis.
ERIC Educational Resources Information Center
He, Yulan; Hui, Siu Cheung
2002-01-01
Proposes a mining process to automate author co-citation analysis based on the Web Citation Database, a data warehouse for storing citation indices of Web publications. Describes the use of agglomerative hierarchical clustering for author clustering and multidimensional scaling for displaying author cluster maps, and explains PubSearch, a…
SIP: A Web-Based Astronomical Image Processing Program
NASA Astrophysics Data System (ADS)
Simonetti, J. H.
1999-12-01
I have written an astronomical image processing and analysis program designed to run over the internet in a Java-compatible web browser. The program, Sky Image Processor (SIP), is accessible at the SIP webpage (http://www.phys.vt.edu/SIP). Since nothing is installed on the user's machine, there is no need to download upgrades; the latest version of the program is always instantly available. Furthermore, the Java programming language is designed to work on any computer platform (any machine and operating system). The program could be used with students in web-based instruction or in a computer laboratory setting; it may also be of use in some research or outreach applications. While SIP is similar to other image processing programs, it is unique in some important respects. For example, SIP can load images from the user's machine or from the Web. An instructor can put images on a web server for students to load and analyze on their own personal computer. Or, the instructor can inform the students of images to load from any other web server. Furthermore, since SIP was written with students in mind, the philosophy is to present the user with the most basic tools necessary to process and analyze astronomical images. Images can be combined (by addition, subtraction, multiplication, or division), multiplied by a constant, smoothed, cropped, flipped, rotated, and so on. Statistics can be gathered for pixels within a box drawn by the user. Basic tools are available for gathering data from an image which can be used for performing simple differential photometry or astrometry. Therefore, students can learn how astronomical image processing works. Since SIP is not part of a commercial CCD camera package, the program is written to handle the most common denominator image file, the FITS format.
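The arithmetic operations SIP offers (combining frames, scaling, box statistics) amount to elementwise array math; a minimal sketch with NumPy, using invented pixel values rather than real FITS data:

```python
import numpy as np

# Two toy "images" standing in for FITS frames (hypothetical pixel values).
raw = np.array([[10., 12.], [14., 16.]])
dark = np.array([[2., 2.], [2., 2.]])

# Arithmetic combination of the kind SIP supports: subtraction of a
# calibration frame, then multiplication by a constant.
calibrated = (raw - dark) * 1.5

# Statistics for pixels inside a user-drawn box (here: the whole frame).
mean, std = calibrated.mean(), calibrated.std()
```

The same elementwise operations generalise to full-size frames, which is why a browser-based tool can cover the basics of differential photometry.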
Using Sensor Web Processes and Protocols to Assimilate Satellite Data into a Forecast Model
NASA Technical Reports Server (NTRS)
Goodman, H. Michael; Conover, Helen; Zavodsky, Bradley; Maskey, Manil; Jedlovec, Gary; Regner, Kathryn; Li, Xiang; Lu, Jessica; Botts, Mike; Berthiau, Gregoire
2008-01-01
The goal of the Sensor Management Applied Research Technologies (SMART) On-Demand Modeling project is to develop and demonstrate the readiness of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities to integrate both space-based Earth observations and forecast model output into new data acquisition and assimilation strategies. The project is developing sensor web-enabled processing plans to assimilate Atmospheric Infrared Sounding (AIRS) satellite temperature and moisture retrievals into a regional Weather Research and Forecast (WRF) model over the southeastern United States.
Research on SaaS and Web Service Based Order Tracking
NASA Astrophysics Data System (ADS)
Jiang, Jianhua; Sheng, Buyun; Gong, Lixiong; Yang, Mingzhong
To solve the problem of order tracking across enterprises in a Dynamic Virtual Enterprise (DVE), a SaaS and web service based order tracking solution was designed by analyzing the order management process in DVE. To realize the system, a SaaS-based architecture for managing data on the manufacturing states of order tasks was constructed, and a method for encapsulating application systems as web services was investigated. The process of order tracking in the system was then presented. Finally, the feasibility of this study was verified by the development of a prototype system.
Web-Based Learning Support System
NASA Astrophysics Data System (ADS)
Fan, Lisa
A Web-based learning support system offers many benefits over traditional learning environments and has become very popular. The Web is a powerful environment for distributing information and delivering knowledge to an increasingly wide and diverse audience. Typical Web-based learning environments, such as Web-CT and Blackboard, include course content delivery tools, quiz modules, grade reporting systems, assignment submission components, etc. They are powerful integrated learning management systems (LMS) that support a number of activities performed by teachers and students during the learning process [1]. However, students who study a course on the Internet tend to be more heterogeneously distributed than those found in a traditional classroom situation. In order to achieve optimal efficiency in a learning process, an individual learner needs his or her own personalized assistance. For a web-based open and dynamic learning environment, personalized support for learners becomes more important. This chapter demonstrates how to realize personalized learning support in dynamic and heterogeneous learning environments by utilizing Adaptive Web technologies. It focuses on course personalization in terms of contents and teaching materials according to each student's needs and capabilities. An example of using Rough Set theory to analyze student personal information to assist students with effective learning and to predict student performance is presented.
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples and other related knowledge, which are used in the pre-processing stage of FEA, were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model combining object-oriented and rule-based methods is then described, followed by an integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning. Finally, the analysis process of this expert system in a web based collaborative CAE application is illustrated, and the analysis of a machine tool column is presented to prove the validity of the system.
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.
2017-12-01
The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.
The value of the Semantic Web in the laboratory.
Frey, Jeremy G
2009-06-01
The Semantic Web is beginning to impact on the wider chemical and physical sciences, beyond the earlier-adopting bioinformatics community. While useful in large-scale data-driven science with automated processing, these technologies can also help integrate the work of smaller-scale laboratories producing diverse data. The semantics aid discovery and the reliable re-use of data, provide improved provenance, and facilitate automated processing through increased resilience to changes in presentation and reduced ambiguity. The Semantic Web, its tools and collections are not yet competitive with well-established solutions to current problems. It is in the reduced cost of instituting solutions to new problems that the versatility of Semantic Web-enabled data and resources will make its mark once the more general-purpose tools become available.
WebEAV: automatic metadata-driven generation of web interfaces to entity-attribute-value databases.
Nadkarni, P M; Brandt, C M; Marenco, L
2000-01-01
The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples.
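The EAV storage pattern that WebEAV targets can be illustrated in a few lines; the table and attribute names below are hypothetical, not WebEAV's actual schema:

```python
import sqlite3

# Minimal EAV sketch: one row per (entity, attribute, value) triple
# instead of one column per attribute, so new attributes need no
# schema change. Table and attribute names are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE eav (entity_id INTEGER, attribute TEXT, value TEXT)")
rows = [
    (1, "name", "sample-42"),
    (1, "assay", "PCR"),
    (2, "name", "sample-43"),
]
con.executemany("INSERT INTO eav VALUES (?, ?, ?)", rows)

# Pivot one entity's attributes back into a record, as a generated
# web front end would before rendering a form.
record = dict(
    con.execute("SELECT attribute, value FROM eav WHERE entity_id = 1")
)
# record == {"name": "sample-42", "assay": "PCR"}
```

The pivot step is exactly what makes hand-built EAV front ends tedious, and what a metadata-driven framework can generate automatically.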
Electrical and Structural Characterization of Web Dendrite Crystals
NASA Technical Reports Server (NTRS)
Schwuttke, G. H.; Koliwad, K.; Dumas, K. A.
1985-01-01
Minority carrier lifetime distributions in silicon web dendrites are measured. Emphasis is placed on measuring the areal homogeneity of lifetime, showing its dependence on structural defects, and characterizing its change during hot processing. The internal gettering action of defect layers present in web crystals and their relation to minority carrier lifetime distributions are discussed. Minority carrier lifetime maps of web dendrites obtained before and after high-temperature heat treatment are compared to similar maps obtained from 100 mm diameter Czochralski silicon wafers. Such maps indicate similar or superior areal homogeneity of minority carrier lifetime in webs.
Finite Element Analysis for the Web Offset of Wind Turbine Blade
NASA Astrophysics Data System (ADS)
Zhou, Bo; Wang, Xin; Zheng, Changwei; Cao, Jinxiang; Zou, Pingguo
2017-05-01
The web is an important part of a wind turbine blade, improving its bending properties. Much of the blade manufacturing process is done by hand, so web offset is one of the most common quality defects of wind turbine blades. In this paper, a 3D parametric finite element model of a blade for a 2 MW turbine was established in ANSYS. Stress distributions for different web offset values were studied, covering three kinds of web offset. A systematic study of web offset was carried out using an orthogonal experiment, and the factor most important to the stress distribution was identified. The analysis results provide guidance for the design and manufacture of wind turbine blades.
Beyond accuracy: creating interoperable and scalable text-mining web services.
Wei, Chih-Hsuan; Leaman, Robert; Lu, Zhiyong
2016-06-15
The biomedical literature is a knowledge-rich resource and an important foundation for future research. With over 24 million articles in PubMed and an increasing growth rate, research in automated text processing is becoming increasingly important. We report here our recently developed web-based text mining services for biomedical concept recognition and normalization. Unlike most text-mining software tools, our web services integrate several state-of-the-art entity tagging systems (DNorm, GNormPlus, SR4GN, tmChem and tmVar) and offer a batch-processing mode able to process arbitrary text input (e.g. scholarly publications, patents and medical records) in multiple formats (e.g. BioC). We support multiple standards to make our service interoperable and allow simpler integration with other text-processing pipelines. To maximize scalability, we have preprocessed all PubMed articles, and use a computer cluster for processing large requests of arbitrary text. Our text-mining web service is freely available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/#curl. Contact: Zhiyong.Lu@nih.gov. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
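A batch-processing client for such a service typically splits a large document list into bounded requests; the payload shape below is a hypothetical sketch, not the actual tmTools wire format:

```python
def batch_payloads(doc_ids, batch_size=100, fmt="BioC"):
    """Group document IDs into batch request payloads for a
    text-mining web service (payload keys are invented)."""
    return [
        {"ids": doc_ids[i:i + batch_size], "format": fmt}
        for i in range(0, len(doc_ids), batch_size)
    ]

# 250 hypothetical PubMed IDs split into requests of at most 100.
payloads = batch_payloads([f"PMID:{n}" for n in range(250)])
# -> 3 payloads of sizes 100, 100, 50
```

Bounding each request this way keeps individual calls small while letting a server-side cluster parallelize the overall job.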
Guidelines for Transferring Residential Courses into Web
ERIC Educational Resources Information Center
Tüzün, Hakan; Çinar, Murat
2016-01-01
This study shared unique design experiences by examining the process of transferring residential courses to the Web, and proposed a design model for individuals who want to transfer their courses into this environment. The formative research method was used in the study, and two project teams' processes of putting courses, which were being taught…
Evaluation of strength-controlling defects in paper by stress concentration analyses
John M. Considine; David W. Vahey; James W. Evans; Kevin T. Turner; Robert E. Rowlands
2011-01-01
Cellulosic webs, such as paper materials, are composed of an interwoven, bonded network of cellulose fibers. Strength-controlling parameters in these webs are influenced by constituent fibers and method of processing and manufacture. Instead of estimating the effect on tensile strength of each processing/manufacturing variable, this study modifies and compares the...
Development and Evaluation of a Thai Learning System on the Web Using Natural Language Processing.
ERIC Educational Resources Information Center
Dansuwan, Suyada; Nishina, Kikuko; Akahori, Kanji; Shimizu, Yasutaka
2001-01-01
Describes the Thai Learning System, which is designed to help learners acquire the Thai word order system. The system facilitates the lessons on the Web using HyperText Markup Language and Perl programming, which interfaces with natural language processing by means of Prolog. (Author/VWL)
ERIC Educational Resources Information Center
Price, Geoffrey P.; Wright, Vivian H.
2012-01-01
Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…
Distributed Group Design Process: Lessons Learned.
ERIC Educational Resources Information Center
Eseryel, Deniz; Ganesan, Radha
A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…
ERIC Educational Resources Information Center
Liou, Hsien-Chin; Chang, Jason S; Chen, Hao-Jan; Lin, Chih-Cheng; Liaw, Meei-Ling; Gao, Zhao-Ming; Jang, Jyh-Shing Roger; Yeh, Yuli; Chuang, Thomas C.; You, Geeng-Neng
2006-01-01
This paper describes the development of an innovative web-based environment for English language learning with advanced data-driven and statistical approaches. The project uses various corpora, including a Chinese-English parallel corpus ("Sinorama") and various natural language processing (NLP) tools to construct effective English…
Semiconductor junction formation by directed heat
Campbell, Robert B.
1988-03-24
The process of the invention includes applying precursors 6 with N- and P-type dopants therein to a silicon web 2, with the web 2 then being baked in an oven 10 to drive off excess solvents; the web 2 is then heated using a pulsed high-intensity light in a mechanism 12 at 1100-1150 °C for about 10 seconds to simultaneously form semiconductor junctions in both faces of the web.
The combined influence of central and peripheral routes in the online persuasion process.
SanJosé-Cabezudo, Rebeca; Gutiérrez-Arranz, Ana M; Gutiérrez-Cillán, Jesús
2009-06-01
The elaboration likelihood model (ELM) is one of the most widely used psychological theories in the academic literature to account for how advertising information is processed. The current work seeks to overturn one of the basic principles of the ELM and takes account of new variables in the model that help to explain the online persuasion process more clearly. Specifically, we posit that in a context of high-involvement exposure to advertising (e.g., Web pages), central and peripheral processing routes may act together. In a repeated-measures experimental design, 112 participants were exposed to two Web sites of a fictitious travel agency, differing only in their design--serious versus amusing. Findings show that a peripheral cue, such as how the Web pages are presented, does prove relevant when attempting to gauge the level of effectiveness. Moreover, if we take account of individuals' motivation when accessing the Internet, whether cognitive or affective, that motivation will shape their response to the Web site design. The work contributes to the ELM literature and may help firms to pinpoint those areas and features of Internet advertising that prove most efficient.
Multilingual Speech and Language Processing
2003-04-01
client software handles the user end of the transaction. Historically, four clients were provided: e-mail, web, FrameMaker, and command line. By...command-line client and an API. The API allows integration of CyberTrans into a number of processes including word processing packages (FrameMaker...preservation and logging, and others. The available clients remain e-mail, Web and FrameMaker. Platforms include both Unix and PC for clients, with
Advanced dendritic web growth development
NASA Technical Reports Server (NTRS)
Hopkins, R. H.
1985-01-01
A program to develop the technology of the silicon dendritic web ribbon growth process is examined. The effort is being concentrated on the area rate and quality requirements necessary to meet the JPL/DOE goals for terrestrial PV applications. Closed loop web growth system development and stress reduction for high area rate growth is considered.
ERIC Educational Resources Information Center
Huang, Yueh-Min; Liu, Chien-Hung
2009-01-01
One of the key challenges in the promotion of web-based learning is the development of effective collaborative learning environments. We posit that the structuration process strongly influences the effectiveness of technology used in web-based collaborative learning activities. In this paper, we propose an ant swarm collaborative learning (ASCL)…
Distance Learning Courses on the Web: The Authoring Approach.
ERIC Educational Resources Information Center
Santos, Neide; Diaz, Alicia; Bibbo, Luis Mariano
This paper proposes a framework for supporting the authoring process of distance learning courses. An overview of distance learning courses and the World Wide Web is presented. The proposed framework is then described, including: (1) components of the framework--a hypermedia design methodology for authoring the course, links to related Web sites,…
Integrating a Project Management Approach to E-Business Application Course
ERIC Educational Resources Information Center
Chen, Kuan C.; Chuang, Keh-Wen
2008-01-01
Teaching students project management requires a hands-on approach. Incorporating project management concepts and processes into a student team Web development project adds a dimension that exposes students to the realities of effective Web development. This paper will describe the project management approach used in a Web development course in…
sTeam--Providing Primary Media Functions for Web-Based Computer-Supported Cooperative Learning.
ERIC Educational Resources Information Center
Hampel, Thorsten
The World Wide Web has developed into the de facto standard for computer-based learning. However, as a server-centered approach, it confines readers and learners to passive, nonsequential reading. Authoring and Web-publishing systems aim at supporting the author's design process. Consequently, learners' activities are confined to selecting and…
Web Services as Public Services: Are We Supporting Our Busiest Service Point?
ERIC Educational Resources Information Center
Riley-Huff, Debra A.
2009-01-01
This article is an analysis of academic library organizational culture, patterns, and processes as they relate to Web services. Data gathered in a research survey is examined in an attempt to reveal current departmental and administrative attitudes, practices, and support for Web services in the library research environment. (Contains 10 tables.)
ERIC Educational Resources Information Center
Yesiltas, Erkan
2016-01-01
Web pedagogical content knowledge generally takes pedagogical knowledge, content knowledge, and Web knowledge as its basis. It is a structure emerging through the interaction of these three components. Content knowledge refers to knowledge of subjects to be taught. Pedagogical knowledge involves knowledge of process, implementation, learning methods,…
Widening and Deepening Questions in Web-Based Investigative Learning
ERIC Educational Resources Information Center
Kashihara, Akihiro; Akiyama, Naoto
2016-01-01
The Web allows learners to investigate any question with a great variety of Web resources, through which they can construct wider and deeper knowledge. In such an investigative learning process, it is important for them to deepen and widen the question, which involves decomposing it into sub-questions to be further investigated. This…
A Sample WebQuest Applicable in Teaching Topological Concepts
ERIC Educational Resources Information Center
Yildiz, Sevda Goktepe; Korpeoglu, Seda Goktepe
2016-01-01
In recent years, WebQuests have received a great deal of attention and have been used effectively in teaching-learning process in various courses. In this study, a WebQuest that can be applicable in teaching topological concepts for undergraduate level students was prepared. A number of topological concepts, such as countability, infinity, and…
Exploring the Relationship between Self-Regulated Vocabulary Learning and Web-Based Collaboration
ERIC Educational Resources Information Center
Liu, Sarah Hsueh-Jui; Lan, Yu-Ju; Ho, Cloudia Ya-Yu
2014-01-01
Collaborative learning has placed an emphasis on co-constructing knowledge by sharing and negotiating meaning for problem-solving activities, and this cannot be accomplished without governing the self-regulatory processes of students. This study employed a Web-based tool, Google Docs, to determine the effects of Web-based collaboration on…
Exploring Fish Diversity as a Determinant of Ecosystem Properties in Aquatic Food Webs
ERIC Educational Resources Information Center
Carey, Michael P.
2009-01-01
Dramatic biodiversity changes occurring globally from species loss and invasion have altered native food webs and ecosystem processes. My research objectives are to understand the consequences of fish diversity to freshwater systems by (1) examining the food web consequences of multiple top predators, (2) determining how biodiversity influences…
Web-Based Learning Environment: A Theory-Based Design Process for Development and Evaluation
ERIC Educational Resources Information Center
Nam, Chang S.; Smith-Jackson, Tonya L.
2007-01-01
Web-based courses and programs have increasingly been developed by many academic institutions, organizations, and companies worldwide due to their benefits for both learners and educators. However, many of the developmental approaches lack two important considerations needed for implementing Web-based learning applications: (1) integration of the…
Weaving Web 2.0 and the Writing Process with Feminist Pedagogy
ERIC Educational Resources Information Center
Zhao, Ruijie
2010-01-01
This dissertation, as a theoretical study, focused on how Web 2.0 technology potentially helps students gain power, knowledge, and agency in the networked learning environment and how feminist pedagogy conceivably facilitates the implementation of Web 2.0 technology to produce an opportune learning environment. Primarily, this study used feminist…
ERIC Educational Resources Information Center
Gerjets, Peter; Kammerer, Yvonne; Werner, Benita
2011-01-01
Web searching for complex information requires appropriately evaluating diverse sources of information. Information science studies have identified different criteria applied by searchers to evaluate Web information. However, the explicit evaluation instructions used in these studies might have resulted in a distortion of spontaneous evaluation…
ERIC Educational Resources Information Center
Olaniran, Bolanle A.
2010-01-01
The semantic web describes the process whereby information content is made available for machine consumption. With increased reliance on information communication technologies, the semantic web promises effective and efficient information acquisition and dissemination of products and services in the global economy, in particular, e-learning.…
Server-Side Includes Made Simple.
ERIC Educational Resources Information Center
Fagan, Jody Condit
2002-01-01
Describes server-side include (SSI) codes which allow Webmasters to insert content into Web pages without programming knowledge. Explains how to enable the codes on a Web server, provides a step-by-step process for implementing them, discusses tags and syntax errors, and includes examples of their use on the Web site for Southern Illinois…
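The mechanism behind SSI can be illustrated in a few lines: the server scans the page for include directives and splices in file contents before delivery. The file map below is invented, and in practice the substitution is performed by the web server itself (e.g. Apache's include module), not by application code:

```python
import re

# Stand-in for files on the web server (hypothetical names and content).
FILES = {"header.html": "<h1>Library Site</h1>"}

def expand_ssi(page, files=FILES):
    """Replace <!--#include virtual="..." --> directives with file content,
    mimicking what a server-side include does before the page is sent."""
    pattern = r'<!--#include virtual="([^"]+)" -->'
    return re.sub(pattern, lambda m: files.get(m.group(1), ""), page)

html = expand_ssi('<!--#include virtual="header.html" --><p>Welcome</p>')
# html == "<h1>Library Site</h1><p>Welcome</p>"
```

Because the substitution happens server-side, a Webmaster can update one header file and have every page that includes it change at once.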
Survey Says? A Primer on Web-based Survey Design and Distribution
Oppenheimer, Adam J.; Pannucci, Christopher J.; Kasten, Steven J.; Haase, Steven C.
2011-01-01
The internet has changed the way in which we gather and interpret information. While books were once the exclusive bearers of data, knowledge is now only a keystroke away. The internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over one hundred medical publications have been based on web-based survey data alone. Due to emerging internet technologies, web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, web-based survey techniques are not without their limitations, namely recall and response biases. When used properly, however, web-based surveys can greatly simplify the research process. This article discusses the implications of web-based surveys and provides guidelines for their effective design and distribution. PMID:21701347
GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.
Liang, Steve H L; Huang, Chih-Yuan
2013-10-02
The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstrations of three real-world sensor web applications powered by GeoCENS, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.
rasdaman Array Database: current status
NASA Astrophysics Data System (ADS)
Merticariu, George; Toader, Alexandru
2015-04-01
rasdaman (Raster Data Manager) is a free open source array database management system that provides functionality for storing and processing massive amounts of raster data in the form of multidimensional arrays. The user can access, process and delete the data using SQL. The key features of rasdaman are: flexibility (datasets of any dimensionality can be processed with the help of SQL queries), scalability (rasdaman's distributed architecture enables it to run seamlessly on cloud infrastructures, with performance increasing as computational resources are added), performance (real-time access, processing, mixing and filtering of arrays of any dimensionality) and reliability (the legacy communication protocol has been replaced with a new one based on Google Protocol Buffers and ZeroMQ). The system handles data such as 1D time series, 2D remote sensing imagery, 3D image time series, 3D geophysical data, and 4D atmospheric and climate data. Most of these representations cannot be stored as raw arrays alone, because location information is also needed to geoposition the contents correctly on Earth; ISO 19123 defines such georeferenced arrays as coverage data. rasdaman provides coverage data support through the Petascope service. Extensions were added on top of rasdaman in order to support the Geoscience community. The following OGC standards are currently supported: Web Map Service (WMS), Web Coverage Service (WCS), and Web Coverage Processing Service (WCPS). The Web Map Service is an extension that provides zoom and pan navigation over images provided by a map server. Starting with version 9.1, rasdaman supports WMS version 1.3. The Web Coverage Service provides capabilities for downloading multi-dimensional coverage data.
Support is also provided for several extensions of this service: the Subsetting Extension, the Scaling Extension, and, starting with version 9.1, the Transaction Extension, which defines request types for inserting, updating and deleting coverages. A web client, designed for both novice and experienced users, is also available for the service and its extensions. The client offers an intuitive interface that allows users to work with multi-dimensional coverages by abstracting away the specifics of the standard request definitions. The Web Coverage Processing Service defines a language for on-the-fly processing and filtering of multi-dimensional raster coverages; rasdaman exposes this service through the WCS processing extension. Demonstrations are provided online via the Earthlook website (earthlook.org), which presents use cases from a wide variety of application domains using the rasdaman system as the processing engine.
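As a minimal illustration of the kind of request a WCS 2.0.1 endpoint accepts, the sketch below builds a GetCoverage URL in the standard KVP encoding with trim subsets per axis. The server URL and coverage name are hypothetical placeholders, not an actual rasdaman deployment:

```python
from urllib.parse import urlencode

def getcoverage_url(base, coverage_id, subsets, fmt="application/netcdf"):
    """Build an OGC WCS 2.0.1 GetCoverage request as a KVP URL.

    `subsets` maps axis names to (low, high) trim intervals, encoded as
    repeated SUBSET parameters per the WCS KVP protocol binding.
    """
    params = [
        ("SERVICE", "WCS"),
        ("VERSION", "2.0.1"),
        ("REQUEST", "GetCoverage"),
        ("COVERAGEID", coverage_id),
        ("FORMAT", fmt),
    ]
    for axis, (lo, hi) in subsets.items():
        params.append(("SUBSET", f"{axis}({lo},{hi})"))
    return base + "?" + urlencode(params)

# Hypothetical endpoint and coverage name, for illustration only.
url = getcoverage_url(
    "http://example.org/rasdaman/ows",
    "AvgTemperature",
    {"Lat": (40, 50), "Long": (0, 10)},
)
```

Sending such a URL to a WCS server would return the trimmed coverage in the requested encoding (here NetCDF); WCPS queries follow the same service endpoint but carry a processing expression instead of a plain coverage identifier.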
MAGMA: analysis of two-channel microarrays made easy.
Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph
2007-07-01
The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R scripts that document MAGMA's entire data processing steps, thereby allowing users to regenerate all results in their local R installation. The implementation of MAGMA follows the model-view-controller design pattern that strictly separates the R-based statistical data processing, the web representation and the application logic. This modular design makes the application flexible and easily extendible by experts in one of the fields: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.
Semantic orchestration of image processing services for environmental analysis
NASA Astrophysics Data System (ADS)
Ranisavljević, Élisabeth; Devin, Florent; Laffly, Dominique; Le Nir, Yannick
2013-09-01
In order to analyze environmental dynamics, a major process is the classification of the different phenomena of a site (e.g. ice and snow for a glacier). When using in situ pictures, this classification requires data pre-processing, and not all pictures need the same sequence of processes, depending on the disturbances. Until now, these sequences have been composed manually, which restricts the processing of large amounts of data. In this paper, we present how to realize a semantic orchestration that automates the sequencing for the analysis. It combines two advantages: it copes with the volume of data to process, and it diversifies the possibilities in the data processing. We define a BPEL description to express the sequences; this BPEL invokes web services to run the data processing. Each web service is semantically annotated using an ontology of image processing, and the BPEL is dynamically modified using SPARQL queries on these annotated web services. The results obtained by a prototype implementing this method validate the construction of the different workflows, which can be applied to a large number of pictures.
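The core idea, selecting a processing sequence by matching detected disturbances against semantic annotations on services, can be sketched as follows. The paper queries an OWL ontology via SPARQL; here plain dictionaries stand in for the ontology, and all service and disturbance names are hypothetical:

```python
# Each web service is annotated with the image disturbance it corrects
# (a stand-in for the OWL annotations queried via SPARQL in the paper).
SERVICE_ANNOTATIONS = {
    "shadow_removal_ws": "shadow",
    "deblur_ws": "blur",
    "glare_correction_ws": "glare",
}

def build_workflow(detected_disturbances):
    """Return the ordered list of services to chain (BPEL-style) for the
    disturbances found in one picture; unknown disturbances are skipped."""
    by_concept = {v: k for k, v in SERVICE_ANNOTATIONS.items()}
    return [by_concept[d] for d in detected_disturbances if d in by_concept]

# A picture flagged with glare then blur yields a two-step workflow.
workflow = build_workflow(["glare", "blur"])
```

In the actual system this selection step rewrites the BPEL document so the orchestration engine invokes the chosen services in order; the sketch only captures the matching logic.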
A Role for Semantic Web Technologies in Patient Record Data Collection
NASA Astrophysics Data System (ADS)
Ogbuji, Chimezie
Business Process Management Systems (BPMS) are a component of the stack of Web standards that comprise Service Oriented Architecture (SOA). Such systems are representative of the architectural framework of modern information systems built in an enterprise intranet and are in contrast to systems built for deployment on the larger World Wide Web. The REST architectural style is an emerging style for building loosely coupled systems based purely on the native HTTP protocol. It is a coordinated set of architectural constraints whose goals are to minimize latency, maximize the independence and scalability of distributed components, and facilitate the use of intermediary processors. Within the development community for distributed, Web-based systems, there has been a debate regarding the merits of both approaches. In some cases, there are legitimate concerns about the differences in both architectural styles. In other cases, the contention seems to be based on concerns that are marginal at best. In this chapter, we will attempt to contribute to this debate by focusing on a specific, deployed use case that emphasizes the role of the Semantic Web, a simple Web application architecture that leverages the use of declarative XML processing, and the needs of a workflow system. The use case involves orchestrating a work process associated with the data entry of structured patient record content into a research registry at the Cleveland Clinic's Clinical Investigation department in the Heart and Vascular Institute.
Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P
2008-11-30
In the framework of a cooperative EU research project (MILQ-QC-TOOL), a web-based modelling tool (WebSim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool, it was applied to the optimisation of thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for optimisation of a cheese milk pasteurisation process, where we increased the cheese yield (one extra cheese for every 100 produced from the same amount of milk) and reduced the risk of contaminating pasteurised cheese milk with thermoresistant streptococci from critical to negligible. In another case we demonstrate the advantage of changing from an indirect to a direct heating method for a UHT process, resulting in 80% less fouling while improving product quality and maintaining product safety.
NASA Astrophysics Data System (ADS)
Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi
2017-03-01
This study aims to develop a web-based portfolio model. The model developed in this study could reveal the effectiveness of the new model in experiments conducted with research respondents in the department of curriculum and educational technology, FIP Unnes. In particular, the further research objectives to be achieved through this development research are: (1) describing the process of implementing a portfolio in a web-based model; (2) assessing the effectiveness of the web-based portfolio model for the final task, especially in Web-Based Learning courses. This type of research is development research; Borg and Gall (2008, p. 589) state that "educational research and development (R & D) is a process used to develop and validate educational production". The series of research and development activities started with exploration and conceptual studies, followed by testing and evaluation, and then implementation. For the data analysis, simple descriptive analysis and analysis of learning mastery were used, followed by prerequisite tests for normality and homogeneity before performing a t-test. Based on the data analysis, it was concluded that: (1) a web-based portfolio model can be applied to the learning process in higher education; (2) regarding the effectiveness of the web-based portfolio model, with field data from the respondents of the large-group (field) trial, the number of respondents who reached learning mastery (a score of 60 and above) was 24 (92.3%), indicating that the web-based portfolio model is effective. The conclusion of this study is that a web-based portfolio model is effective. As an implication of this development research, future researchers are expected to use the development guidelines from this study to develop the model for other subjects.
Bringing Control System User Interfaces to the Web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xihui; Kasemir, Kay
With the evolution of web based technologies, especially HTML5 [1], it becomes possible to create web-based control system user interfaces (UI) that are cross-browser and cross-device compatible. This article describes two technologies that facilitate this goal. The first one is the WebOPI [2], which can seamlessly display CSS BOY [3] Operator Interfaces (OPI) in web browsers without modification to the original OPI file. The WebOPI leverages the powerful graphical editing capabilities of BOY and provides the convenience of re-using existing OPI files. On the other hand, it uses generic JavaScript and a generic communication mechanism between the web browser and web server. It is not optimized for a control system, which results in unnecessary network traffic and resource usage. Our second technology is the WebSocket-based Process Data Access (WebPDA) [4]. It is a protocol that provides efficient control system data communication using WebSocket [5], so that users can create web-based control system UIs using standard web page technologies such as HTML, CSS and JavaScript. WebPDA is control system independent, potentially supporting any type of control system.
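The pattern WebPDA enables, a client subscribing once over a WebSocket and then receiving pushed value updates, can be illustrated with a small JSON message round trip. The field names below are hypothetical stand-ins for a subscription protocol, not the actual WebPDA message schema:

```python
import json

# Hypothetical subscribe/update messages illustrating the general
# publish-subscribe pattern of WebSocket-based process data access.
def make_subscribe_msg(pv_name):
    """Message a client would send once to start receiving updates."""
    return json.dumps({"type": "subscribe", "pv": pv_name})

def handle_update(raw):
    """Render a pushed value update for display in a web UI."""
    msg = json.loads(raw)
    return f'{msg["pv"]} = {msg["value"]}'

sub = make_subscribe_msg("SR:Current")
line = handle_update('{"pv": "SR:Current", "value": 249.7}')
```

The point of such a protocol is that after the single subscribe message, the server pushes changes as they occur, avoiding the polling traffic that a generic browser-to-server mechanism incurs.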
Using JavaScript and the FDSN web service to create an interactive earthquake information system
NASA Astrophysics Data System (ADS)
Fischer, Kasper D.
2015-04-01
The FDSN web service provides a web interface for accessing earthquake metadata (e.g. event or station information) and waveform data over the internet. Requests are sent to a server as URLs and the output is either XML or miniSEED, which makes it hard for humans to read but easy to process with software. Several data centers already support the FDSN web service, e.g. USGS, IRIS and ORFEUS. The FDSN web service is also part of the SeisComP3 (http://www.seiscomp3.org) software. The Seismological Observatory of the Ruhr-University switched to SeisComP3 as its standard software for the analysis of mining-induced earthquakes at the beginning of 2014. This made it necessary to create a new web-based earthquake information service for publishing results to the general public. This has been done by processing the output of an FDSN web service query with JavaScript running in a standard browser. The result is a single web page presenting the observed events and further event and station information in a table and on an interactive map. In addition, the user can download event information, waveform data and station data in formats such as miniSEED, QuakeML or FDSN StationXML. The developed code and all used libraries are open source and freely available.
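A query of the kind the page issues can be sketched as follows: build an fdsnws-event URL from keyword parameters, then parse the pipe-separated `text` output format. The base URL is a placeholder (data centers expose the same `/fdsnws/event/1/query` path), and the sample row is invented for illustration:

```python
from urllib.parse import urlencode

# Build an FDSN event query URL (placeholder host; USGS, IRIS and
# ORFEUS serve the same /fdsnws/event/1/query interface).
params = {"starttime": "2014-01-01", "minmagnitude": 2.0, "format": "text"}
url = "http://example.org/fdsnws/event/1/query?" + urlencode(params)

def parse_event_line(line):
    """Parse one row of fdsnws 'text' output (pipe-separated columns:
    EventID|Time|Latitude|Longitude|Depth/km|...|MagType|Magnitude|...)."""
    f = line.split("|")
    return {"id": f[0], "time": f[1], "lat": float(f[2]),
            "lon": float(f[3]), "depth_km": float(f[4]),
            "mag_type": f[9], "mag": float(f[10])}

# Invented sample row in the fdsnws text layout, for illustration only.
sample = ("rub2014abcd|2014-03-05T12:00:00|51.45|7.26|1.0|"
          "RUB|RUB|RUB|1|ML|1.8|RUB|Ruhr area")
event = parse_event_line(sample)
```

In the browser the same request would be made with JavaScript (e.g. `fetch`) and the parsed events rendered into the table and map; the parsing logic is identical.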
A Web simulation of medical image reconstruction and processing as an educational tool.
Papamichail, Dimitrios; Pantelis, Evaggelos; Papagiannis, Panagiotis; Karaiskos, Pantelis; Georgiou, Evangelos
2015-02-01
Web educational resources integrating interactive simulation tools provide students with an in-depth understanding of the medical imaging process. The aim of this work was the development of a purely Web-based, open access, interactive application, as an ancillary learning tool in graduate and postgraduate medical imaging education, including a systematic evaluation of learning effectiveness. The pedagogic content of the educational Web portal was designed to cover the basic concepts of medical imaging reconstruction and processing, through the use of active learning and motivation, including learning simulations that closely resemble actual tomographic imaging systems. The user can implement image reconstruction and processing algorithms under a single user interface and manipulate various factors to understand the impact on image appearance. A questionnaire for pre- and post-training self-assessment was developed and integrated in the online application. The developed Web-based educational application introduces the trainee to the basic concepts of imaging through textual and graphical information and proceeds with a learning-by-doing approach. Trainees are encouraged to participate in a pre- and post-training questionnaire to assess their knowledge gain. Initial feedback from a group of graduate medical students showed that the developed course was considered effective and well structured. An e-learning application on medical imaging integrating interactive simulation tools was developed and assessed in our institution.
An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi
NASA Astrophysics Data System (ADS)
Deng, D.-P.; Lemmens, R.
2011-08-01
The Web is changing the way people share and communicate information with the emergence of various Web technologies that enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information, but because of its different production methods, UGGC often does not fit formal geographic information models: there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. The process comprises five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study applies the process to Twitter messages relevant to the Japan earthquake disaster. Using this process, we extract disaster relief information from Twitter messages and develop a knowledge base for GeoSPARQL queries on disaster relief information.
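The Extraction, Formalization and Mapping steps can be sketched in miniature: keywords found in a tweet are mapped to terms of a small disaster-relief vocabulary and emitted as RDF triples in N-Triples syntax. The namespace, term names and keyword list are hypothetical, not the ontology used in the study:

```python
# Hypothetical vocabulary mapping surface keywords to ontology concepts
# (the study's real ontology and GeoSPARQL layer are far richer).
EX = "http://example.org/relief#"
KEYWORD_TO_CONCEPT = {"water": "WaterSupply", "shelter": "Shelter",
                      "medicine": "MedicalAid"}

def tweet_to_triples(tweet_id, text):
    """Extract relief needs from tweet text and formalize them as
    N-Triples statements linking the tweet to ontology concepts."""
    triples = []
    for word, concept in KEYWORD_TO_CONCEPT.items():
        if word in text.lower():
            triples.append(
                f"<{EX}tweet/{tweet_id}> <{EX}requests> <{EX}{concept}> .")
    return triples

triples = tweet_to_triples("42", "Need water and shelter near the coast")
```

Once such triples are loaded into a triple store (the Deployment step), standard SPARQL or GeoSPARQL queries can retrieve, for example, all tweets requesting water within a given region.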
Kwak, Dae Hyun; Lee, Eun Ju; Kim, Deug Joong
2014-11-01
Hydroxyapatite/cellulose acetate composite webs were fabricated by an electrospinning process, which makes it possible to fabricate complex three-dimensional shapes. A nanofibrous web consisting of cellulose acetate and hydroxyapatite was produced from their mixture solution by electrospinning under high voltage. The surface of the electrospun fiber was modified by a plasma and alkaline solution in order to increase its bioactivity. The structure, morphology and properties of the electrospun fibers were investigated, and in-vitro bioactivity was evaluated in simulated body fluid (SBF). Bioactivity of the electrospun web was enhanced with increasing filler concentration and by surface treatment. The surface changes of electrospun fibers modified by plasma and alkaline solution were investigated by FT-IR (Fourier Transform Infrared Spectroscopy) and XPS (X-ray Photoelectron Spectroscopy).
Toward a More Flexible Web-Based Framework for Multidisciplinary Design
NASA Technical Reports Server (NTRS)
Rogers, J. L.; Salas, A. O.
1999-01-01
In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary design, is defined as a hardware-software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, monitoring, controlling, and displaying the design process. The objective of this research is to explore how Web technology can improve these areas of weakness and lead toward a more flexible framework. This article describes a Web-based system that optimizes and controls the execution sequence of design processes in addition to monitoring the project status and displaying the design results.
Integrating UIMA annotators in a web-based text processing framework.
Chen, Xiang; Arnold, Corey W
2013-01-01
The Unstructured Information Management Architecture (UIMA) [1] framework is a growing platform for natural language processing (NLP) applications. However, such applications may be difficult for non-technical users to deploy. This project presents a web-based framework that wraps UIMA-based annotator systems in a graphical user interface for researchers and clinicians, and in a web service for developers. An annotator that extracts data elements from lung cancer radiology reports is presented to illustrate the use of the system. Annotation results from the web system can be exported to multiple formats for users to utilize in other aspects of their research and workflow. This project demonstrates the benefits of a lay-user interface for complex NLP applications. Efforts such as this can lead to increased interest and support for NLP work in the clinical domain.
Harker, Laura; Bamps, Yvan; Flemming, Shauna St. Clair; Perryman, Jennie P; Thompson, Nancy J; Patzer, Rachel E; Williams, Nancy S DeSousa; Arriola, Kimberly R Jacob
2017-01-01
Background The lack of available organs is often considered to be the single greatest problem in transplantation today. Internet use is at an all-time high, creating an opportunity to increase public commitment to organ donation through the broad reach of Web-based behavioral interventions. Implementing Internet interventions, however, presents challenges including preventing fraudulent respondents and ensuring intervention uptake. Although Web-based organ donation interventions have increased in recent years, process evaluation models appropriate for Web-based interventions are lacking. Objective The aim of this study was to describe a refined process evaluation model adapted for Web-based settings and used to assess the implementation of a Web-based intervention aimed at increasing organ donation among African Americans. Methods We used a randomized pretest-posttest control design to assess the effectiveness of the intervention website, which addressed barriers to organ donation through corresponding videos. Eligible participants were African American adult residents of Georgia who were not registered on the state donor registry. Drawing from previously developed process evaluation constructs, we adapted reach (the extent to which individuals were found eligible and participated in the study), recruitment (online recruitment mechanism), dose received (intervention uptake), and context (how the Web-based setting influenced study implementation) for Internet settings and used the adapted model to assess the implementation of our Web-based intervention. Results With regard to reach, 1415 individuals completed the eligibility screener; 948 (67.0%) were determined eligible, of whom 918 (96.8%) completed the study. After eliminating duplicate entries (n=17), those who did not initiate the posttest (n=21) and those with an invalid ZIP code (n=108), 772 valid entries remained.
Per the Internet protocol (IP) address analysis, only 23 of the 772 valid entries (3.0%) were within Georgia, and only 17 of those were unique entries that could be considered for analyses. With respect to recruitment, 517 of the 772 valid entries (67.0%) came from participants recruited via a Web recruiter. Regarding dose received, no videos from the intervention website were watched in their entirety, and the average viewing duration was 17 seconds over the minimum. Context analysis provided valuable insights into factors in the Internet environment that may have affected study implementation. Although only active for a brief period of time, the Craigslist website advertisement may have contributed the largest volume of fraudulent responses. Conclusions We determined fraud and low uptake to be serious threats to this study and further confirmed the importance of conducting a process evaluation to identify such threats. We suggest checking participants' IP addresses before study initiation, selecting software that allows for automatic duplicate protection, and tightening minimum requirements for intervention uptake. Further research is needed to understand how process evaluation models can be used to monitor implementation of Web-based studies. PMID:29191799
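The entry-validation steps described above, dropping duplicate submissions and entries with out-of-state ZIP codes, can be sketched as a simple filter. The field names, the deduplication key (IP address), and the Georgia ZIP ranges used here (30000-31999 and 39800-39999) are assumptions for illustration, not the study's exact rules:

```python
def is_georgia_zip(zip_code):
    """Rough validity check: five digits within the commonly cited
    Georgia ZIP ranges (an assumption for this sketch)."""
    if not (zip_code.isdigit() and len(zip_code) == 5):
        return False
    z = int(zip_code)
    return 30000 <= z <= 31999 or 39800 <= z <= 39999

def validate_entries(entries):
    """Keep the first submission per IP address, then drop entries
    whose ZIP code is not a plausible Georgia ZIP."""
    seen_ips, valid = set(), []
    for e in entries:
        if e["ip"] in seen_ips:
            continue  # duplicate submission
        seen_ips.add(e["ip"])
        if is_georgia_zip(e["zip"]):
            valid.append(e)
    return valid

entries = [{"ip": "1.2.3.4", "zip": "30303"},
           {"ip": "1.2.3.4", "zip": "30303"},   # duplicate
           {"ip": "5.6.7.8", "zip": "99999"}]   # out-of-state ZIP
valid = validate_entries(entries)
```

As the authors note, automating such checks before study initiation (rather than cleaning data afterwards) is one practical defense against fraudulent Web responses.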
Graphic Novels, Web Comics, and Creator Blogs: Examining Product and Process
ERIC Educational Resources Information Center
Carter, James Bucky
2011-01-01
Young adult literature (YAL) of the late 20th and early 21st century is exploring hybrid forms with growing regularity by embracing textual conventions from sequential art, video games, film, and more. As well, Web-based technologies have given those who consume YAL more immediate access to authors, their metacognitive creative processes, and…
Developing and Evaluating the GriefLink Web Site: Processes, Protocols, Dilemmas and Lessons Learned
ERIC Educational Resources Information Center
Clark, Sheila; Burgess, Teresa; Laven, Gillian; Bull, Michael; Marker, Julie; Browne, Eric
2004-01-01
Despite a profusion of recommendations regarding the quality of web sites and guidelines related to ethical issues surrounding health-related sites, there is little guidance for the design and evaluation of sites relating to loss and grief. This article, which addresses these deficiencies, results from a community consultation process of designing…
WaveNet: A Web-Based Metocean Data Access, Processing and Analysis Tool; Part 5 - WW3 Database
2015-02-01
Program (CDIP); and Part 4 for the Great Lakes Observing System/Coastal Forecasting System (GLOS/GLCFS). Using step-by-step instructions, this Part 5... Demirbilek, Z., L. Lin, and D. Wilson. 2014a. WaveNet: A web-based metocean data access, processing, and analysis tool; Part 3 - CDIP database.
Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry
ERIC Educational Resources Information Center
Sun, Daner; Looi, Chee-Kit
2013-01-01
The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…
Web-based platform for collaborative medical imaging research
NASA Astrophysics Data System (ADS)
Rittner, Leticia; Bento, Mariana P.; Costa, André L.; Souza, Roberto M.; Machado, Rubens C.; Lotufo, Roberto A.
2015-03-01
Medical imaging research depends basically on the availability of large image collections, image processing and analysis algorithms, hardware and a multidisciplinary research team. It has to be reproducible, free of errors, fast, accessible through a large variety of devices spread around research centers and conducted simultaneously by a multidisciplinary team. Therefore, we propose a collaborative research environment, named Adessowiki, where tools and datasets are integrated and readily available in the Internet through a web browser. Moreover, processing history and all intermediate results are stored and displayed in automatic generated web pages for each object in the research project or clinical study. It requires no installation or configuration from the client side and offers centralized tools and specialized hardware resources, since processing takes place in the cloud.
Customer Decision Making in Web Services with an Integrated P6 Model
NASA Astrophysics Data System (ADS)
Sun, Zhaohao; Sun, Junqing; Meredith, Grant
Customer decision making (CDM) is an indispensable factor for web services. This article examines CDM in web services with a novel P6 model, which consists of six Ps: privacy, perception, propensity, preference, personalization and promised experience. This model integrates the existing 6 P elements of the marketing mix as the system environment of CDM in web services. The integrated P6 model addresses the inner world of the customer and incorporates what customers think during the decision-making process. The proposed approach will facilitate the research and development of web services and decision support systems.
ERIC Educational Resources Information Center
McCracken, Holly
2009-01-01
The importance of the interconnectedness of academic, student, and technical support processes intrinsic to the provision of on-line instruction has been frequently depicted as a "service Web," with students at the center of the infrastructure. However, as programming to support distance learning continues to develop, such service Webs have grown…
Speaking the Same Language: Information College Seekers Look for on a College Web Site
ERIC Educational Resources Information Center
Tucciarone, Krista M.
2009-01-01
The purpose of this qualitative study is to analyze and understand what information students seek from a college's Web site during their college search. Often, college Web sites fail either to offer students an interactive dialogue or to involve them in the communicative process, negatively affecting students' college search. Undergraduate…
Collaborative Writing among Second Language Learners in Academic Web-Based Projects
ERIC Educational Resources Information Center
Kessler, Greg; Bikowski, Dawn; Boggs, Jordan
2012-01-01
This study investigates Web-based, project oriented, many-to-many collaborative writing for academic purposes. Thirty-eight Fulbright scholars in an orientation program at a large Midwestern university used a Web-based word processing tool to collaboratively plan and report on a research project. The purpose of this study is to explore and…
Faculty Recommendations for Web Tools: Implications for Course Management Systems
ERIC Educational Resources Information Center
Oliver, Kevin; Moore, John
2008-01-01
A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…
Evaluations of Students on Facebook as an Educational Environment
ERIC Educational Resources Information Center
Coklar, Ahmet Naci
2012-01-01
Taking cognizance of the transformation experienced in education technologies, the concept that comes into prominence in integration of ICTs to education process at present is web 2.0. The main philosophy of web 2.0 technologies is its contribution to content formation of users and high-level interaction between users. One of web 2.0 technologies…
User-Centered Design and Usability Testing of a Web Site: An Illustrative Case Study.
ERIC Educational Resources Information Center
Corry, Michael D.; Frick, Theodore W.; Hansen, Lisa
1997-01-01
Presents an overview of user-centered design and usability testing. Describes a Web site evaluation project at a university, the iterative process of rapid prototyping and usability testing, and how the findings helped to improve the design. Discusses recommendations for university Web site design and reflects on problems faced in usability…
The Effect of Web-Based Portfolio Use on Academic Achievement and Retention
ERIC Educational Resources Information Center
Guzeller, Cem Oktay
2012-01-01
The web-based portfolio emerged as a result of the influence of technological developments on educational practices. In this study, the effect of the web-based portfolio building process on academic achievement and retention is explored. For this purpose, a study platform known as a computer-assisted personal development portfolio was designed for…
Effectiveness of Learning Process Using "Web Technology" in the Distance Learning System
ERIC Educational Resources Information Center
Killedar, Manoj
2008-01-01
Web is a globally distributed, still highly personalized media for cost-effective delivery of multimedia information and services. Web is expected to have a strong impact on almost every aspect of how we learn. "Total Quality" is the totality of features, as perceived by the customers of the product or service. Totality of features…
Problem-Based Learning in Web-Based Science Classroom.
ERIC Educational Resources Information Center
Kim, Heeyoung; Chung, Ji-Sook; Kim, Younghoon
The purpose of this paper is to discuss how general problem-based learning (PBL) models and social-constructivist perspectives are applied to the design and development of a Web-based science program, which emphasizes inquiry-based learning for fifth grade students. The paper also deals with the general features and learning process of a Web-based…
ERIC Educational Resources Information Center
Levitt, Roberta; Piro, Joseph
2014-01-01
Technology integration and Information and Communication Technology (ICT)-based education have enhanced the teaching and learning process by introducing a range of web-based instructional resources for classroom practitioners to deepen and extend instruction. One of the most durable of these resources has been the WebQuest. Introduced around the…
The Effectiveness of Web-Based Learning Environment: A Case Study of Public Universities in Kenya
ERIC Educational Resources Information Center
Kirui, Paul A.; Mutai, Sheila J.
2010-01-01
Web mining is emerging in many aspects of e-learning, aiming at improving online learning and teaching processes and making them more transparent and effective. Researchers using Web mining tools and techniques are challenged to learn more about online students, to reshape online courses and educational websites, and to create tools for…
ERIC Educational Resources Information Center
Young, Shelley Shwu-Ching; Huang, Yi-Long; Jang, Jyh-Shing Roger
2000-01-01
Describes the development and implementation process of a Web-based science museum in Taiwan. Topics include use of the Internet; lifelong distance learning; museums and the Internet; objectives of the science museum; funding; categories of exhibitions; analysis of Web users; homepage characteristics; graphics and the effect on speed; and future…
A midas plugin to enable construction of reproducible web-based image processing pipelines
Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A.; Oguz, Ipek
2013-01-01
Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline. PMID:24416016
A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don
2011-01-01
A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products; and (4) enables automated delivery of the data products to the user's desktop. A recent addition to the SensorWeb Toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service along with the user interface follows the Open Geospatial Consortium (OGC) standard called Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.
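WCPS defines a declarative query language for processing coverages over the web. As a minimal sketch of how a client might submit such a query over the key-value-pair HTTP binding: the endpoint URL and coverage name below are hypothetical, and only the request structure follows the OGC WCS/WCPS conventions.

```python
from urllib.parse import urlencode

def build_wcps_request(endpoint, wcps_query):
    """Build a GET URL for a WCPS ProcessCoverages request (KVP binding)."""
    params = {
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": wcps_query,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical coverage name; averages one band and encodes the result as CSV.
query = 'for c in (EO1_SCENE) return encode(avg(c.band1), "text/csv")'
url = build_wcps_request("http://example.org/wcps", query)
```

Because the query itself is just a URL parameter, new processing algorithms can be loaded and executed server-side without redeploying client software.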
Manufacturing process and material selection in concurrent collaborative design of MEMS devices
NASA Astrophysics Data System (ADS)
Zha, Xuan F.; Du, H.
2003-09-01
In this paper we present a knowledge-intensive approach and system for selecting suitable manufacturing processes and materials for microelectromechanical systems (MEMS) devices in a concurrent collaborative design environment. In the paper, fundamental issues in MEMS manufacturing process and material selection, such as the concurrent design framework, manufacturing process and material hierarchies, and selection strategy, are first addressed. Then, a fuzzy decision support scheme for a multi-criteria decision-making problem is proposed for estimating, ranking and selecting possible manufacturing processes, materials and their combinations. A Web-based prototype advisory system for MEMS manufacturing process and material selection, WebMEMS-MASS, is developed based on the client-knowledge server architecture and framework to help the designer find good processes and materials for MEMS devices. The system, as one of the important parts of an advanced simulation and modeling tool for MEMS design, is a concept-level process and material selection tool, which can be used as a standalone application or a Java applet via the Web. The running sessions of the system are inter-linked with webpages of tutorials and reference pages to explain the facets, fabrication processes and material choices; calculations and reasoning in selection are performed using process capability and material property data from a remote Web-based database and interactive knowledge base that can be maintained and updated via the Internet. The use of the developed system, including operation scenario, user support, and integration with a MEMS collaborative design system, is presented. Finally, an illustrative example is provided.
2018-01-01
Background: Structural and functional brain images are essential imaging modalities for medical experts to study brain anatomy. These images are typically visually inspected by experts. To analyze images without any bias, they must be first converted to numeric values. Many software packages are available to process the images, but they are complex and difficult to use. The software packages are also hardware intensive. The results obtained after processing vary depending on the native operating system used and its associated software libraries; data processed in one system cannot typically be combined with data on another system. Objective: The aim of this study was to fulfill the neuroimaging community’s need for a common platform to store, process, explore, and visualize their neuroimaging data and results using Neuroimaging Web Services Interface: a series of processing pipelines designed as a cyber physical system for neuroimaging and clinical data in brain research. Methods: Neuroimaging Web Services Interface accepts magnetic resonance imaging, positron emission tomography, diffusion tensor imaging, and functional magnetic resonance imaging. These images are processed using existing and custom software packages. The output is then stored as image files, tabulated files, and MySQL tables. The system, made up of a series of interconnected servers, is password-protected and is securely accessible through a Web interface and allows (1) visualization of results and (2) downloading of tabulated data. Results: All results were obtained using our processing servers in order to maintain data validity and consistency. The design is responsive and scalable. The processing pipeline starts with a FreeSurfer reconstruction of structural magnetic resonance images.
The FreeSurfer and regional standardized uptake value ratio calculations were validated using Alzheimer’s Disease Neuroimaging Initiative input images, and the results were posted at the Laboratory of Neuro Imaging data archive. Notable leading researchers in the field of Alzheimer’s Disease and epilepsy have used the interface to access and process the data and visualize the results. Tabulated results with unique visualization mechanisms help guide more informed diagnosis and expert rating, providing a truly unique multimodal imaging platform that combines magnetic resonance imaging, positron emission tomography, diffusion tensor imaging, and resting state functional magnetic resonance imaging. A quality control component was reinforced through expert visual rating involving at least 2 experts. Conclusions: To our knowledge, there is no validated Web-based system offering all the services that Neuroimaging Web Services Interface offers. The intent of Neuroimaging Web Services Interface is to create a tool for clinicians and researchers with keen interest on multimodal neuroimaging. More importantly, Neuroimaging Web Services Interface significantly augments the Alzheimer’s Disease Neuroimaging Initiative data, especially since our data contain a large cohort of Hispanic normal controls and Alzheimer’s Disease patients. The obtained results could be scrutinized visually or through the tabulated forms, informing researchers on subtle changes that characterize the different stages of the disease. PMID:29699962
The effect of tooling design parameters on web-warping in the flexible roll forming of UHSS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiao, Jingsi; Weiss, Matthias; Rolfe, Bernard
To reduce weight and improve passenger safety there is an increased need in the automotive industry to use Ultra High Strength Steels (UHSS) for structural and crash components. However, the application of UHSS is restricted by their limited formability and the difficulty of forming them in conventional processes. An alternative method of manufacturing structural auto body parts from UHSS is the flexible roll forming process, which can accommodate materials with high strength and limited ductility in the production of complex and weight-optimised components. However, one major concern in flexible roll forming is web-warping, which is the height deviation of the profile web area. This paper investigates, using a numerical model, the effect of various forming methods on web-warping. The results demonstrate that different forming methods lead to different amounts of web-warping when forming a product with identical geometry.
Using the World Wide Web for GIDEP Problem Data Processing at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
McPherson, John W.; Haraway, Sandra W.; Whirley, J. Don
1999-01-01
Since April 1997, Marshall Space Flight Center has been using electronic transfer and the web to support our processing of the Government-Industry Data Exchange Program (GIDEP) and NASA ALERT information. Specific aspects include: (1) extraction of ASCII text information from GIDEP for loading into Word documents for e-mail to ALERT actionees; (2) downloading of GIDEP form images in Adobe Acrobat (.pdf) format for internal storage and display on the MSFC ALERT web page; (3) linkage of stored GIDEP problem forms with summary information for access from the MSFC ALERT Distribution Summary Chart or from an html table of released MSFC ALERTs; (4) archival of historic ALERTs for reference by GIDEP ID, MSFC ID, or MSFC release date; (5) on-line tracking of ALERT response status using a Microsoft Access database and the web; and (6) on-line response to ALERTs from MSFC actionees through interactive web forms. The technique, benefits, effort, coordination, and lessons learned for each aspect are covered herein.
Solar cells and modules from dendritic web silicon
NASA Technical Reports Server (NTRS)
Campbell, R. B.; Rohatgi, A.; Seman, E. J.; Davis, J. R.; Rai-Choudhury, P.; Gallagher, B. D.
1980-01-01
Some of the noteworthy features of the processes developed in the fabrication of solar cell modules are the handling of long lengths of web, the use of cost-effective dip coating of photoresist and antireflection coatings, selective electroplating of the grid pattern and ultrasonic bonding of the cell interconnect. Data on the cells are obtained by means of dark I-V analysis and deep level transient spectroscopy. A histogram of over 100 dendritic web solar cells fabricated in a number of runs using different web crystals shows an average efficiency of over 13%, with some efficiencies running above 15%. Lower cell efficiency is generally associated with low minority carrier lifetime due to recombination centers sometimes present in the bulk silicon. A cost analysis of the process sequence using a 25 MW production line indicates a selling price of $0.75/peak watt in 1986. It is concluded that the efficiency of dendritic web cells approaches that of float zone silicon cells, reduced somewhat by the lower bulk lifetime of the former.
Web service activities at the IRIS DMC to support federated and multidisciplinary access
NASA Astrophysics Data System (ADS)
Trabant, Chad; Ahern, Timothy K.
2013-04-01
At the IRIS Data Management Center (DMC) we have developed a suite of web service interfaces to access our large archive of, primarily seismological, time series data and related metadata. The goals of these web services include providing: a) next-generation and easily used access interfaces for our current users, b) access to data holdings in a form usable for non-seismologists, c) programmatic access to facilitate integration into data processing workflows and d) a foundation for participation in federated data discovery and access systems. To support our current users, our services provide access to the raw time series data and metadata or conversions of the raw data to commonly used formats. Our services also support simple, on-the-fly signal processing options that are common first steps in many workflows. Additionally, high-level data products derived from raw data are available via service interfaces. To support data access by researchers unfamiliar with seismic data we offer conversion of the data to broadly usable formats (e.g. ASCII text) and data processing to convert the data to Earth units. By their very nature, web services are programmatic interfaces. Combined with ubiquitous support for web technologies in programming & scripting languages and support in many computing environments, web services are very well suited for integrating data access into data processing workflows. As programmatic interfaces that can return data in both discipline-specific and broadly usable formats, our services are also well suited for participation in federated and brokered systems either specific to seismology or multidisciplinary. Working within the International Federation of Digital Seismograph Networks, the DMC collaborated on the specification of standardized web service interfaces for use at any seismological data center. 
These data access interfaces, when supported by multiple data centers, will form a foundation on which to build discovery and access mechanisms for data sets spanning multiple centers. To promote the adoption of these standardized services the DMC has developed portable implementations of the software needed to host these interfaces, minimizing the work required at each data center. Within the COOPEUS project framework, the DMC is working with EU partners to install web services implementations at multiple data centers in Europe.
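The standardized interfaces described above are plain HTTP GET endpoints, which is what makes them portable across data centers. As a sketch, a client request for raw time series data might be assembled as below; the endpoint path and parameter names follow the published FDSN web service conventions, while the network/station values are purely illustrative.

```python
from urllib.parse import urlencode

def fdsn_dataselect_url(base, net, sta, loc, cha, start, end):
    """Assemble an FDSN dataselect query URL for raw time series data."""
    params = {
        "net": net, "sta": sta, "loc": loc, "cha": cha,
        "start": start, "end": end,
    }
    return f"{base}/fdsnws/dataselect/1/query?{urlencode(params)}"

url = fdsn_dataselect_url(
    "http://service.iris.edu", "IU", "ANMO", "00", "BHZ",
    "2010-02-27T06:30:00", "2010-02-27T10:30:00",
)
```

Because the same URL pattern works at any center hosting the standardized service, a federated client only needs to vary the base address to span multiple archives.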
Integrating Thematic Web Portal Capabilities into the NASA Earthdata Web Infrastructure
NASA Technical Reports Server (NTRS)
Wong, Minnie; Baynes, Kathleen E.; Huang, Thomas; McLaughlin, Brett
2015-01-01
This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos and animations, an interactive tool to view and access sea level change data and a dashboard showing sea level change indicators.
Earth Science Mining Web Services
NASA Astrophysics Data System (ADS)
Pham, L. B.; Lynnes, C. S.; Hegde, M.; Graves, S.; Ramachandran, R.; Maskey, M.; Keiser, K.
2008-12-01
To allow scientists further capabilities in the area of data mining and web services, the Goddard Earth Sciences Data and Information Services Center (GES DISC) and researchers at the University of Alabama in Huntsville (UAH) have developed a system to mine data at the source without the need of network transfers. The system has been constructed by linking together several pre-existing technologies: the Simple Scalable Script-based Science Processor for Measurements (S4PM), a processing engine at the GES DISC; the Algorithm Development and Mining (ADaM) system, a data mining toolkit from UAH that can be configured in a variety of ways to create customized mining processes; ActiveBPEL, a workflow execution engine based on BPEL (Business Process Execution Language); XBaya, a graphical workflow composer; and the EOS Clearinghouse (ECHO). XBaya is used to construct an analysis workflow at UAH using ADaM components, which are also installed remotely at the GES DISC, wrapped as Web Services. The S4PM processing engine searches ECHO for data using space-time criteria, staging them to cache, allowing the ActiveBPEL engine to remotely orchestrate the processing workflow within S4PM. As mining is completed, the output is placed in an FTP holding area for the end user. The goals are to give users control over the data they want to process, while mining data at the data source using the server's resources rather than transferring the full volume over the internet. These diverse technologies have been infused into a functioning, distributed system with only minor changes to the underlying technologies. The key to this infusion is the loosely coupled, Web-Services-based architecture: all of the participating components are accessible (one way or another) through SOAP (Simple Object Access Protocol)-based Web Services.
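The loose coupling described above rests on SOAP message exchange between components. A generic sketch of wrapping an operation call in a SOAP 1.1 envelope is shown below; the operation name, namespace and parameters are hypothetical illustrations, not taken from the actual S4PM/ADaM service interfaces.

```python
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_envelope(operation, ns, params):
    """Wrap an operation call in a minimal SOAP 1.1 envelope string."""
    args = "".join(f"<{k}>{v}</{k}>" for k, v in params.items())
    return (
        f'<soap:Envelope xmlns:soap="{SOAP_NS}">'
        f"<soap:Body>"
        f'<m:{operation} xmlns:m="{ns}">{args}</m:{operation}>'
        f"</soap:Body></soap:Envelope>"
    )

# Hypothetical mining request carrying space-time search criteria.
msg = soap_envelope(
    "RunMiningWorkflow", "http://example.org/mining",
    {"bbox": "-90,-180,90,180", "start": "2008-01-01", "end": "2008-01-31"},
)
```

Since every component speaks the same envelope format, a BPEL engine can orchestrate them without knowing anything about their internal implementations.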
Earth Science Mining Web Services
NASA Technical Reports Server (NTRS)
Pham, Long; Lynnes, Christopher; Hegde, Mahabaleshwa; Graves, Sara; Ramachandran, Rahul; Maskey, Manil; Keiser, Ken
2008-01-01
To allow scientists further capabilities in the area of data mining and web services, the Goddard Earth Sciences Data and Information Services Center (GES DISC) and researchers at the University of Alabama in Huntsville (UAH) have developed a system to mine data at the source without the need of network transfers. The system has been constructed by linking together several pre-existing technologies: the Simple Scalable Script-based Science Processor for Measurements (S4PM), a processing engine at he GES DISC; the Algorithm Development and Mining (ADaM) system, a data mining toolkit from UAH that can be configured in a variety of ways to create customized mining processes; ActiveBPEL, a workflow execution engine based on BPEL (Business Process Execution Language); XBaya, a graphical workflow composer; and the EOS Clearinghouse (ECHO). XBaya is used to construct an analysis workflow at UAH using ADam components, which are also installed remotely at the GES DISC, wrapped as Web Services. The S4PM processing engine searches ECHO for data using space-time criteria, staging them to cache, allowing the ActiveBPEL engine to remotely orchestras the processing workflow within S4PM. As mining is completed, the output is placed in an FTP holding area for the end user. The goals are to give users control over the data they want to process, while mining data at the data source using the server's resources rather than transferring the full volume over the internet. These diverse technologies have been infused into a functioning, distributed system with only minor changes to the underlying technologies. The key to the infusion is the loosely coupled, Web-Services based architecture: All of the participating components are accessible (one way or another) through (Simple Object Access Protocol) SOAP-based Web Services.
pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2014-01-01
This work presents pWeb, a new language and compiler for parallelization of client-side compute-intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled creating unprecedented applications on the web. The low performance of the web browser, however, remains the bottleneck for computationally intensive applications, including visualization of complex scenes, real-time physical simulations and image processing, compared to native applications. The new proposed language is built upon web workers for multithreaded programming in HTML5. The language provides fundamental functionalities of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions.
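pWeb itself targets HTML5 web workers, but the fork/join model the abstract refers to can be illustrated in any language. The sketch below uses a Python thread pool purely as an analogy: independent subtasks are forked out, and the caller blocks until their partial results are joined back together.

```python
from concurrent.futures import ThreadPoolExecutor

def fork_join_sum(data, n_workers=4):
    """Fork: split the data into chunks processed in parallel.
    Join: collect and combine the partial results."""
    chunk = max(1, len(data) // n_workers)
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(sum, parts))   # fork: parallel chunk sums
    return sum(partials)                        # join: combine results

result = fork_join_sum(list(range(1000)))  # → 499500
```

Plain web workers only offer message passing between isolated threads, which is why a compiler layer is needed to express this structured fork/join pattern on top of them.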
CliniWeb: managing clinical information on the World Wide Web.
Hersh, W R; Brown, K E; Donohoe, L C; Campbell, E M; Horacek, A E
1996-01-01
The World Wide Web is a powerful new way to deliver on-line clinical information, but several problems limit its value to health care professionals: content is highly distributed and difficult to find, clinical information is not separated from non-clinical information, and the current Web technology is unable to support some advanced retrieval capabilities. A system called CliniWeb has been developed to address these problems. CliniWeb is an index to clinical information on the World Wide Web, providing a browsing and searching interface to clinical content at the level of the health care student or provider. Its database contains a list of clinical information resources on the Web that are indexed by terms from the Medical Subject Headings disease tree and retrieved with the assistance of SAPHIRE. Limitations of the processes used to build the database are discussed, together with directions for future research.
Pots, Wendy T M; Trompetter, Hester R; Schreurs, Karlein M G; Bohlmeijer, Ernst T
2016-05-23
Acceptance and Commitment Therapy (ACT) has been demonstrated to be effective in reducing depressive symptoms. However, little is known about how and for whom therapeutic change occurs, specifically in web-based interventions. This study focuses on the mediators, moderators and predictors of change during a web-based ACT intervention. Data from 236 adults from the general population with mild to moderate depressive symptoms, randomized to either web-based ACT (n = 82) or one of two control conditions (web-based Expressive Writing (EW; n = 67) and a waiting list (n = 87)), were analysed. Single and multiple mediation analyses (using PROCESS) and exploratory linear regression analyses were performed to examine mediators, moderators and predictors of pre- to post-treatment and follow-up change in depressive symptoms. The treatment effect of ACT versus the waiting list was mediated by psychological flexibility and two mindfulness facets. The treatment effect of ACT versus EW was not significantly mediated. The moderator analyses demonstrated that the effects of web-based ACT did not vary according to baseline patient characteristics when compared to both control groups. However, higher baseline depressive symptoms and positive mental health and lower baseline anxiety were identified as predictors of outcome across all conditions. Similar results were found at follow-up. The findings of this study corroborate the evidence that psychological flexibility and mindfulness are distinct process mechanisms that mediate the effects of a web-based ACT intervention. The results indicate that there are no restrictions to the allocation of a web-based ACT intervention and that web-based ACT can work for different subpopulations. Netherlands Trial Register NTR2736. Registered 6 February 2011.
AMBIT RESTful web services: an implementation of the OpenTox application programming interface.
Jeliazkova, Nina; Jeliazkov, Vedrin
2011-05-16
The AMBIT web services package is one of the several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing a unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and arbitrary sets of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions, without installing any software, as well as to share online datasets and models.
The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing or in a fully independent way, according to the specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups, end user applications with friendly GUIs, as well as embedding the functionalities in existing workflow systems.
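The "read data from a web address, perform processing, write to a web address" paradigm means every input and output is itself an addressable resource. The tiny client sketch below illustrates that idea; the endpoint layout and resource names are hypothetical, only loosely modeled on the REST conventions the abstract describes.

```python
from urllib.parse import quote

def model_apply_request(service_root, model_id, dataset_uri):
    """Build the POST target and form body for applying a model to a
    dataset: both the model and the input dataset are named by URIs."""
    url = f"{service_root}/model/{model_id}"
    body = "dataset_uri=" + quote(dataset_uri, safe="")
    return url, body

# Hypothetical service root and dataset; the response to such a POST
# would itself be a web address pointing at the computed results.
url, body = model_apply_request(
    "http://example.org/ambit", "42",
    "http://example.org/ambit/dataset/7",
)
```

Because inputs and outputs are exchanged only as URIs, services can be chained across machines without ever shipping the underlying data through the client.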
AMBIT RESTful web services: an implementation of the OpenTox application programming interface
2011-01-01
The AMBIT web services package is one of the several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing a unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows to easily run predictions, without installing any software, as well to share online datasets and models. 
The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing resource-intensive tasks and sharing data, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups and end user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems. PMID:21575202
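The "unique web address per resource" pattern described above can be sketched as follows. This is an illustrative outline, not the real AMBIT endpoints: the base URL, paths and identifier are placeholders, and only the URI construction and RDF content negotiation follow the REST style the abstract describes.

```python
# Hypothetical sketch of the OpenTox-style REST pattern: every resource
# (compound, dataset, model) has a unique URI, and its RDF representation
# is retrieved via plain HTTP content negotiation.

def resource_uri(base, kind, identifier):
    """Build the unique web address for a compound, dataset or model."""
    return f"{base}/{kind}/{identifier}"

def rdf_request(uri):
    """Return the (method, headers, target) triple that would fetch RDF."""
    return ("GET", {"Accept": "application/rdf+xml"}, uri)

base = "https://example.org/opentox"        # placeholder service root
compound = resource_uri(base, "compound", 42)
method, headers, target = rdf_request(compound)

print(method, target)
print(headers["Accept"])
```

The same address, requested with a different `Accept` header, would yield another of the several formats the services expose.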
NASA Astrophysics Data System (ADS)
Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.
2007-12-01
NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to both traditional research scientists and the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems, to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES & LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.
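A client consumes a WMS service of the kind described above by constructing a standard GetMap request. The sketch below builds such a URL; the endpoint and layer name are invented placeholders, while the parameter names follow the OGC WMS 1.1.1 standard.

```python
# Illustrative construction of an OGC WMS 1.1.1 GetMap request URL.
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height,
                   srs="EPSG:4326", fmt="image/png"):
    """Assemble a GetMap URL for one layer over a bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.gov/daac/wms",   # placeholder endpoint
                     "AEROSOL_OPTICAL_DEPTH",          # hypothetical layer
                     (-180, -90, 180, 90), 1024, 512)
print(url)
```

Because every WMS server answers the same request grammar, an application such as DataFed can aggregate layers from several DAACs without per-provider code.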
ERIC Educational Resources Information Center
Blodgett, Cynthia S.
2008-01-01
The purpose of this grounded theory study was to examine the process by which people with Mild Traumatic Brain Injury (MTBI) access information on the web. Recent estimates include amateur sports and recreation injuries, non-hospital clinics and treatment facilities, private and public emergency department visits and admissions, providing…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-07
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM12-3-000] Revisions to Electric Quarterly Report Filing Process; Notice of Availability of Video Showing How To File Electric Quarterly Reports Using the Web Interface Take notice that the Federal Energy Regulatory Commission (Commission) is making available on its Web site ...
NASA Technical Reports Server (NTRS)
Fieldler, F. S.; Ast, D.
1982-01-01
Experimental techniques for the preparation of electron beam induced current samples of Web-dendritic silicon are described. Both as-grown and processed material were investigated. High density dislocation networks were found close to twin planes in the bulk of the material. The electrical activity of these networks is reduced in processed material.
Flat-plate solar array project process development area process research of non-CZ silicon material
NASA Technical Reports Server (NTRS)
1985-01-01
Three sets of samples were laser processed and then cell processed. The laser processing was carried out on P-type and N-type web at laser power levels from 0.5 joule/sq cm to 2.5 joule/sq cm. Six different liquid dopants were tested (3 phosphorus dopants, 2 boron dopants, 1 aluminum dopant). The laser processed web strips were fabricated into solar cells immediately after laser processing and after various annealing cycles. Spreading resistance measurements made on a number of these samples indicate that the N(+)P (phosphorus doped) junction is approx. 0.2 micrometers deep and suitable for solar cells. However, the P(+)N (or P(+)P) junction is very shallow (approx. 0.1 micrometers) with a low surface concentration and resulting high resistance. Due to this effect, the fabricated cells are of low efficiency. The maximum efficiency attained was 9.6% on P-type web after a 700 C anneal. The main reason for the low efficiency was a high series resistance in the cell due to a high resistance back contact.
A Web-Based Monitoring System for Multidisciplinary Design Projects
NASA Technical Reports Server (NTRS)
Rogers, James L.; Salas, Andrea O.; Weston, Robert P.
1998-01-01
In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of, and communication among, diverse disciplinary codes. One such tool, a framework for multidisciplinary computational environments, is defined as a hardware and software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, integrated with an existing framework, can improve these areas of weakness. This paper describes a Web-based system that optimizes and controls the execution sequence of design processes and monitors the project status and results. The three-stage evolution of the system with increasingly complex problems demonstrates the feasibility of this approach.
A Web-Based System for Monitoring and Controlling Multidisciplinary Design Projects
NASA Technical Reports Server (NTRS)
Salas, Andrea O.; Rogers, James L.
1997-01-01
In today's competitive environment, both industry and government agencies are under enormous pressure to reduce the time and cost of multidisciplinary design projects. A number of frameworks have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. An examination of current frameworks reveals weaknesses in various areas such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, in conjunction with an existing framework, can improve these areas of weakness. This paper describes a system that executes a sequence of programs, monitors and controls the design process through a Web-based interface, and visualizes intermediate and final results through the use of Java(Tm) applets. A small sample problem, which includes nine processes with two analysis programs that are coupled to an optimizer, is used to demonstrate the feasibility of this approach.
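The core of the monitoring systems described in the two abstracts above is an execution sequence derived from process dependencies, with a status record for each step. The following sketch, with invented process names, shows the idea using a topological sort; the real systems sequenced and launched external disciplinary codes rather than Python functions.

```python
# Minimal sketch: derive an execution order from process dependencies,
# then track the status of each step as it runs.
from graphlib import TopologicalSorter

# Hypothetical design project: two disciplinary analyses depend on a
# geometry step and feed a coupled optimizer.
dependencies = {
    "aerodynamics": {"geometry"},
    "structures":   {"geometry"},
    "optimizer":    {"aerodynamics", "structures"},
}

order = list(TopologicalSorter(dependencies).static_order())

status = {}
for step in order:
    status[step] = "running"
    # ... a real framework would launch the disciplinary code here
    #     and push the status to a Web-based monitoring interface ...
    status[step] = "done"

print(order)
print(status["optimizer"])
```

A Web front end then only has to render the `status` dictionary, which is why adding monitoring on top of an existing framework is comparatively cheap.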
OpenWebGlobe 2: Visualization of Complex 3D-Geodata in the (Mobile) Web Browser
NASA Astrophysics Data System (ADS)
Christen, M.
2016-06-01
Providing worldwide high resolution data for virtual globes involves compute- and storage-intensive data processing tasks. Furthermore, rendering complex 3D-Geodata, such as 3D city models with an extremely high polygon count and a vast amount of textures, at interactive framerates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large scale, out-of-core, highly scalable 3D scene rendering on a web based virtual globe. Cloud computing is used for processing large amounts of geospatial data and also for providing 2D and 3D map data to a large number of (mobile) web clients. In this paper the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2" is shown, which displays 3D-Geodata on nearly every device.
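Out-of-core globe rendering of the kind described above depends on tile addressing: the world is cut into a quadtree, and each tile at a given zoom, column and row gets a key so clients and caches can request exactly the tiles in view. The sketch below implements the standard quadkey scheme as one concrete example of such addressing; it is not taken from OpenWebGlobe's own code.

```python
# Standard quadkey tile addressing: interleave the bits of the tile's
# x and y coordinates, one base-4 digit per zoom level.

def quadkey(zoom, x, y):
    """Encode a tile (zoom, x, y) as a quadkey string."""
    key = []
    for level in range(zoom, 0, -1):
        digit = 0
        mask = 1 << (level - 1)
        if x & mask:
            digit += 1
        if y & mask:
            digit += 2
        key.append(str(digit))
    return "".join(key)

print(quadkey(3, 3, 5))  # → 213
```

Because a tile's quadkey is a prefix of all its children's keys, a cache or cloud store can serve and evict whole subtrees cheaply, which is what makes large-scale tile serving to many clients tractable.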
Automatic Earth observation data service based on reusable geo-processing workflow
NASA Astrophysics Data System (ADS)
Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min
2008-12-01
A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top level GPW model, a model instantiation service generates the concrete BPEL, and a BPEL execution engine runs it. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
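The node chain described above can be outlined as a simple pipeline: a data-service step fetches sensor data, a processing step derives a product, and a presentation step renders it. The sketch below is purely illustrative; a real deployment would instantiate the chain as BPEL orchestrating remote services, and the node names and payloads here are invented.

```python
# Hypothetical three-node Geo-Processing Workflow, chained in sequence.

def data_service(request):
    """Fetch raw coverage for the requested sensor (simulated)."""
    return {"coverage": f"raw:{request['sensor']}"}

def processing_node(payload):
    """Derive a product from the coverage (here, fire classification)."""
    payload["product"] = payload["coverage"].replace("raw:", "classified:")
    return payload

def presentation_node(payload):
    """Render the product for the end user (simulated)."""
    return f"map of {payload['product']}"

def run_workflow(request, steps):
    """Execute the workflow steps in order, threading the payload through."""
    result = request
    for step in steps:
        result = step(result)
    return result

out = run_workflow({"sensor": "EO-1"},
                   [data_service, processing_node, presentation_node])
print(out)  # → map of classified:EO-1
```

Separating the abstract step sequence from the concrete services is what lets the same top-level GPW model be reused for raw data, coverages, products, or sensor maps.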
An Image Retrieval and Processing Expert System for the World Wide Web
NASA Technical Reports Server (NTRS)
Rodriguez, Ricardo; Rondon, Angelica; Bruno, Maria I.; Vasquez, Ramon
1998-01-01
This paper presents a system that is being developed in the Laboratory of Applied Remote Sensing and Image Processing at the University of P.R. at Mayaguez. It describes the components that constitute its architecture. The main elements are: a Data Warehouse, an Image Processing Engine, and an Expert System. Together, they provide a complete solution for researchers from different fields who make use of images in their investigations. Also, since it is available on the World Wide Web, it provides remote access to and processing of images.
The Use of Web-Based Portfolios in College Physical Education Activity Courses
ERIC Educational Resources Information Center
Hastie, Peter A.; Sinelnikov, Oleg A.
2007-01-01
This paper describes the introduction of web-based portfolios as a means of authentic assessment in collegiate physical education classes. Students in three volleyball classes were required to contribute to web-based team portfolios, and at the end of the semester, were able to make comment upon this process. A six-item on-line survey used to…
Integration of Web 2.0 Tools in Learning a Programming Course
ERIC Educational Resources Information Center
Majid, Nazatul Aini Abd
2014-01-01
Web 2.0 tools are expected to assist students to acquire knowledge effectively in their university environment. However, the lack of effort from lecturers in planning the learning process can make it difficult for the students to optimize their learning experiences. The aim of this paper is to integrate Web 2.0 tools with learning strategy in…