NASA Astrophysics Data System (ADS)
Wang, Jian
2017-01-01
In order to move beyond the traditional PE teaching mode and realize the interconnection, interworking and sharing of PE teaching resources, a distance PE teaching platform based on broadband network is designed and a PE teaching information resource database is set up. The database design takes Windows NT 4/2000 Server as the operating system platform and Microsoft SQL Server 7.0 as the RDBMS, and uses NAS technology for data storage and streaming technology for video service. The analysis of the system design and implementation shows that the dynamic PE teaching information resource sharing platform based on Web Services supports loosely coupled collaboration, realizes dynamic and active integration, and has good integration, openness and encapsulation. The distance PE teaching platform based on Web Services and the design scheme of the PE teaching information resource database can effectively realize the interconnection, interworking and sharing of PE teaching resources and adapt to the informatization development demands of PE teaching.
A methodology toward manufacturing grid-based virtual enterprise operation platform
NASA Astrophysics Data System (ADS)
Tan, Wenan; Xu, Yicheng; Xu, Wei; Xu, Lida; Zhao, Xianhua; Wang, Li; Fu, Liuliu
2010-08-01
Virtual enterprises (VEs) have become one of the main types of organisations in the manufacturing sector through which the consortium companies organise their manufacturing activities. To be competitive, a VE relies on the complementary core competences among members through resource sharing and agile manufacturing capacity. Manufacturing grid (M-Grid) is a platform in which the production resources can be shared. In this article, an M-Grid-based VE operation platform (MGVEOP) is presented as it enables the sharing of production resources among geographically distributed enterprises. The performance management system of the MGVEOP is based on the balanced scorecard and has the capacity of self-learning. The study shows that an MGVEOP can make a semi-automated process possible for a VE, and the proposed MGVEOP is efficient and agile.
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Schrock, Mitchell; Baldwin, John R.; Borden, Charles S.
2010-01-01
The Ground Resource Allocation and Planning Environment (GRAPE 1.0) is a Web-based, collaborative team environment based on the Microsoft SharePoint platform, which provides Deep Space Network (DSN) resource planners tools and services for sharing information and performing analysis.
ERIC Educational Resources Information Center
Quere, Nolwenn
2017-01-01
Designing and sharing Open Educational Resources (OERs) requires teachers to develop new competences, in particular with digital resources. In this paper, the case of a language resource production group is introduced. Due to the centrality of the OERs in their collective activity, I show that the documents they produce are essential to the…
Global Social Media Directory. A Resource Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noonan, Christine F.; Piatt, Andrew W.
Social media platforms are internet-based applications focused on broadcasting user-generated content. While primarily web-based, these services are increasingly available on mobile platforms. Communities and individuals share information, photos, music, videos, provide commentary and ratings/reviews, and more. In essence, social media is about sharing information, consuming information, and repurposing content. Social media technologies identified in this report are centered on social networking services, media sharing, blogging and microblogging. The purpose of this Resource Guide is to provide baseline information about use and application of social media platforms around the globe. It is not intended to be comprehensive as social media evolves on an almost daily basis. The long-term goal of this work is to identify social media information about all geographic regions and nations. The primary objective is that of understanding the evolution and spread of social networking and user-generated content technologies internationally.
A resilient and secure software platform and architecture for distributed spacecraft
NASA Astrophysics Data System (ADS)
Otte, William R.; Dubey, Abhishek; Karsai, Gabor
2014-06-01
A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, and information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objectives of this layer.
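As an illustration of the kind of per-process resource limits such a platform might enforce on non-privileged tasks (the abstract does not describe the flight software at this level of detail, so the task name and limits below are assumptions), a minimal Python sketch using the standard resource and subprocess modules could look like this:

```python
import resource
import subprocess

# Hypothetical caps for a non-privileged application process (Unix-only example)
CPU_SECONDS = 30          # hard limit on CPU time
MAX_BYTES = 256 * 2**20   # 256 MiB address-space limit

def constrain():
    """Applied in the child process before exec: cap CPU time and memory."""
    resource.setrlimit(resource.RLIMIT_CPU, (CPU_SECONDS, CPU_SECONDS))
    resource.setrlimit(resource.RLIMIT_AS, (MAX_BYTES, MAX_BYTES))

# Launch an illustrative application task under the limits; the real platform
# would additionally restrict network and file-system access, which setrlimit
# alone cannot express.
proc = subprocess.Popen(["python3", "app_task.py"], preexec_fn=constrain)
proc.wait()
```

This only sketches the resource-constraint idea; enforcement of security labels and pre-configured information flows, as described in the abstract, requires operating-system support beyond what a user-space script can show.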
ERIC Educational Resources Information Center
Lee, Ashley; Hobson, Joe; Bienkowski, Marie; Midgley, Steve; Currier, Sarah; Campbell, Lorna M.; Novoselova, Tatiana
2012-01-01
In this article, the authors describe an open-source, open-data digital infrastructure for sharing information about open educational resources (OERs) across disparate systems and platforms. The Learning Registry, which began as a project funded by the U.S. Departments of Education and Defense, currently has an active international community…
HydroShare: A Platform for Collaborative Data and Model Sharing in Hydrology
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Couch, A.; Hooper, R. P.; Dash, P. K.; Stealey, M.; Yi, H.; Bandaragoda, C.; Castronova, A. M.
2017-12-01
HydroShare is an online collaboration system for sharing hydrologic data, analytical tools, and models. It supports the sharing of, and collaboration around, "resources" which are defined by standardized content types for data formats and models commonly used in hydrology. With HydroShare you can: share your data and models with colleagues; manage who has access to the content that you share; share, access, visualize and manipulate a broad set of hydrologic data types and models; use the web services application programming interface (API) to program automated and client access; publish data and models and obtain a citable digital object identifier (DOI); aggregate your resources into collections; discover and access data and models published by others; and use web apps to visualize, analyze and run models on data in HydroShare. This presentation will describe the functionality and architecture of HydroShare, highlighting its use as a virtual environment supporting education and research. HydroShare has components that support: (1) resource storage, (2) resource exploration, and (3) web apps for actions on resources. The HydroShare data discovery, sharing and publishing functions, as well as the HydroShare web apps, provide the capability to analyze data and execute models completely in the cloud (on servers remote from the user), overcoming desktop platform limitations. The HydroShare GIS app provides a basic capability to visualize spatial data. The HydroShare JupyterHub Notebook app provides flexible and documentable execution of Python code snippets for analysis and modeling, in a way that results can be shared among HydroShare users and groups to support research collaboration and education. We will discuss how these developments can be used to support different types of educational efforts in hydrology, where being completely web based is of value because all students have access to the same functionality regardless of their computers.
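As a minimal sketch of the automated access the web services API allows (the endpoint paths, query parameter and resource identifier below are assumptions for illustration, not taken from the abstract), a client might query the public REST interface with the requests library:

```python
import requests

BASE = "https://www.hydroshare.org/hsapi"   # assumed public REST endpoint

# List publicly discoverable resources matching a free-text search term
resp = requests.get(f"{BASE}/resource/", params={"full_text_search": "snowmelt"})
resp.raise_for_status()
for res in resp.json().get("results", []):
    print(res.get("resource_id"), "-", res.get("resource_title"))

# Retrieve the science metadata for one resource (hypothetical identifier)
res_id = "0123456789abcdef0123456789abcdef"
meta = requests.get(f"{BASE}/resource/{res_id}/scimeta/elements/")
if meta.ok:
    print(meta.json().get("title"))
```

The same pattern extends to scripted publication and retrieval from web apps or notebooks running in the cloud.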
ERIC Educational Resources Information Center
Selwyn, N.; Banaji, S.; Hadjithoma-Garstka, C.; Clark, W.
2011-01-01
This paper investigates how schools are supporting parents' involvement with their children's education through the use of "Learning Platform" technologies--i.e. the integrated use of virtual learning environments, management information systems, communications, and other information and resource-sharing technologies. Based on in-depth…
Yang, Liqun
2016-01-01
Through the establishment of electronic health records, health education, and measures such as a regional information sharing platform, we explored the management of patients with alcohol dependence living in communities and established a medical information resource sharing model between the mental hospital and the community, in order to strengthen the supportive intervention management of patients with alcohol dependence, improve the effect of intervention, and reduce the rate of relapse drinking. A questionnaire on the health state of patients with alcohol dependence was designed. After data collection, we established electronic health records and community supportive interventions; issued medical health cards, with terminal card readers configured in both the mental hospital and the community; developed the information platform; established a variety of supportive interventions and service function modules; and opened up information sharing between the hospital and the community, so that the platform could be fully used to carry out health education and health intervention management. The effectiveness of community supportive intervention was improved, the relapse rate of patients was greatly reduced, and unhealthy lifestyle behaviors improved. Establishing electronic health records is an important means of community supportive intervention that enables real-time, dynamic management, promotes self-management skills, and makes medical information resource sharing between hospital and community a reality.
A Geospatial Information Grid Framework for Geological Survey.
Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong
2015-01-01
The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate at the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecasting and evaluation service is introduced in this paper.
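The abstract does not specify the interface of the spatial information catalog service; assuming it follows the common OGC CSW pattern used by many geospatial catalogs, a hedged sketch of querying such a catalog from Python with OWSLib might look like the following (the endpoint URL and search term are placeholders):

```python
from owslib.csw import CatalogueServiceWeb

# Placeholder endpoint for the platform's spatial information catalog service
csw = CatalogueServiceWeb("http://example.org/geosurvey/csw")

# Free-text search for records about iron mine resources (illustrative query)
csw.getrecords2(cql='csw:AnyText like "%iron mine%"', maxrecords=10)
for rec_id, rec in csw.records.items():
    print(rec_id, "-", rec.title)
```

If the platform exposes its own catalog protocol instead of CSW, the same discovery step would be expressed against that API.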
[The digital information platform after-sale service of medical equipment].
Cao, Shaoping; Li, Bin
2015-01-01
This paper describes an information management platform for the after-sale service of medical equipment, which uses shared big-data resources to further enhance customer service throughout the whole medical service management process, strengthen quality management, and control medical risk.
Image Sharing in Radiology-A Primer.
Chatterjee, Arindam R; Stalcup, Seth; Sharma, Arjun; Sato, T Shawn; Gupta, Pushpender; Lee, Yueh Z; Malone, Christopher; McBee, Morgan; Hotaling, Elise L; Kansagra, Akash P
2017-03-01
By virtue of its information technology-oriented infrastructure, the specialty of radiology is uniquely positioned to be at the forefront of efforts to promote data sharing across the healthcare enterprise, including particularly image sharing. The potential benefits of image sharing for clinical, research, and educational applications in radiology are immense. In this work, our group-the Association of University Radiologists (AUR) Radiology Research Alliance Task Force on Image Sharing-reviews the benefits of implementing image sharing capability, introduces current image sharing platforms and details their unique requirements, and presents emerging platforms that may see greater adoption in the future. By understanding this complex ecosystem of image sharing solutions, radiologists can become important advocates for the successful implementation of these powerful image sharing resources. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Collaborative workbench for cyberinfrastructure to accelerate science algorithm development
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Maskey, M.; Kuo, K.; Lynnes, C.
2013-12-01
There are significant untapped resources for information and knowledge creation within the Earth Science community in the form of data, algorithms, services, analysis workflows or scripts, and the related knowledge about these resources. Despite the huge growth in social networking and collaboration platforms, these resources often reside on an investigator's workstation or laboratory and are rarely shared. A major reason for this is that there are very few scientific collaboration platforms, and those that exist typically require the use of a new set of analysis tools and paradigms to leverage the shared infrastructure. As a result, adoption of these collaborative platforms for science research is inhibited by the high cost to an individual scientist of switching from his or her own familiar environment and set of tools to a new environment and tool set. This presentation will describe an ongoing project developing an Earth Science Collaborative Workbench (CWB). The CWB approach will eliminate this barrier by augmenting a scientist's current research environment and tool set to allow him or her to easily share diverse data and algorithms. The CWB will leverage evolving technologies such as commodity computing and social networking to design an architecture for scalable collaboration that will support the emerging vision of an Earth Science Collaboratory. The CWB is being implemented on the robust and open source Eclipse framework and will be compatible with widely used scientific analysis tools such as IDL. The myScience Catalog built into CWB will capture and track metadata and provenance about data and algorithms for the researchers in a non-intrusive manner with minimal overhead. Seamless interfaces to multiple Cloud services will support sharing algorithms, data, and analysis results, as well as access to storage and computer resources. A Community Catalog will track the use of shared science artifacts and manage collaborations among researchers.
A Research on E - learning Resources Construction Based on Semantic Web
NASA Astrophysics Data System (ADS)
Rui, Liu; Maode, Deng
Traditional e-learning platforms have the flaw that resources are usually difficult to query or locate, and that cross-platform sharing and interoperability are hard to realize. In this paper, the semantic web and metadata standards are discussed, and an e-learning system framework based on the semantic web is put forward in an attempt to address the flaws of traditional e-learning platforms.
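To make the idea of semantic-web metadata for learning resources concrete, here is a small hedged sketch (not the paper's actual framework) that describes a hypothetical e-learning resource with Dublin Core terms using rdflib, so that it can be queried and shared across platforms; the vocabulary namespace and resource URI are invented for the example:

```python
from rdflib import Graph, Literal, URIRef, Namespace
from rdflib.namespace import DCTERMS, RDF

LOM = Namespace("http://example.org/elearning#")   # hypothetical vocabulary

g = Graph()
res = URIRef("http://example.org/courses/linear-algebra/lecture-01")

# Describe the learning resource with standard Dublin Core terms
g.add((res, RDF.type, LOM.LearningResource))
g.add((res, DCTERMS.title, Literal("Linear Algebra, Lecture 1")))
g.add((res, DCTERMS.subject, Literal("mathematics")))
g.add((res, DCTERMS.format, Literal("video/mp4")))

# Serialize as Turtle so another platform can harvest and query the metadata
print(g.serialize(format="turtle"))
```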
NASA Astrophysics Data System (ADS)
Valentine, G. A.
2012-12-01
VHub (short for VolcanoHub, and accessible at vhub.org) is an online platform for collaboration in research and training related to volcanoes, the hazards they pose, and risk mitigation. The underlying concept is to provide a mechanism that enables workers to share information with colleagues around the globe; VHub and similar hub technologies could prove very powerful in collaborating and communicating about circum-Pacific volcanic hazards. Collaboration occurs around several different points: (1) modeling and simulation; (2) data sharing; (3) education and training; (4) volcano observatories; and (5) project-specific groups. VHub promotes modeling and simulation in two ways. (1) Some models can be implemented on VHub for online execution. This eliminates the need to download and compile a code on a local computer, and VHub can provide a central "warehouse" for such models that should result in broader dissemination. (2) VHub also provides a platform that supports the more complex CFD models by enabling the sharing of code development and problem-solving knowledge, benchmarking datasets, and the development of validation exercises. VHub also provides a platform for sharing of data and datasets. The VHub development team is implementing the iRODS data sharing middleware (see irods.org). iRODS allows a researcher to access data that are located at participating data sources around the world (a "cloud" of data) as if the data were housed in a single virtual database. Education and training is another important use of the VHub platform. Audio-video recordings of seminars, PowerPoint slide sets, and educational simulations are all items that can be placed onto VHub for use by the community or by selected collaborators. An important point is that the "manager" of a given educational resource (or any other resource, such as a dataset or a model) can control the privacy of that resource, ranging from private (only accessible by, and known to, specific collaborators) to completely public. Materials for use in the classroom can be shared via VHub. VHub is a very useful platform for project-specific collaborations: with a group site on VHub, collaborators share documents, datasets, maps, and have ongoing discussions using the discussion board function. VHub is funded by the U.S. National Science Foundation, and is participating in development of larger earth-science cyberinfrastructure initiatives (EarthCube), as well as supporting efforts such as the Global Volcano Model.
Win–win data sharing in neuroscience
Ascoli, Giorgio A; Maraver, Patricia; Nanda, Sumit; Polavaram, Sridevi; Armañanzas, Rubén
2017-01-01
Most neuroscientists have yet to embrace a culture of data sharing. Using our decade-long experience at NeuroMorpho.Org as an example, we discuss how publicly available repositories may benefit data producers and end-users alike. We outline practical recipes for resource developers to maximize the research impact of data sharing platforms for both contributors and users. PMID:28139675
NASA Astrophysics Data System (ADS)
Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.
2013-10-01
In recent years, the increasing development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture is apt to provide centralized solutions to end users, while all the required resources are often offered by large enterprises or special agencies. Thus, it is a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can conveniently package and deploy their models into the cloud, while model users can search, access and utilize those models with cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies, namely the unified cloud interface strategy, the sharing platform based on cloud services, and the computing platform based on cloud services, are discussed in detail, and related experiments are conducted for further verification.
SEEK: a systems biology data and model management platform.
Wolstencroft, Katherine; Owen, Stuart; Krebs, Olga; Nguyen, Quyen; Stanford, Natalie J; Golebiewski, Martin; Weidemann, Andreas; Bittkowski, Meik; An, Lihua; Shockley, David; Snoep, Jacky L; Mueller, Wolfgang; Goble, Carole
2015-07-11
Systems biology research typically involves the integration and analysis of heterogeneous data types in order to model and predict biological processes. Researchers therefore require tools and resources to facilitate the sharing and integration of data, and for linking of data to systems biology models. There are a large number of public repositories for storing biological data of a particular type, for example transcriptomics or proteomics, and there are several model repositories. However, this silo-type storage of data and models is not conducive to systems biology investigations. Interdependencies between multiple omics datasets and between datasets and models are essential. Researchers require an environment that will allow the management and sharing of heterogeneous data and models in the context of the experiments which created them. The SEEK is a suite of tools to support the management, sharing and exploration of data and models in systems biology. The SEEK platform provides an access-controlled, web-based environment for scientists to share and exchange data and models for day-to-day collaboration and for public dissemination. A plug-in architecture allows the linking of experiments, their protocols, data, models and results in a configurable system that is available 'off the shelf'. Tools to run model simulations, plot experimental data and assist with data annotation and standardisation combine to produce a collection of resources that support analysis as well as sharing. Underlying semantic web resources additionally extract and serve SEEK metadata in RDF (Resource Description Framework). SEEK RDF enables rich semantic queries, both within SEEK and between related resources in the web of Linked Open Data. The SEEK platform has been adopted by many systems biology consortia across Europe. It is a data management environment that has a low barrier of uptake and provides rich resources for collaboration. This paper provides an update on the functions and features of the SEEK software, and describes the use of the SEEK in the SysMO consortium (Systems Biology for Micro-organisms) and the VLN (Virtual Liver Network), two large systems biology initiatives with different research aims and different scientific communities.
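As an illustration of the "rich semantic queries" the abstract mentions, the sketch below issues a generic SPARQL query with the SPARQLWrapper library; the endpoint URL and the class and property names are assumptions for illustration, not SEEK's documented schema:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical SPARQL endpoint serving SEEK metadata as RDF
sparql = SPARQLWrapper("https://example.org/seek/sparql")
sparql.setReturnFormat(JSON)

# Find models and their titles (illustrative class and property names)
sparql.setQuery("""
    PREFIX dcterms: <http://purl.org/dc/terms/>
    SELECT ?model ?title WHERE {
        ?model a <http://example.org/seek#Model> ;
               dcterms:title ?title .
    } LIMIT 10
""")

for row in sparql.queryAndConvert()["results"]["bindings"]:
    print(row["model"]["value"], "-", row["title"]["value"])
```

The same query could be federated against related Linked Open Data endpoints to follow links from SEEK records to external resources.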
The Development of GIS Educational Resources Sharing among Central Taiwan Universities
NASA Astrophysics Data System (ADS)
Chou, T.-Y.; Yeh, M.-L.; Lai, Y.-C.
2011-09-01
Using GIS in the classroom enhances students' computer skills and broadens their range of knowledge. The paper highlights GIS integration on an e-learning platform and introduces a variety of abundant educational resources. This research project demonstrates tools for an e-learning environment and delivers case studies of learning interaction from central Taiwan universities. Feng Chia University (FCU) obtained a remarkable academic project subsidized by the Ministry of Education and developed an e-learning platform for excellence in teaching/learning programs among central Taiwan's universities. The aim of the project is to integrate the educational resources of 13 universities in central Taiwan, with FCU serving as the hub university. To overcome the problem of distance, e-platforms have been established to create collaboration-enhanced learning experiences. The e-platforms provide coordinated web service access for the educational community and deliver GIS educational resources. Most GIS-related courses cover the development of GIS, principles of cartography, spatial data analysis and overlay, terrain analysis, buffer analysis, 3D GIS applications, remote sensing, GPS technology, WebGIS, MobileGIS and ArcGIS manipulation. In each GIS case study, students are taught to understand geographic meaning, collect spatial data and then use ArcGIS software to analyze the spatial data. One of the e-learning platforms provides lesson plans and presentation slides, so students can learn ArcGIS online. As they analyze spatial data, they can connect to the GIS hub to get the data they need, including satellite images, aerial photos, and vector data. Moreover, the e-learning platforms provide solutions and resources, and different levels of image scale have been integrated into the systems. Multi-scale spatial development and analyses in central Taiwan integrate academic research resources among CTTLRC partners, thus establishing a decision-making support mechanism in teaching and learning and accelerating communication, cooperation and sharing among academic units.
An Open Software Platform for Sharing Water Resource Models, Code and Data
NASA Astrophysics Data System (ADS)
Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon
2016-04-01
The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com
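To make the shared node-link abstraction concrete, here is a minimal, hedged Python sketch of a network with nodes, links and attached data in the generic form a platform like Hydra could exchange with an App; the dictionary layout and field names are illustrative, not Hydra's actual schema:

```python
import json

# Illustrative node-link representation of a small water resource network;
# the field names are assumptions, not Hydra Platform's real data model.
network = {
    "name": "demo-basin",
    "nodes": [
        {"name": "reservoir", "type": "storage", "data": {"capacity_Mm3": 120.0}},
        {"name": "city",      "type": "demand",  "data": {"demand_Mm3_per_yr": 45.0}},
    ],
    "links": [
        {"name": "supply-main", "from": "reservoir", "to": "city",
         "data": {"max_flow_Mm3_per_yr": 60.0}},
    ],
}

# An "App" in this sketch is just a function that consumes the exported network
def check_capacity(net):
    demand = sum(n["data"].get("demand_Mm3_per_yr", 0) for n in net["nodes"])
    supply = sum(l["data"].get("max_flow_Mm3_per_yr", 0) for l in net["links"])
    return supply >= demand

print(json.dumps(network, indent=2))
print("links can satisfy demand:", check_capacity(network))
```

The point of the abstraction is that the same stored structure could be exported to Excel, a Shapefile, or a GAMS/MATLAB model by different Apps without changing the central data store.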
The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science
Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo
2008-01-01
The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570
CloudMan as a platform for tool, data, and analysis distribution.
Afgan, Enis; Chapman, Brad; Taylor, James
2012-11-27
Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.
NASA Astrophysics Data System (ADS)
Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.
2015-12-01
Capacity expansion on resource networks is essential to adapting to economic and population growth and pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling we have created a system where infrastructure modeling across various domains is facilitated. We introduce Hydra Platform, a Free Open Source Software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer for remote applications, called Apps, to connect. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This Client-Server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardised description of the data structure required for storing network data such that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain. It is the Apps that create domain-specific functionality. Using Apps researchers from different domains can incorporate different models within the same network enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based apps in other languages for remote functionality. Partner CH2M is developing a commercial user-interface for Hydra Platform; however, custom interfaces and visualization tools can be built. Hydra Platform is available on GitHub while Apps will be shared on a central repository.
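The abstract mentions XML templates that standardise the data structure a model expects from a stored network; the snippet below is a hedged illustration of parsing such a template with Python's standard xml.etree.ElementTree. The element and attribute names are invented for the example, not Hydra Platform's real template format:

```python
import xml.etree.ElementTree as ET

# Invented template describing the attributes a model expects on each node type
TEMPLATE_XML = """
<template name="demo-water-model">
  <nodetype name="reservoir">
    <attribute name="capacity" unit="Mm^3" dtype="scalar"/>
    <attribute name="inflow"   unit="Mm^3/month" dtype="timeseries"/>
  </nodetype>
  <nodetype name="demand">
    <attribute name="demand" unit="Mm^3/month" dtype="timeseries"/>
  </nodetype>
</template>
"""

root = ET.fromstring(TEMPLATE_XML)
for nodetype in root.findall("nodetype"):
    attrs = [(a.get("name"), a.get("unit"), a.get("dtype"))
             for a in nodetype.findall("attribute")]
    print(nodetype.get("name"), "->", attrs)
```

An App would validate a stored network against such a template before handing the data to its model, which is what makes the central store compatible with many model formats.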
Geospatial Service Platform for Education and Research
NASA Astrophysics Data System (ADS)
Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.
2014-04-01
We propose to advance scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held in the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries, and train them with online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as examples to explain specifically how service chains are built for different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.
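For hands-on practice with online geoprocessing services of the kind the course describes, a typical pattern is to discover and inspect processes on an OGC Web Processing Service before chaining them; the hedged sketch below uses OWSLib with a placeholder endpoint (the course's actual OpenRS/GeoPW service URLs are not given in the abstract):

```python
from owslib.wps import WebProcessingService

# Placeholder WPS endpoint; substitute the service used in the course
wps = WebProcessingService("http://example.org/geopw/wps", skip_caps=True)
wps.getcapabilities()

# Discover which geoprocessing operations the service offers
for process in wps.processes:
    print(process.identifier, "-", process.title)

# Inspect the inputs of one (hypothetical) process before chaining it
proc = wps.describeprocess(wps.processes[0].identifier)
for inp in proc.dataInputs:
    print("input:", inp.identifier, inp.dataType)
```

Service chaining tools such as GeoChaining or GeoJModelBuilder would compose several such processes into a workflow rather than calling them one by one.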
Online collaboration and model sharing in volcanology via VHub.org
NASA Astrophysics Data System (ADS)
Valentine, G.; Patra, A. K.; Bajo, J. V.; Bursik, M. I.; Calder, E.; Carn, S. A.; Charbonnier, S. J.; Connor, C.; Connor, L.; Courtland, L. M.; Gallo, S.; Jones, M.; Palma Lizana, J. L.; Moore-Russo, D.; Renschler, C. S.; Rose, W. I.
2013-12-01
VHub (short for VolcanoHub, and accessible at vhub.org) is an online platform for barrier-free access to high-end modeling and simulation and for collaboration in research and training related to volcanoes, the hazards they pose, and risk mitigation. The underlying concept is to provide a platform, building upon the successful HUBzero software infrastructure (hubzero.org), that enables workers to collaborate online and to easily share information, modeling and analysis tools, and educational materials with colleagues around the globe. Collaboration occurs around several different points: (1) modeling and simulation; (2) data sharing; (3) education and training; (4) volcano observatories; and (5) project-specific groups. VHub promotes modeling and simulation in two ways: (1) some models can be implemented on VHub for online execution, and VHub can provide a central warehouse for such models that should result in broader dissemination; (2) VHub also provides a platform that supports the more complex CFD models by enabling the sharing of code development and problem-solving knowledge, benchmarking datasets, and the development of validation exercises. VHub also provides a platform for sharing of data and datasets. The VHub development team is implementing the iRODS data sharing middleware (see irods.org). iRODS allows a researcher to access data that are located at participating data sources around the world (a cloud of data) as if the data were housed in a single virtual database. Projects associated with VHub are also going to introduce data-driven workflow tools to support multistage analysis processes in which computing and data are integrated for model validation, hazard analysis, etc. Audio-video recordings of seminars, PowerPoint slide sets, and educational simulations are all items that can be placed onto VHub for use by the community or by selected collaborators. An important point is that the manager of a given educational resource (or any other resource, such as a dataset or a model) can control the privacy of that resource, ranging from private (only accessible by, and known to, specific collaborators) to completely public. VHub is a very useful platform for project-specific collaborations: with a group site on VHub, collaborators share documents, datasets, maps, and have ongoing discussions using the discussion board function. VHub is funded by the U.S. National Science Foundation, and is participating in development of larger earth-science cyberinfrastructure initiatives (EarthCube), as well as supporting efforts such as the Global Volcano Model. Emerging VHub-facilitated efforts include model benchmarking, collaborative code development, and growth in online modeling tools.
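The abstract names iRODS as the data sharing middleware; as a hedged illustration of how a researcher might reach such a federated "cloud of data" from Python, the sketch below uses the python-irodsclient package with placeholder connection details (VHub's actual zone, host and collection paths are not given in the abstract):

```python
from irods.session import iRODSSession

# Placeholder connection details for a participating iRODS data source
with iRODSSession(host="data.example.org", port=1247,
                  user="researcher", password="secret",
                  zone="demoZone") as session:
    # Browse a hypothetical shared collection of benchmarking datasets
    coll = session.collections.get("/demoZone/home/public/benchmarks")
    for obj in coll.data_objects:
        print(obj.name, obj.size)

    # Read part of one data object as if it lived in a single virtual database
    with coll.data_objects[0].open("r") as f:
        print(f.read(200))
```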
A Framework to Integrate Public, Dynamic Metrics into an OER Platform
ERIC Educational Resources Information Center
Cohen, Jaclyn Zetta; Omollo, Kathleen Ludewig; Malicke, Dave
2014-01-01
The usage metrics for open educational resources (OER) are often either hidden behind an authentication system or shared intermittently in static, aggregated format at the repository level. This paper discusses the first year of University of Michigan's project to share its OER usage data dynamically, publicly, to synthesize it across different…
[Tumor Data Interacted System Design Based on Grid Platform].
Liu, Ying; Cao, Jiaji; Zhang, Haowei; Zhang, Ke
2016-06-01
In order to satisfy the demands of processing massive, heterogeneous tumor clinical data and of multi-center collaborative diagnosis and treatment of tumor diseases, a Tumor Data Interacted System (TDIS) was established on a grid platform, realizing a virtualized platform for tumor diagnosis services that shares tumor information in real time and manages it in a standardized way. The system adopts Globus Toolkit 4.0 to build an open grid service framework and encapsulates data resources based on the Web Services Resource Framework (WSRF). It uses middleware technology to provide a unified access interface for heterogeneous data interaction, which optimizes the interactive process with virtualized services so that tumor information resources can be queried and called flexibly. For massive amounts of heterogeneous tumor data, a federated storage and multiple-authorization mode is selected as the security service mechanism, with real-time monitoring and load balancing. The system can cooperatively manage multi-center heterogeneous tumor data to realize tumor patient data query, sharing and analysis, and can compare and match resources in a typical clinical database or in the clinical information databases of other service nodes, thus assisting doctors in consulting similar cases and making multidisciplinary treatment plans for tumors. Consequently, the system can improve the efficiency of tumor diagnosis and treatment and promote the development of the collaborative tumor diagnosis model.
CloudMan as a platform for tool, data, and analysis distribution
2012-01-01
Background: Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results: CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions: With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions. PMID:23181507
Mobile VR in Education: From the Fringe to the Mainstream
ERIC Educational Resources Information Center
Cochrane, Thomas
2016-01-01
This paper explores the development of virtual reality (VR) use in education and the emergence of mobile VR based content creation and sharing as a platform for enabling learner-generated content and learner-generated contexts. The author argues that an ecology of resources that maps the user content creation and sharing affordances of mobile…
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Arrigo, J.; Hooper, R. P.; Valentine, D. W.; Maidment, D. R.
2013-12-01
HydroShare is an online, collaborative system being developed for sharing hydrologic data and models. The goal of HydroShare is to enable scientists to easily discover and access data and models, retrieve them to their desktop or perform analyses in a distributed computing environment that may include grid, cloud or high performance computing model instances as necessary. Scientists may also publish outcomes (data, results or models) into HydroShare, using the system as a collaboration platform for sharing data, models and analyses. HydroShare is expanding the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated, creating new capability to share models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. One of the fundamental concepts in HydroShare is that of a Resource. All content is represented using a Resource Data Model that separates system and science metadata and has elements common to all resources as well as elements specific to the types of resources HydroShare will support. These will include different data types used in the hydrology community and models and workflows that require metadata on execution functionality. HydroShare will use the integrated Rule-Oriented Data System (iRODS) to manage federated data content and perform rule-based background actions on data and model resources, including parsing to generate metadata catalog information and the execution of models and workflows. This presentation will introduce the HydroShare functionality developed to date, describe key elements of the Resource Data Model and outline the roadmap for future development.
NASA Astrophysics Data System (ADS)
Yang, Z. L.; Cao, J.; Hu, K.; Gui, Z. P.; Wu, H. Y.; You, L.
2016-06-01
Efficiently discovering and applying geospatial information resources (GIRs) online is critical in the Earth Science domain as well as for cross-disciplinary applications. However, achieving this is challenging due to the heterogeneity, complexity and privacy of online GIRs. In this article, GeoSquare, a collaborative online geospatial information sharing and geoprocessing platform, was developed to tackle this problem. Specifically, (1) GIR registration and multi-view query functions allow users to publish and discover GIRs more effectively. (2) Online geoprocessing and real-time execution status checking help users process data and conduct analysis without pre-installation of cumbersome professional tools on their own machines. (3) A service chain orchestration function enables domain experts to contribute and share their domain knowledge with community members through workflow modeling. (4) User inventory management allows registered users to collect and manage their own GIRs, monitor their execution status, and track their own geoprocessing histories. Besides, to enhance the flexibility and capacity of GeoSquare, distributed storage and cloud computing technologies are employed. To support interactive teaching and training, GeoSquare adopts rich internet application (RIA) technology to create a user-friendly graphical user interface (GUI). Results show that GeoSquare can integrate dispersed GIRs, computing resources and people, and foster collaboration among them. Consequently, educators and researchers can share and exchange resources in an efficient and harmonious way.
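As an illustration of the online geoprocessing and real-time execution status checking described above (the endpoint paths, task identifier and JSON fields below are assumptions for the sketch, not GeoSquare's documented API), a client might submit a geoprocessing job and poll its status like this:

```python
import time
import requests

BASE = "https://example.org/geosquare/api"   # placeholder service root

# Submit a hypothetical geoprocessing task
job = requests.post(f"{BASE}/geoprocessing/slope", json={"dem": "srtm_tile_42"})
job.raise_for_status()
task_id = job.json()["task_id"]

# Poll execution status until the task finishes
while True:
    status = requests.get(f"{BASE}/tasks/{task_id}").json()
    print("status:", status["state"])
    if status["state"] in ("succeeded", "failed"):
        break
    time.sleep(5)

if status["state"] == "succeeded":
    print("result available at:", status["result_url"])
```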
SHIWA Services for Workflow Creation and Sharing in Hydrometeorology
NASA Astrophysics Data System (ADS)
Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely
2014-05-01
Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Considering this effort, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept. It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely pre-deployed workflow engines, or submits workflow engines with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within the project (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows, and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.
[Design and Implementation of Intelligent Mobile ECG].
Cao, Shaoping; Liu, Jian
2016-05-01
This paper introduces the development of an intelligent mobile ECG and the use of internet big-data resource sharing to further improve the remote diagnosis medical service platform, enhance the level of mobile medical care, and control medical risks.
Athey, Brian D; Braxenthaler, Michael; Haas, Magali; Guo, Yike
2013-01-01
tranSMART is an emerging global open source public private partnership community developing a comprehensive informatics-based analysis and data-sharing cloud platform for clinical and translational research. The tranSMART consortium includes pharmaceutical and other companies, not-for-profits, academic entities, patient advocacy groups, and government stakeholders. The tranSMART value proposition relies on the concept that the global community of users, developers, and stakeholders are the best source of innovation for applications and for useful data. Continued development and use of the tranSMART platform will create a means to enable "pre-competitive" data sharing broadly, saving money and potentially accelerating research translation to cures. Significant transformative effects of tranSMART include 1) allowing its entire user community to benefit from experts globally, 2) capturing the best of innovation in analytic tools, 3) a growing 'big data' resource, 4) convergent standards, and 5) new informatics-enabled translational science in the pharma, academic, and not-for-profit sectors.
[The role of social media in academic training in Urology. Adequate use].
Gómez Rivas, Juan; Tortolero Blanco, Leonardo; Rodríguez Socarras, Moises; García Sanz, Miguel; Carrión, Diego M; Okhunov, Zhamshid; Veneziano, Domenico
2018-01-01
Social media is characterized by the fact that all its services are participative. Users of 2.0 technologies can interact easily and openly with other people, share resources and communicate immediately and simultaneously. Research benefits from participatory technologies by allowing groups to share reflections, methodologies, resources and results. The social media platform with the greatest diffusion and use in urology is possibly Twitter, because it enables what is known as "microblogging": users generate comments and brief messages through the creation of "tweets". From a scientific point of view, there are three broad areas in which social media are manifested: sharing research, resources and results. The use and applications of social media carry a major responsibility in the area of health and urology, obviously for reasons of privacy, scientific rigor, ethics and the nature of the medical-legal content.
Advancing Collaboration through Hydrologic Data and Model Sharing
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.
2015-12-01
HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.
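The resource data model described above packages content using the BagIt hierarchical file packaging specification; a hedged sketch of producing such a package locally with the bagit Python library is shown below (the directory layout and metadata values are illustrative only, not HydroShare's own packaging conventions):

```python
import bagit

# Turn an existing directory of resource content into a BagIt package in place.
# The directory name and metadata fields are illustrative.
bag = bagit.make_bag(
    "my_hydro_resource",
    {"Source-Organization": "Example Lab",
     "External-Description": "Streamflow observations, demo resource"},
)

# Later, confirm that payload checksums still match the manifests
bag = bagit.Bag("my_hydro_resource")
print("bag is valid:", bag.is_valid())
```

Combining such packaging with OAI-ORE maps is what lets a resource carry both its files and its science metadata as a single, exchangeable object.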
McQuilton, Peter; Gonzalez-Beltran, Alejandra; Rocca-Serra, Philippe; Thurston, Milo; Lister, Allyson; Maguire, Eamonn; Sansone, Susanna-Assunta
2016-01-01
BioSharing (http://www.biosharing.org) is a manually curated, searchable portal of three linked registries. These resources cover standards (terminologies, formats and models, and reporting guidelines), databases, and data policies in the life sciences, broadly encompassing the biological, environmental and biomedical sciences. Launched in 2011 and built by the same core team as the successful MIBBI portal, BioSharing harnesses community curation to collate and cross-reference resources across the life sciences from around the world. BioSharing makes these resources findable and accessible (the core of the FAIR principle). Every record is designed to be interlinked, providing a detailed description not only of the resource itself, but also of its relations with other life science infrastructures. Serving a variety of stakeholders, BioSharing cultivates a growing community, to which it offers diverse benefits. It is a resource for funding bodies and journal publishers to navigate the metadata landscape of the biological sciences; an educational resource for librarians and information advisors; a publicising platform for standard and database developers/curators; and a research tool for bench and computer scientists to plan their work. BioSharing is working with an increasing number of journals and other registries, for example linking standards and databases to training material and tools. Driven by an international Advisory Board, the BioSharing user-base has grown by over 40% (by unique IP address) in the last year, thanks to successful engagement with researchers, publishers, librarians, developers and other stakeholders via several routes, including a joint RDA/Force11 working group and a collaboration with the International Society for Biocuration. In this article, we describe BioSharing, with a particular focus on community-led curation. Database URL: https://www.biosharing.org. © The Author(s) 2016. Published by Oxford University Press.
Building a School District's Wide Area Network.
ERIC Educational Resources Information Center
Mastel, Vern L.
1996-01-01
Describes the development of a wide area network (WAN) in the Bismarck Public School District (North Dakota). Topics include design goals, network infrastructure, implementing library access, sharing resources across platforms, electronic mail, dial-in access, Internet access, adhering to software licenses, shareware and freeware, and monitoring…
What Size Is Your Digital Footprint?
ERIC Educational Resources Information Center
Hewson, Kurtis
2013-01-01
The Professional Learning Network (PLN) is gaining momentum in the education lexicon. It records and reflects the personal development of a community of learners--primarily online through a variety of platforms and social networks--in which educators share resources, provide support, introduce and debate ideas and celebrate learning. These…
Corpas, Manuel; Jimenez, Rafael C.; Bongcam-Rudloff, Erik; Budd, Aidan; Brazas, Michelle D.; Fernandes, Pedro L.; Gaeta, Bruno; van Gelder, Celia; Korpelainen, Eija; Lewitter, Fran; McGrath, Annette; MacLean, Daniel; Palagi, Patricia M.; Rother, Kristian; Taylor, Jan; Via, Allegra; Watson, Mick; Schneider, Maria Victoria; Attwood, Teresa K.
2015-01-01
Summary: Rapid technological advances have led to an explosion of biomedical data in recent years. The pace of change has inspired new collaborative approaches for sharing materials and resources to help train life scientists both in the use of cutting-edge bioinformatics tools and databases and in how to analyse and interpret large datasets. A prototype platform for sharing such training resources was recently created by the Bioinformatics Training Network (BTN). Building on this work, we have created a centralized portal for sharing training materials and courses, including a catalogue of trainers and course organizers, and an announcement service for training events. For course organizers, the portal provides opportunities to promote their training events; for trainers, the portal offers an environment for sharing materials, for gaining visibility for their work and promoting their skills; for trainees, it offers a convenient one-stop shop for finding suitable training resources and identifying relevant training events and activities locally and worldwide. Availability and implementation: http://mygoblet.org/training-portal Contact: manuel.corpas@tgac.ac.uk PMID:25189782
DOT National Transportation Integrated Search
2007-09-01
A Virtual Container Yard (VCY) is a means of developing a shared resource information system to match empty equipment needs through the adoption of next-generation internet and new information technology platforms. The project examines the feasibility...
Integrating SAP to Information Systems Curriculum: Design and Delivery
ERIC Educational Resources Information Center
Wang, Ming
2011-01-01
Information Systems (IS) education is being transformed from teaching segmented applications toward integrated, enterprise-wide system software: Enterprise Resource Planning (ERP). ERP is a platform that integrates all business functions through a centralized data repository shared by all business operations in the enterprise. This tremendous…
AccrualNet: Addressing Low Accrual Via a Knowledge-Based, Community of Practice Platform
Massett, Holly A.; Parreco, Linda K.; Padberg, Rose Mary; Richmond, Ellen S.; Rienzo, Marie E.; Leonard, Colleen E. Ryan; Quesenbery, Whitney; Killiam, H. William; Johnson, Lenora E.; Dilts, David M.
2011-01-01
Purpose: Present the design and initial evaluation of a unique, Web-enabled platform for the development of a community of practice around issues of oncology clinical trial accrual. Methods: The National Cancer Institute (NCI) conducted research with oncology professionals to identify unmet clinical trial accrual needs in the field. In response, a comprehensive platform for accrual resources, AccrualNet, was created by using an agile development process, storyboarding, and user testing. Literature and resource searches identified relevant content to populate the site. Descriptive statistics were tracked for resource and site usage. Use cases were defined to support implementation. Results: AccrualNet has five levels: (1) clinical trial macrostages (prestudy, active study, and poststudy); (2) substages (developing a protocol, selecting a trial, preparing to open, enrolling patients, managing the trial, retaining participants, and lessons learned); (3) strategies for each substage; (4) multiple activities for each strategy; and (5) multiple resources for each activity. Since its launch, AccrualNet has had more than 45,000 page views, with the Tools & Resources, Conversations, and Training sections being the most viewed. Total resources have increased 69%, to 496 items. Analysis of articles in the site reveals that 22% are from two journals and 46% of the journals supplied a single article. To date, there are 29 conversations with 43 posts. Four use cases are discussed. Conclusion: AccrualNet represents a unique, centralized comprehensive-solution platform to systematically capture accrual knowledge for all stages of a clinical trial. It is designed to foster a community of practice by encouraging users to share additional strategies, resources, and ideas. PMID:22379429
Research on website construction based on website group platform of Chengdu sport institution
NASA Astrophysics Data System (ADS)
Hu, Zunyu
2018-04-01
This paper describes the necessity of website construction based on the website group of the Chengdu Sport Institute and discusses the technical features of a website group. Key technologies such as Web Service, AJAX and RSS are applied on the website group platform architecture to realize the construction of the websites. Building the sites on the website group platform architecture effectively eliminates information islands between the sites and realizes information sharing and resource integration. It also makes it more convenient to operate and maintain the member sites as an integrated site group.
Dynamic VM Provisioning for TORQUE in a Cloud Environment
NASA Astrophysics Data System (ADS)
Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.
2014-06-01
Cloud computing, in the form of Infrastructure-as-a-Service (IaaS), is attracting more interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high-throughput computing clusters.
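As a rough sketch of the glue code such a tool requires (not the authors' implementation), the Python fragment below boots a worker VM with the standard OpenStack command-line client and then registers it with the TORQUE server via qmgr; the image, flavor, network and node names are placeholders, and the exact CLI flags and qmgr syntax vary between OpenStack and TORQUE versions.

    import subprocess

    def run(cmd):
        """Run a shell command, echo it, and fail loudly on error."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    node_name = "cloud-worker-01"                     # placeholder hostname

    # 1. Boot a worker VM on the cloud (image/flavor/network names are assumptions).
    run(["openstack", "server", "create",
         "--image", "torque-worker-image",
         "--flavor", "m1.large",
         "--network", "cluster-net",
         "--wait", node_name])

    # 2. Register the new node with the TORQUE server so Maui can schedule jobs onto it.
    run(["qmgr", "-c", "create node {} np=8".format(node_name)])

    # 3. When demand drops, drain the node and release the cloud resources again.
    # run(["qmgr", "-c", "delete node {}".format(node_name)])
    # run(["openstack", "server", "delete", node_name])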
Making Spatial Statistics Service Accessible On Cloud Platform
NASA Astrophysics Data System (ADS)
Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.
2014-04-01
Web services can bring together applications running on diverse platforms, allowing users to access and share data, information and models more effectively and conveniently from a common web service platform. Cloud computing has emerged as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. With the rapid growth of massive data and the limitations of the network, traditional web service platforms face prominent problems such as computational efficiency, maintenance cost and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that this spatial statistics service is conveniently accessible to the public and offers high processing efficiency.
Social media: physicians-to-physicians education and communication.
Fehring, Keith A; De Martino, Ivan; McLawhorn, Alexander S; Sculco, Peter K
2017-06-01
Physician-to-physician communication is essential for the transfer of ideas, surgical experience, and education. Social networks and online video educational content have grown exponentially in recent years, changing the interaction among physicians. Social media platforms can improve physician-to-physician communication mostly through video education and social networking. There are several online video platforms for orthopedic surgery with educational content on diagnosis, treatment, outcomes, and surgical technique. Social networking, by contrast, is mostly centered on sharing of data, discussion of confidential topics, and job seeking. Quality of educational content and data confidentiality represent the major drawbacks of these platforms. Orthopedic surgeons must be aware that the quality of the videos should be better controlled and regulated to avoid inaccurate information that may have a significant impact, especially on trainees, who are more prone to use this type of resource. Sharing of data and discussion of confidential topics should be made extremely secure according to HIPAA regulations in order to protect patients' confidentiality.
Using Social Media to Support the Learning Needs of Future IS Security Professionals
ERIC Educational Resources Information Center
Neville, Karen; Heavin, Ciara
2013-01-01
The emergence of social media has forced educators to think differently about the way learning occurs. Students and practitioners alike are using new technologies to connect with peers/colleagues, share ideas, resources and experiences for extracurricular activities. The social business gaming platform considered in this study leverages the social…
C3: A Collaborative Web Framework for NASA Earth Exchange
NASA Astrophysics Data System (ADS)
Foughty, E.; Fattarsi, C.; Hardoyo, C.; Kluck, D.; Wang, L.; Matthews, B.; Das, K.; Srivastava, A.; Votava, P.; Nemani, R. R.
2010-12-01
The NASA Earth Exchange (NEX) is a new collaboration platform for the Earth science community that provides a mechanism for scientific collaboration and knowledge sharing. NEX combines NASA advanced supercomputing resources, Earth system modeling, workflow management, NASA remote sensing data archives, and a collaborative communication platform to deliver a complete work environment in which users can explore and analyze large datasets, run modeling codes, collaborate on new or existing projects, and quickly share results among the Earth science communities. NEX is designed primarily for use by the NASA Earth science community to address scientific grand challenges. The NEX web portal component provides an on-line collaborative environment for sharing of Earth science models, data, analysis tools and scientific results by researchers. In addition, the NEX portal also serves as a knowledge network that allows researchers to connect and collaborate based on the research they are involved in, specific geographic area of interest, field of study, etc. Features of the NEX web portal include member profiles, resource sharing (data sets, algorithms, models, publications), communication tools (commenting, messaging, social tagging), project tools (wikis, blogs) and more. The NEX web portal is built on the proven technologies and policies of DASHlink.arc.nasa.gov (one of NASA's first science social media websites). The core component of the web portal is a C3 framework, which was built using Django and which is being deployed as a common framework for a number of collaborative sites throughout NASA.
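Since the abstract notes that the C3 framework is built with Django, a fragment like the one below gives a feel for how a shared resource with social tagging might be modelled; it is a hypothetical sketch that belongs in the models.py of an already configured Django project, and none of the model or field names are taken from the actual NEX/C3 schema.

    # models.py inside a Django app (assumes a configured Django project)
    from django.conf import settings
    from django.db import models

    class SharedResource(models.Model):
        """A dataset, algorithm, model, or publication shared through the portal."""
        RESOURCE_TYPES = [("dataset", "Dataset"), ("algorithm", "Algorithm"),
                          ("model", "Model"), ("publication", "Publication")]
        title = models.CharField(max_length=200)
        resource_type = models.CharField(max_length=20, choices=RESOURCE_TYPES)
        description = models.TextField(blank=True)
        owner = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE,
                                  related_name="shared_resources")
        created = models.DateTimeField(auto_now_add=True)

    class Tag(models.Model):
        """Free-form social tag attached by community members."""
        label = models.CharField(max_length=50, unique=True)
        resources = models.ManyToManyField(SharedResource, related_name="tags")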
Cui, Wenbin; Zheng, Peiyong; Yang, Jiahong; Zhao, Rong; Gao, Jiechun; Yu, Guangjun
2015-02-01
Biobanks are important resources and central tools for translational medicine, which brings scientific research outcomes to clinical practice. The key purpose of biobanking in translational medicine and other medical research is to provide biological samples that are integrated with clinical information. In 2008, the Shanghai Municipal Government launched the "Shanghai Tissue Bank" in an effort to promote research in translational medicine. Now a sharing service platform has been constructed to integrate clinical practice and biological information that can be used in diverse medical and pharmaceutical research studies. The platform collects two kinds of data: sample data and clinical data. The sample data are obtained from the hospital biobank management system, and mainly include the donors' age, gender, marital status, sample source, sample type, collection time, deposit time, and storage method. The clinical data are collected from the "Hospital-Link" system (a medical information sharing system that connects 23 tertiary hospitals in Shanghai). The main contents include donors' corresponding medication information, test reports, inspection reports, and hospital information. As of the end of September 2014, the project has a collection of 16,020 donors and 148,282 samples, which were obtained from 12 medical institutions, and automatically acquired donors' corresponding clinical data from the "Hospital-Link" system for 6830 occurrences. This project will contribute to scientific research at medical institutions in Shanghai, and will also support the development of the biopharmaceutical industry. In this article, we will describe the significance, the construction phases, the application prospects, and benefits of the sample repository and information sharing service platform.
Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun
2012-01-01
Next Generation Sequencing is highly resource intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network-attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.
A General Water Resources Regulation Software System in China
NASA Astrophysics Data System (ADS)
LEI, X.
2017-12-01
To avoid iterative development of core modules for normal and emergency water resources regulation, and to improve the maintainability and upgradability of regulation models and business logic, a general water resources regulation software framework was developed based on the collection and analysis of common demands for water resources regulation and emergency management. It provides a customizable, extensible software framework supporting secondary development for the three-level platform "MWR-Basin-Province". Meanwhile, this general software system can realize business collaboration and information sharing of water resources regulation schemes among the three-level platforms, so as to improve national decision-making capability for water resources regulation. The general software system involves four main modules: 1) a complete set of general water resources regulation modules that allows secondary developers to custom-develop water resources regulation decision-making systems; 2) a complete set of model base and model computing software released in the form of cloud services; 3) a complete set of tools to build the concept map and model system of basin water resources regulation, as well as a model management system to calibrate and configure model parameters; 4) a database that satisfies the business and functional requirements of general water resources regulation software and provides technical support for building basin or regional water resources regulation models.
[Current status and prospects of traditional Chinese medicine resource ex-situ conservation].
Que, Ling; Yang, Guang; Miao, Jian-Hua; Wang, Hai-Yang; Chen, Min; Zang, Chun-Xin
2016-10-01
Protection of traditional Chinese medicine (TCM) resources is the foundation of sustainable development of the TCM industry, and it includes both in-situ and ex-situ conservation. The development of TCM resource ex-situ conservation was reviewed, and hot spots in the conservation and its development practices were analyzed. National TCM resource ex-situ conservation systems were therefore proposed, including the establishment of TCM resource introduction gardens, a TCM resource in vitro conservation library and a TCM resource bio-information sharing platform, the rational distribution of TCM resource ex-situ conservation agencies, the advancement of TCM variety breeding, and the improvement of the Chinese herbal medicine seed and seedling market, all of which are of significant importance for guiding the development of TCM resource ex-situ conservation. Copyright© by the Chinese Pharmaceutical Association.
Using Web 2.0 Technology to Enhance, Scaffold and Assess Problem-Based Learning
ERIC Educational Resources Information Center
Hack, Catherine
2013-01-01
Web 2.0 technologies, such as social networks, wikis, blogs, and virtual worlds provide a platform for collaborative working, facilitating sharing of resources and joint document production. They can act as a stimulus to promote active learning and provide an engaging and interactive environment for students, and as such align with the philosophy…
SMT-Aware Instantaneous Footprint Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy, Probir; Liu, Xu; Song, Shuaiwen
Modern architectures employ simultaneous multithreading (SMT) to increase thread-level parallelism. SMT threads share many functional units and the whole memory hierarchy of a physical core. Without a careful code design, SMT threads can easily contend with each other for these shared resources, causing severe performance degradation. Minimizing SMT thread contention for HPC applications running on dedicated platforms is very challenging, because they usually spawn threads within Single Program Multiple Data (SPMD) models. To address this important issue, we introduce a simple scheme for SMT-aware code optimization, which aims to reduce the memory contention across SMT threads.
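One generic mitigation for the contention described above is to keep compute-bound workers off SMT sibling threads altogether. The Linux-only Python sketch below illustrates that idea by pinning each worker process to one logical CPU per physical core; it is not the optimization proposed in the paper, and it assumes the common layout in which logical CPUs 0..N-1 map to distinct physical cores on a 2-way SMT machine.

    import os
    from multiprocessing import Process

    def worker(cpu_id, iterations=10_000_000):
        # Pin this process to a single logical CPU (Linux-only call).
        os.sched_setaffinity(0, {cpu_id})
        total = 0
        for i in range(iterations):       # stand-in for a compute-bound kernel
            total += i * i
        print("worker on CPU", cpu_id, "done, checksum", total % 97)

    if __name__ == "__main__":
        physical_cores = max(1, os.cpu_count() // 2)   # assumes 2-way SMT
        procs = [Process(target=worker, args=(c,)) for c in range(physical_cores)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()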
Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing
NASA Technical Reports Server (NTRS)
Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce
2011-01-01
Cloud Computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source Cloud platform intended to: a) make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs; b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.
Discrete event command and control for networked teams with multiple missions
NASA Astrophysics Data System (ADS)
Lewis, Frank L.; Hudas, Greg R.; Pang, Chee Khiang; Middleton, Matthew B.; McMurrough, Christopher
2009-05-01
During mission execution in military applications, the TRADOC Pamphlet 525-66 Battle Command and Battle Space Awareness capabilities prescribe expectations that networked teams will perform in a reliable manner under changing mission requirements, varying resource availability and reliability, and resource faults. In this paper, a Command and Control (C2) structure is presented that allows for computer-aided execution of the networked team decision-making process, control of force resources, shared resource dispatching, and adaptability to change based on battlefield conditions. A mathematically justified networked computing environment is provided called the Discrete Event Control (DEC) Framework. DEC has the ability to provide the logical connectivity among all team participants including mission planners, field commanders, war-fighters, and robotic platforms. The proposed data management tools are developed and demonstrated on a simulation study and an implementation on a distributed wireless sensor network. The results show that the tasks of multiple missions are correctly sequenced in real-time, and that shared resources are suitably assigned to competing tasks under dynamically changing conditions without conflicts and bottlenecks.
Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform
NASA Astrophysics Data System (ADS)
Shrestha, S. R.; Viswambharan, V.; Doshi, A.
2017-12-01
Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as creating information products from large volumes of data using the image server technology, are becoming widely used in earth science and across other domains. We will discuss the big data challenges faced by the geospatial science community and how we have addressed them in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on-premise or on the cloud), disseminate them dynamically, process and analyze them on-the-fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing using raster function capabilities can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.
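To give a flavour of the on-the-fly access described above, the sketch below requests a dynamically rendered extract from an ArcGIS image service through its REST exportImage operation; the service URL and bounding box are placeholders, and the parameters shown are the commonly documented ones rather than anything specific to the HRRR example.

    import requests

    # Placeholder image service URL; any ArcGIS Server / ArcGIS Online image service fits here.
    SERVICE = "https://example.com/arcgis/rest/services/Weather/HRRR/ImageServer"

    params = {
        "bbox": "-95.0,29.0,-94.0,30.0",   # xmin, ymin, xmax, ymax (placeholder extent)
        "bboxSR": 4326,                    # bounding box given in WGS84
        "size": "512,512",                 # output raster dimensions in pixels
        "format": "tiff",
        "f": "image",                      # return the rendered image itself
    }
    resp = requests.get(SERVICE + "/exportImage", params=params, timeout=60)
    resp.raise_for_status()
    with open("extract.tif", "wb") as fh:
        fh.write(resp.content)
    print("saved", len(resp.content), "bytes")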
Data Mashups: Linking Human Health and Wellbeing with Weather, Climate and the Environment
NASA Astrophysics Data System (ADS)
Fleming, L. E.; Sarran, C.; Golding, B.; Haines, A.; Kessel, A.; Djennad, M.; Hajat, S.; Nichols, G.; Gordon Brown, H.; Depledge, M.
2016-12-01
A large part of the global disease burden can be linked to environmental factors, underpinned by unhealthy behaviours. Research into these linkages suffers from lack of common tools and databases for investigations across many different scientific disciplines to explore these complex associations. The MEDMI (Medical and Environmental Data-a Mash-up Infrastructure) Partnership brings together leading organisations and researchers in climate, weather, environment, and human health. We have created a proof-of-concept central data and analysis system with the UK Met Office and Public Health England data as the internet-based MEDMI Platform (www.data-mashup.org.uk) to serve as a common resource for researchers to link and analyse complex meteorological, environmental and epidemiological data in the UK. The Platform is hosted on its own dedicated server, with secure internet and in-person access with appropriate safeguards for ethical, copyright, security, preservation, and data sharing issues. Via the Platform, there is a demonstration Browser Application with access to user-selected subsets of the data for: a) analyses using time series (e.g. mortality/environmental variables), and b) data visualizations (e.g. infectious diseases/environmental variables). One demonstration project is linking climate change, harmful algal blooms and oceanographic modelling building on the hydrodynamic-biogeochemical coupled models; in situ and satellite observations as well as UK HAB data and hospital episode statistics data are being used for model verification and future forecasting. The MEDMI Project provides a demonstration of the potential, barriers and challenges, of these "data mashups" of environment and health data. Although there remain many challenges to creating and sustaining such a shared resource, these activities and resources are essential to truly explore the complex interactions between climate and other environmental change and health at the local and global scale.
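The core operation behind such mashups is joining health records to environmental observations by time and place. The self-contained pandas sketch below illustrates that linkage with toy data; the column names and values are invented for illustration and are not MEDMI's actual schema.

    import pandas as pd

    # Toy daily respiratory admission counts for one region (placeholder data).
    health = pd.DataFrame({
        "date": pd.to_datetime(["2015-07-01", "2015-07-02", "2015-07-03"]),
        "region": ["SW", "SW", "SW"],
        "admissions": [12, 19, 15],
    })

    # Toy daily weather and air-quality observations for the same region (placeholder data).
    weather = pd.DataFrame({
        "date": pd.to_datetime(["2015-07-01", "2015-07-02", "2015-07-03"]),
        "region": ["SW", "SW", "SW"],
        "max_temp_c": [24.1, 31.5, 27.0],
        "pm25": [8.2, 14.9, 10.3],
    })

    # Link the two series on date and region, then inspect a simple association.
    linked = health.merge(weather, on=["date", "region"], how="inner")
    print(linked)
    print("correlation (admissions vs max temp):",
          round(linked["admissions"].corr(linked["max_temp_c"]), 2))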
NASA Astrophysics Data System (ADS)
Yen, Y.-N.; Wu, Y.-W.; Weng, K.-H.
2013-07-01
E-learning-assisted teaching and learning is the trend of the 21st century and has many advantages - freedom from the constraints of time and space, hypertext and multimedia-rich resources - enhancing the interaction between students and the teaching materials. The purpose of this study is to explore how rich Internet resources assisted students with the Western Architectural History course. First, we explored the Internet resources which could assist teaching and learning activities. Second, according to course objectives, we built a web-based platform which integrated the Google Spreadsheets form, the SIMILE widget, Wikipedia and Google Maps, and applied it to the course of Western Architectural History. Finally, action research was applied to understanding the effectiveness of this teaching/learning mode. Participants were students of the Department of Architecture at a private university of technology in northern Taiwan. Results showed that students were willing to use the web-based platform to assist their learning. They found this platform to be useful in understanding the relationship between buildings of different periods. Through the map view mode, this platform also helped students expand their international perspective. However, we found that the information shared by students via the Internet was not always correct. One possible reason was that students could easily acquire information on the Internet but could not determine its correctness. To conclude, this study found some useful and rich resources that could be well integrated, from which we built a web-based platform to collect information and present it in diverse modes to stimulate students' learning motivation. We recommend that future studies consider hiring teaching assistants in order to ease the burden on teachers and to assist in the maintenance of information quality.
GeoChronos: An On-line Collaborative Platform for Earth Observation Scientists
NASA Astrophysics Data System (ADS)
Gamon, J. A.; Kiddle, C.; Curry, R.; Markatchev, N.; Zonta-Pastorello, G., Jr.; Rivard, B.; Sanchez-Azofeifa, G. A.; Simmonds, R.; Tan, T.
2009-12-01
Recent advances in cyberinfrastructure are offering new solutions to the growing challenges of managing and sharing large data volumes. Web 2.0 and social networking technologies provide the means for scientists to collaborate and share information more effectively. Cloud computing technologies can provide scientists with transparent and on-demand access to applications served over the Internet in a dynamic and scalable manner. Semantic Web technologies allow for data to be linked together in a manner understandable by machines, enabling greater automation. Combining all of these technologies together can enable the creation of very powerful platforms. GeoChronos (http://geochronos.org/), part of a CANARIE Network Enabled Platforms project, is an online collaborative platform that incorporates these technologies to enable members of the earth observation science community to share data and scientific applications and to collaborate more effectively. The GeoChronos portal is built on an open source social networking platform called Elgg. Elgg provides a full set of social networking functionalities similar to Facebook, including blogs, tags, media/document sharing, wikis, friends/contacts, groups, discussions, message boards, calendars, status, activity feeds and more. An underlying cloud computing infrastructure enables scientists to access dynamically provisioned applications via the portal for visualizing and analyzing data. Users are able to access and run the applications from any computer that has a Web browser and Internet connectivity and do not need to manage and maintain the applications themselves. Semantic Web technologies, such as the Resource Description Framework (RDF), are being employed for relating and linking together spectral, satellite, meteorological and other data. Social networking functionality plays an integral part in facilitating the sharing of data and applications. Examples of recent GeoChronos users during the early testing phase have included the IAI International Wireless Sensor Networking Summer School at the University of Alberta, and the IAI Tropi-Dry community. Current GeoChronos activities include the development of a web-based spectral library and related analytical and visualization tools, in collaboration with members of the SpecNet community. The GeoChronos portal will be open to all members of the earth observation science community when the project nears completion at the end of 2010.
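As a small illustration of the Semantic Web layer mentioned above, the sketch below uses the rdflib library to describe a spectral dataset and link it to a meteorological record with Dublin Core terms; the namespace, class names and property choices are assumptions for illustration, not the GeoChronos data model.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import DCTERMS, RDF

    EX = Namespace("http://example.org/geochronos/")   # placeholder namespace

    g = Graph()
    spectra = EX["dataset/spectra-0001"]
    weather = EX["dataset/met-station-07"]

    g.add((spectra, RDF.type, EX.SpectralDataset))
    g.add((spectra, DCTERMS.title, Literal("Canopy reflectance, site A, 2009")))
    g.add((spectra, DCTERMS.creator, Literal("Example Field Team")))
    # Link the spectral dataset to the meteorological record collected alongside it.
    g.add((spectra, DCTERMS.relation, weather))
    g.add((weather, RDF.type, EX.MeteorologicalDataset))

    print(g.serialize(format="turtle"))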
A Model Collaborative Platform for Geoscience Education
NASA Astrophysics Data System (ADS)
Fox, S.; Manduca, C. A.; Iverson, E. A.
2012-12-01
Over the last decade SERC at Carleton College has developed a collaborative platform for geoscience education that has served dozens of projects, thousands of community authors and millions of visitors. The platform combines a custom technical infrastructure: the SERC Content Management system (CMS), and a set of strategies for building web-resources that can be disseminated through a project site, reused by other projects (with attribution) or accessed via an integrated geoscience education resource drawing from all projects using the platform. The core tools of the CMS support geoscience education projects in building project-specific websites. Each project uses the CMS to engage their specific community in collecting, authoring and disseminating the materials of interest to them. At the same time the use of a shared central infrastructure allows cross-fertilization among these project websites. Projects are encouraged to use common templates and common controlled vocabularies for organizing and displaying their resources. This standardization is then leveraged through cross-project search indexing which allow projects to easily incorporate materials from other projects within their own collection in ways that are relevant and automated. A number of tools are also in place to help visitors move among project websites based on their personal interests. Related links help visitors discover content related topically to their current location that is in a 'separate' project. A 'best bets' feature in search helps guide visitors to pages that are good starting places to explore resources on a given topic across the entire range of hosted projects. In many cases these are 'site guide' pages created specifically to promote a cross-project view of the available resources. In addition to supporting the cross-project exploration of specific themes the CMS also allows visitors to view the combined suite of resources authored by any particular community member. Automatically generated author profiles highlight the contributions an individual has made through any of the projects with an option for customization by the author. An overarching portal site provides a unified view of resources within this diverse set of geoscience education projects. The SERC CMS provides a common platform upon which individual projects can build their own identities, while allowing cross-project pollination and synergies to be realized without significant extra investment by each project. This is a sustainable model for a collaborative platform that takes advantage of the energy and resources of individual projects to advance larger community goals.
Selection and Use of Online Learning Resources by First-Year Medical Students: Cross-Sectional Study
Elliott, Kristine
2017-01-01
Background Medical students have access to a wide range of learning resources, many of which have been specifically developed for or identified and recommended to them by curriculum developers or teaching staff. There is an expectation that students will access and use these resources to support their self-directed learning. However, medical educators lack detailed and reliable data about which of these resources students use to support their learning and how this use relates to key learning events or activities. Objective The purpose of this study was to comprehensively document first-year medical student selection and use of online learning resources to support their bioscience learning within a case-based curriculum and assess these data in relation to our expectations of student learning resource requirements and use. Methods Study data were drawn from 2 sources: a survey of student learning resource selection and use (2013 cohort; n=326) and access logs from the medical school learning platform (2012 cohort; n=337). The paper-based survey, which was distributed to all first-year students, was designed to assess the frequency and types of online learning resources accessed by students and included items about their perceptions of the usefulness, quality, and reliability of various resource types and sources. Of 237 surveys returned, 118 complete responses were analyzed (36.2% response rate). Usage logs from the learning platform for an entire semester were processed to provide estimates of first-year student resource use on an individual and cohort-wide basis according to method of access, resource type, and learning event. Results According to the survey data, students accessed learning resources via the learning platform several times per week on average, slightly more often than they did for resources from other online sources. Google and Wikipedia were the most frequently used nonuniversity sites, while scholarly information sites (eg, online journals and scholarly databases) were accessed relatively infrequently. Students were more likely to select learning resources based on the recommendation of peers than of teaching staff. The overwhelming majority of the approximately 70,000 resources accessed by students via the learning platform were lecture notes, with each accessed an average of 167 times. By comparison, recommended journal articles and (online) textbook chapters were accessed only 49 and 31 times, respectively. The number and type of learning resources accessed by students through the learning platform was highly variable, with a cluster analysis revealing that a quarter of students accessed very few resources in this way. Conclusions Medical students have easy access to a wide range of quality learning resources, and while some make good use of the learning resources recommended to them, many ignore most and access the remaining ones infrequently. Learning analytics can provide useful measures of student resource access through university learning platforms but fails to account for resources accessed via external online sources or sharing of resources using social media. PMID:28970187
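A simplified version of the log processing and cluster analysis described in the results can be reproduced with pandas and scikit-learn, as sketched below; the log fields, toy values and the choice of two clusters are assumptions for illustration only.

    import pandas as pd
    from sklearn.cluster import KMeans

    # Toy access log: one row per resource access on the learning platform (placeholder fields).
    log = pd.DataFrame({
        "student_id": [1, 1, 1, 2, 2, 3, 3, 3, 3, 4],
        "resource_type": ["lecture", "lecture", "journal", "lecture", "textbook",
                          "lecture", "lecture", "lecture", "journal", "lecture"],
    })

    # Per-student access counts by resource type form the feature matrix for clustering.
    features = (log.groupby(["student_id", "resource_type"]).size()
                   .unstack(fill_value=0))

    # Group students by access behaviour; here k=2 separates low-use from high-use students.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
    features["cluster"] = km.labels_
    print(features)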
Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.
Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y
2018-02-01
The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate model sharing, and there are corresponding initiatives by the scientific journals. Outside the publishing enterprise, infrastructure to facilitate model sharing in biomechanics exists, and simulation software developers are interested in accommodating the community's needs for sharing of modeling resources. Encouragement for the use of standardized markups, concerns related to quality assurance, acknowledgement of increased burden, and importance of stewardship of resources are noted. In the short-term, it is advisable that the community builds upon recent strategies and experiments with new pathways for continued demonstration of model sharing, its promotion, and its utility. Nonetheless, the need for a long-term strategy to unify approaches in sharing computational models and related resources is acknowledged. Development of a sustainable platform supported by a culture of open model sharing will likely evolve through continued and inclusive discussions bringing all stakeholders at the table, e.g., by possibly establishing a consortium.
NASA Astrophysics Data System (ADS)
Nelson, J.; Ames, D. P.; Jones, N.; Tarboton, D. G.; Li, Z.; Qiao, X.; Crawley, S.
2016-12-01
As water resources data continue to move to the web in the form of well-defined, open access, machine readable web services provided by government, academic, and private institutions, there is increased opportunity to move additional parts of the water science workflow to the web (e.g. analysis, modeling, decision support, and collaboration). Creating such web-based functionality can be extremely time-consuming and resource-intensive and can lead the erstwhile water scientist down a veritable cyberinfrastructure rabbit hole, through an unintended tunnel of transformation to become a Cyber-Wonderland software engineer. We posit that such transformations were never the intention of the research programs that fund earth science cyberinfrastructure, nor is it in the best interest of water researchers to spend exorbitant effort developing and deploying such technologies. This presentation will introduce a relatively simple and ready-to-use water science web app environment funded by the National Science Foundation that couples the new HydroShare data publishing system with the Tethys Platform web app development toolkit. The coupled system has already been shown to greatly lower the barrier to deploying web-based visualization and analysis tools for the CUAHSI Water Data Center and for the National Weather Service's National Water Model. The design and implementation of the developed web app architecture will be presented together with key examples of existing apps created using this system. In each of the cases presented, water resources students with basic programming skills were able to develop and deploy highly functional web apps in a relatively short period of time (days to weeks) - allowing the focus to remain on water science rather than on cyberinfrastructure. This presentation is accompanied by an open invitation for new collaborations that use the HydroShare-Tethys web app environment.
Glover, Jason; Man, Tsz-Kwong; Barkauskas, Donald A; Hall, David; Tello, Tanya; Sullivan, Mary Beth; Gorlick, Richard; Janeway, Katherine; Grier, Holcombe; Lau, Ching; Toretsky, Jeffrey A; Borinstein, Scott C; Khanna, Chand; Fan, Timothy M
2017-01-01
The prospective banking of osteosarcoma tissue samples to promote research endeavors has been realized through the establishment of a nationally centralized biospecimen repository, the Children's Oncology Group (COG) biospecimen bank located at the Biopathology Center (BPC)/Nationwide Children's Hospital in Columbus, Ohio. Although the physical inventory of osteosarcoma biospecimens is substantive (>15,000 sample specimens), the nature of these resources remains exhaustible. Despite judicious allocation of these high-value biospecimens for conducting sarcoma-related research, a deeper understanding of osteosarcoma biology, in particular metastases, remains unrealized. In addition, the identification and development of novel diagnostics and effective therapeutics remain elusive. The QuadW-COG Childhood Sarcoma Biostatistics and Annotation Office (CSBAO) has developed the High Dimensional Data (HDD) platform to complement the existing physical inventory and to promote in silico hypothesis testing in sarcoma biology. The HDD is a relational biologic database derived from matched osteosarcoma biospecimens in which diverse experimental readouts have been generated and digitally deposited. As a proof of concept, we demonstrate that the HDD platform can be utilized to address previously unrealized biologic questions through the systematic juxtaposition of diverse datasets derived from shared biospecimens. The continued population of the HDD platform with high-value, high-throughput and mineable datasets provides a shared and reusable resource for researchers, both experimentalists and bioinformatics investigators, to propose and answer questions in silico that advance our understanding of osteosarcoma biology.
Wen, Can-Hong; Ou, Shao-Min; Guo, Xiao-Bo; Liu, Chen-Feng; Shen, Yan-Bo; You, Na; Cai, Wei-Hong; Shen, Wen-Jun; Wang, Xue-Qin; Tan, Hai-Zhu
2017-12-12
Breast cancer is a high-risk heterogeneous disease with myriad subtypes and complicated biological features. The Cancer Genome Atlas (TCGA) breast cancer database provides researchers with large-scale genome and clinical data via web portals and FTP services. Researchers are able to gain new insights into their related fields and evaluate experimental discoveries with TCGA. However, TCGA's complex data formats and diverse files make it difficult to access and operate on for researchers who have little experience with databases and bioinformatics. For ease of use, we built the breast cancer (B-CAN) platform, which enables data customization, data visualization, and a private data center. The B-CAN platform runs on an Apache server and interacts with a backend MySQL database through PHP. Users can customize data based on their needs by combining tables from the original TCGA database and selecting variables from each table. The private data center is applicable to private data and two types of customized data. A key feature of B-CAN is that it provides both single-table and multiple-table displays. Customized data with one barcode corresponding to many records, as well as processed customized data, are allowed in the multiple-table display. B-CAN is an intuitive and highly efficient data-sharing platform.
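The kind of customization B-CAN performs, combining a sample table and a clinical table that share a barcode key where one barcode may map to many clinical records, can be sketched with standard SQL. The self-contained example below uses Python's built-in sqlite3 purely for illustration (the platform itself is PHP/MySQL), and all table and column names are placeholders.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    cur.executescript("""
    CREATE TABLE sample (barcode TEXT, sample_type TEXT, collection_time TEXT);
    CREATE TABLE clinical (barcode TEXT, test_name TEXT, result TEXT);
    INSERT INTO sample VALUES ('TCGA-0001', 'tumor tissue', '2014-03-02'),
                              ('TCGA-0002', 'blood', '2014-03-05');
    INSERT INTO clinical VALUES ('TCGA-0001', 'ER status', 'positive'),
                                ('TCGA-0001', 'HER2 status', 'negative'),
                                ('TCGA-0002', 'ER status', 'negative');
    """)

    # The join yields multiple rows per sample when one barcode has several clinical records.
    rows = cur.execute("""
        SELECT s.barcode, s.sample_type, c.test_name, c.result
        FROM sample AS s JOIN clinical AS c ON s.barcode = c.barcode
        ORDER BY s.barcode
    """).fetchall()
    for row in rows:
        print(row)
    conn.close()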
ERIC Educational Resources Information Center
Loya, Melody Aye; Klemm, Terri
2016-01-01
Focusing on TED Talks (online videos) as a resource for social work educators, this teaching note shares our ideas regarding the use of the online videos as an avenue for reaching students and encouraging discussions in the social work classroom. The article first explores the TED platform and then discusses using TED as a teaching tool. Finally,…
Biopiracy of natural products and good bioprospecting practice.
Efferth, Thomas; Banerjee, Mita; Paul, Norbert W; Abdelfatah, Sara; Arend, Joachim; Elhassan, Gihan; Hamdoun, Sami; Hamm, Rebecca; Hong, Chunlan; Kadioglu, Onat; Naß, Janine; Ochwangi, Dominic; Ooko, Edna; Ozenver, Nadire; Saeed, Mohamed E M; Schneider, Mathias; Seo, Ean-Jeong; Wu, Ching-Fen; Yan, Ge; Zeino, Maen; Zhao, Qiaoli; Abu-Darwish, Mohammad S; Andersch, Kai; Alexie, Gladys; Bessarab, Dawn; Bhakta-Guha, Dipita; Bolzani, Vanderlan; Dapat, Else; Donenko, Fedor V; Efferth, Monika; Greten, Henry J; Gunatilaka, Leslie; Hussein, Ahmed A; Karadeniz, Asuman; Khalid, Hassan E; Kuete, Victor; Lee, Ik-Soo; Liu, Liang; Midiwo, Jacob; Mora, Rodrigo; Nakagawa, Hiroshi; Ngassapa, Olipa; Noysang, Chanai; Omosa, Leonida K; Roland, Fred Hwiemtun; Shahat, Abdelaaty A; Saab, Antoine; Saeed, Elfatih M; Shan, Letian; Titinchi, Salam J J
2016-02-15
Biopiracy mainly focuses on the use of biological resources and/or knowledge of indigenous tribes or communities without allowing them to share the revenues generated out of economic exploitation or other non-monetary incentives associated with the resource/knowledge. Based on collaborations of scientists from five continents, we have created a communication platform to discuss not only scientific topics, but also more general issues with social relevance. This platform was termed 'PhytCancer - Phytotherapy to Fight Cancer' (www.phyt-cancer.uni-mainz.de). As a starting point, we have chosen the topic "biopiracy", since we feel this is of pragmatic significance for scientists working with medicinal plants. It was argued that the patenting of herbs or natural products by pharmaceutical corporations disregarded the ownership of the knowledge possessed by the indigenous communities on how these substances worked. Despite numerous court decisions in the U.S.A. and Europe and several international treaties (e.g. from the United Nations, the World Health Organization, the World Trade Organization, the African Unity and others), sharing of a rational set of benefits amongst producers (mainly pharmaceutical companies) and indigenous communities is still a distant reality. In this paper, we present an overview of the legal frameworks, discuss some exemplary cases of biopiracy, and highlight bioprospecting as an excellent form of utilization of natural resources. We suggest certain perspectives by which we, as scientists, may contribute towards the prevention of biopiracy and foster the fair utilization of natural resources. We discuss ways in which the interests of indigenous people, especially those from developing countries, can be secured. Copyright © 2015 Elsevier GmbH. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-25
The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware, but run actual software enabling large scale without sacrificing fidelity.
Open-source platforms for navigated image-guided interventions.
Ungi, Tamas; Lasso, Andras; Fichtinger, Gabor
2016-10-01
Navigation technology is changing the clinical standards in medical interventions by making existing procedures more accurate, and new procedures possible. Navigation is based on preoperative or intraoperative imaging combined with 3-dimensional position tracking of interventional tools registered to the images. Research of navigation technology in medical interventions requires significant engineering efforts. The difficulty of developing such complex systems has been limiting the clinical translation of new methods and ideas. A key to the future success of this field is to provide researchers with platforms that allow rapid implementation of applications with minimal resources spent on reimplementing existing system features. A number of platforms have been already developed that can share data in real time through standard interfaces. Complete navigation systems can be built using these platforms using a layered software architecture. In this paper, we review the most popular platforms, and show an effective way to take advantage of them through an example surgical navigation application. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Heath Pastore, Tracy; Barnes, Mitchell; Hallman, Rory
2005-05-01
Robot technology is developing at a rapid rate for both commercial and Department of Defense (DOD) applications. As a result, the task of managing both technology and experience information is growing. In the not-too-distant past, tracking development efforts of robot platforms, subsystems and components was not too difficult, expensive, or time-consuming. To do the same today is a significant undertaking. The Mobile Robot Knowledge Base (MRKB) provides the robotics community with a web-accessible, centralized resource for sharing information, experience, and technology to more efficiently and effectively meet the needs of the robot system user. The resource includes searchable information on robot components, subsystems, mission payloads, platforms, and DOD robotics programs. In addition, the MRKB website provides a forum for technology and information transfer within the DOD robotics community and an interface for the Robotic Systems Pool (RSP). The RSP manages a collection of small teleoperated and semi-autonomous robotic platforms, available for loan to DOD and other qualified entities. The objective is to put robots in the hands of users and use the test data and fielding experience to improve robot systems.
Flexible workflow sharing and execution services for e-scientists
NASA Astrophysics Data System (ADS)
Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely
2013-04-01
The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides to space and earth science researchers based on the platform. The SHIWA Simulation Platform includes: 1. SHIWA Repository: A database where workflows and meta-data about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: A web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: A desktop environment that provides access capabilities similar to those of the SHIWA Portal; however, it runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already integrated with the execution engine of the SHIWA Portal. Other engines can be added when required. Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. Via third-party workflow engines, the Portal provides support for the most widely used academic workflow engines and it can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows even when those are written in different languages and require different interpreters for execution. Through the workflow repository and the portal, individual scientists and scientific collaborations can share and offer workflows for reuse and execution. Given the integrated nature of the SHIWA Simulation Platform the shared workflows can be executed online, without installing any special client environment or downloading workflows. The FP7 "Building a European Research Community through Interoperable Workflows and Data" (ER-flow) project disseminates the achievements of the SHIWA project and uses these achievements to build workflow user communities across Europe. ER-flow provides application support to research communities within and beyond the project consortium to develop, share and run workflows with the SHIWA Simulation Platform.
[Study on network architecture of a tele-medical information sharing platform].
Pan, Lin; Yu, Lun; Chen, Jin-xiong
2006-07-01
In this article, a network construction plan that satisfies the application demands of a telemedical information sharing platform is proposed. Network access schemes are chosen according to users' actual situations, through analysis of the service demands and of the many kinds of available network access technologies. Hospital servers located on the hospital LAN, which link to the sharing platform through node servers, should be separated from the broadband network of the sharing platform in order to ensure the security of the internal hospital network and its administrative management. VPN technology is used to realize secure transmission of information across the platform network. Preliminary experiments have proved that the plan is practicable.
Improvement of Resilience to Disasters in Local Community Using Information Sharing Platform
NASA Astrophysics Data System (ADS)
Hayama, Toru; Suzuki, Yuji; Park, Wonho; Hayashi, Akira
This paper presents a proposal for a Disaster Information Sharing Platform, which enables local governments and residents to share disaster information and to cope with disasters under a proper balance of self-help, mutual help and public help. Informagic, which has been developed as a concrete example of the information sharing platform, enables us to collect information from a variety of sources, such as government, local governments, research institutes, private content providers and so forth, and to transmit this information to residents through multiple media, such as the internet, mobile-phone networks and wireless systems. An experiment was conducted in cooperation with the City of Fujisawa to investigate the effectiveness of such a platform for disaster mitigation. Further, the platform was utilized to provide information to evacuees at shelters during the Iwate-Miyagi Inland Earthquake. Through these experiments, the effectiveness of and issues with the platform and information sharing were investigated.
Judd, Terry; Elliott, Kristine
2017-10-02
Medical students have access to a wide range of learning resources, many of which have been specifically developed for or identified and recommended to them by curriculum developers or teaching staff. There is an expectation that students will access and use these resources to support their self-directed learning. However, medical educators lack detailed and reliable data about which of these resources students use to support their learning and how this use relates to key learning events or activities. The purpose of this study was to comprehensively document first-year medical student selection and use of online learning resources to support their bioscience learning within a case-based curriculum and assess these data in relation to our expectations of student learning resource requirements and use. Study data were drawn from 2 sources: a survey of student learning resource selection and use (2013 cohort; n=326) and access logs from the medical school learning platform (2012 cohort; n=337). The paper-based survey, which was distributed to all first-year students, was designed to assess the frequency and types of online learning resources accessed by students and included items about their perceptions of the usefulness, quality, and reliability of various resource types and sources. Of 237 surveys returned, 118 complete responses were analyzed (36.2% response rate). Usage logs from the learning platform for an entire semester were processed to provide estimates of first-year student resource use on an individual and cohort-wide basis according to method of access, resource type, and learning event. According to the survey data, students accessed learning resources via the learning platform several times per week on average, slightly more often than they did for resources from other online sources. Google and Wikipedia were the most frequently used nonuniversity sites, while scholarly information sites (eg, online journals and scholarly databases) were accessed relatively infrequently. Students were more likely to select learning resources based on the recommendation of peers than of teaching staff. The overwhelming majority of the approximately 70,000 resources accessed by students via the learning platform were lecture notes, with each accessed an average of 167 times. By comparison, recommended journal articles and (online) textbook chapters were accessed only 49 and 31 times, respectively. The number and type of learning resources accessed by students through the learning platform was highly variable, with a cluster analysis revealing that a quarter of students accessed very few resources in this way. Medical students have easy access to a wide range of quality learning resources, and while some make good use of the learning resources recommended to them, many ignore most and access the remaining ones infrequently. Learning analytics can provide useful measures of student resource access through university learning platforms but fails to account for resources accessed via external online sources or sharing of resources using social media. ©Terry Judd, Kristine Elliott. Originally published in JMIR Medical Education (http://mededu.jmir.org), 02.10.2017.
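The cohort-wide usage estimates described above come from processing raw access logs of the learning platform. As a rough illustration of that step (not the authors' actual code), the sketch below assumes a hypothetical CSV export with student_id, resource_type and resource_id columns, computes the mean number of accesses per resource by type, and flags the low-engagement quartile of students, similar to the cluster the study identifies.

```python
# Sketch: summarise learning-platform access logs per resource type and per student.
# Assumes a hypothetical CSV export with columns: student_id, resource_type, resource_id.
import pandas as pd

logs = pd.read_csv("access_logs.csv")  # one row per resource access event

# Mean number of accesses per individual resource, grouped by resource type
per_resource = (
    logs.groupby(["resource_type", "resource_id"]).size()  # accesses per resource
        .groupby(level="resource_type").mean()             # mean accesses by type
        .sort_values(ascending=False)
)
print(per_resource)

# Total accesses per student; flag the lowest-access quartile
per_student = logs.groupby("student_id").size()
threshold = per_student.quantile(0.25)
low_engagement = per_student[per_student <= threshold]
print(f"{len(low_engagement)} of {len(per_student)} students fall in the lowest access quartile")
```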
Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)
NASA Astrophysics Data System (ADS)
Nebert, D. D.; Huang, Q.; Yang, C.
2013-12-01
Twenty-first century geoscience faces challenges of Big Data, spikes in computing requirements (e.g., when a natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discovery. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform as a service (PaaS) packages. Based on these developed common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies and approval processes for migrating federal geospatial data, information, and applications into the cloud, and cost estimation for cloud operations are covered. Finally, some lessons learned from the GeoCloud project are discussed as a reference for geoscientists to consider in the adoption of cloud computing.
Pyritz, Lennart W; Fichtel, Claudia; Huchard, Elise; Kappeler, Peter M
2013-01-01
Social animals have to coordinate joint movements to maintain group cohesion, but the latter is often compromised by diverging individual interests. A widespread behavioral mechanism to achieve coordination relies on shared or unshared consensus decision-making. If consensus costs are high, group fission represents an alternative tactic. Exploring determinants and outcomes of spontaneous group decisions and coordination of free-ranging animals is methodologically challenging. We therefore conducted a foraging experiment with a group of wild redfronted lemurs (Eulemur rufifrons) to study decision outcomes, coordination of movements, individual foraging benefits and social interactions in response to the presentation of drinking platforms with varying baiting patterns. Behavioral observations were complemented with data from recordings of motion detector cameras installed at the platforms. The animal's behavior in the experimental conditions was compared to natural group movements. We could not determine the type of consensus decision-making because the group visited platforms randomly. The group fissioned during 23.3% of platform visits, and fissioning resulted in more individuals drinking simultaneously. As under natural conditions, adult females initiated most group movements, but overtaking by individuals of different age and sex classes occurred in 67% of movements to platforms, compared to only 18% during other movements. As a result, individual resource intake at the platforms did not depend on departure position, age or sex, but on arrival order. Aggression at the platforms did not affect resource intake, presumably due to low supplanting rates. Our findings highlight the diversity of coordination processes and related consequences for individual foraging benefits in a primate group living under natural conditions.
FORCEnet Net Centric Architecture - A Standards View
2006-06-01
[Diagram residue from the report: a layered service framework/service platform figure showing user-facing services and shared services built on networking/communications, storage, computing platform, data interchange/integration, data management, and application layers.]
Glover, Jason; Man, Tsz-Kwong; Barkauskas, Donald A.; Hall, David; Tello, Tanya; Sullivan, Mary Beth; Gorlick, Richard; Janeway, Katherine; Grier, Holcombe; Lau, Ching; Toretsky, Jeffrey A.; Borinstein, Scott C.; Khanna, Chand
2017-01-01
The prospective banking of osteosarcoma tissue samples to promote research endeavors has been realized through the establishment of a nationally centralized biospecimen repository, the Children's Oncology Group (COG) biospecimen bank located at the Biopathology Center (BPC)/Nationwide Children's Hospital in Columbus, Ohio. Although the physical inventory of osteosarcoma biospecimens is substantive (>15,000 sample specimens), the nature of these resources remains exhaustible. Despite judicious allocation of these high-value biospecimens for conducting sarcoma-related research, a deeper understanding of osteosarcoma biology, in particular metastases, remains unrealized. In addition, the identification and development of novel diagnostics and effective therapeutics remain elusive. The QuadW-COG Childhood Sarcoma Biostatistics and Annotation Office (CSBAO) has developed the High Dimensional Data (HDD) platform to complement the existing physical inventory and to promote in silico hypothesis testing in sarcoma biology. The HDD is a relational biologic database derived from matched osteosarcoma biospecimens in which diverse experimental readouts have been generated and digitally deposited. As proof-of-concept, we demonstrate that the HDD platform can be utilized to address previously unrealized biologic questions through the systematic juxtaposition of diverse datasets derived from shared biospecimens. The continued population of the HDD platform with high-value, high-throughput and mineable datasets provides a shared and reusable resource for researchers, both experimentalists and bioinformatics investigators, to propose and answer questions in silico that advance our understanding of osteosarcoma biology. PMID:28732082
A study of compositional verification based IMA integration method
NASA Astrophysics Data System (ADS)
Huang, Hui; Zhang, Guoquan; Xu, Wanmeng
2018-03-01
The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system testing, so the IMA system test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are difficult to isolate. IMA system verification therefore faces the critical problem of how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but for a large, highly integrated avionics system, exhaustive testing is hard to complete. This paper therefore proposes applying compositional verification theory to IMA system testing, reducing the test effort and improving efficiency, and consequently lowering the cost of IMA system integration.
The asthma mobile health study, smartphone data collected using ResearchKit.
Chan, Yu-Feng Yvonne; Bot, Brian M; Zweig, Micol; Tignor, Nicole; Ma, Weiping; Suver, Christine; Cedeno, Rafhael; Scott, Erick R; Gregory Hershman, Steven; Schadt, Eric E; Wang, Pei
2018-05-22
Widespread adoption of smart mobile platforms, coupled with a growing ecosystem of sensors including passive location tracking and the ability to leverage external data sources, creates an opportunity to generate an unprecedented depth of data on individuals. Mobile health technologies could be utilized for chronic disease management as well as for research to advance our understanding of common diseases, such as asthma. We conducted a prospective observational asthma study to assess the feasibility of this type of approach, the clinical characteristics of cohorts recruited via a mobile platform, the validity of data collected, user retention patterns, and user data sharing preferences. We describe data and descriptive statistics from the Asthma Mobile Health Study, in which participants engaged with an iPhone application built using Apple's ResearchKit framework. Data from 6346 U.S. participants, who agreed to share their data broadly, have been made available for further research. These resources have the potential to enable the research community to work collaboratively towards improving our understanding of asthma as well as mobile health research best practices.
Integration of a neuroimaging processing pipeline into a pan-canadian computing grid
NASA Astrophysics Data System (ADS)
Lavoie-Courchesne, S.; Rioux, P.; Chouinard-Decorte, F.; Sherif, T.; Rousseau, M.-E.; Das, S.; Adalat, R.; Doyon, J.; Craddock, C.; Margulies, D.; Chu, C.; Lyttelton, O.; Evans, A. C.; Bellec, P.
2012-02-01
The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.
A web-portal for interactive data exploration, visualization, and hypothesis testing
Bartsch, Hauke; Thompson, Wesley K.; Jernigan, Terry L.; Dale, Anders M.
2014-01-01
Clinical research studies generate data that need to be shared and statistically analyzed by their participating institutions. The distributed nature of research and the different domains involved present major challenges to data sharing, exploration, and visualization. The Data Portal infrastructure was developed to support ongoing research in the areas of neurocognition, imaging, and genetics. Researchers benefit from the integration of data sources across domains, the explicit representation of knowledge from domain experts, and user interfaces providing convenient access to project specific data resources and algorithms. The system provides an interactive approach to statistical analysis, data mining, and hypothesis testing over the lifetime of a study and fulfills a mandate of public sharing by integrating data sharing into a system built for active data exploration. The web-based platform removes barriers for research and supports the ongoing exploration of data. PMID:24723882
Wen, Can-Hong; Ou, Shao-Min; Guo, Xiao-Bo; Liu, Chen-Feng; Shen, Yan-Bo; You, Na; Cai, Wei-Hong; Shen, Wen-Jun; Wang, Xue-Qin; Tan, Hai-Zhu
2017-01-01
Breast cancer is a high-risk heterogeneous disease with myriad subtypes and complicated biological features. The Cancer Genome Atlas (TCGA) breast cancer database provides researchers with large-scale genome and clinical data via web portals and FTP services. Researchers are able to gain new insights into their related fields and evaluate experimental discoveries with TCGA. However, TCGA's complex data formats and diverse files make it difficult to access and operate on for researchers who have little experience with databases and bioinformatics. For ease of use, we built the breast cancer (B-CAN) platform, which enables data customization, data visualization, and a private data center. The B-CAN platform runs on an Apache server and interacts with a MySQL database back end via PHP. Users can customize data based on their needs by combining tables from the original TCGA database and selecting variables from each table. The private data center is applicable to private data and two types of customized data. A key feature of the B-CAN is that it provides both single-table display and multiple-table display; customized data with one barcode corresponding to many records, as well as processed customized data, are supported in multiple-table display. The B-CAN is an intuitive and highly efficient data-sharing platform. PMID:29312567
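The table-combining step that B-CAN exposes through its web interface amounts to joining TCGA tables on the patient barcode. The sketch below is not B-CAN code; it is a minimal pandas illustration with hypothetical file and column names (bcr_patient_barcode, clinical and expression CSV exports) of the kind of customization the platform automates.

```python
# Sketch: combine two hypothetical TCGA-style tables on the patient barcode
# and keep a user-selected subset of variables (not B-CAN's actual implementation).
import pandas as pd

clinical = pd.read_csv("tcga_brca_clinical.csv")      # assumed export, one row per patient
expression = pd.read_csv("tcga_brca_expression.csv")  # assumed export, one row per sample

selected = ["bcr_patient_barcode", "age_at_diagnosis", "er_status"]  # assumed column names
customized = (clinical[selected]
              .merge(expression, on="bcr_patient_barcode", how="inner"))

# One barcode may match several samples, so the result can have multiple rows
# per patient (the "multiple-table display" case described above).
print(customized.groupby("bcr_patient_barcode").size().describe())
customized.to_csv("customized_dataset.csv", index=False)
```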
Sharing and interoperation of Digital Dongying geospatial data
NASA Astrophysics Data System (ADS)
Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an
2006-10-01
The Digital Dongying project was put forward by Dongying city, Shandong province, and approved by the Ministry of Information Industry, the Ministry of Science and Technology and the Ministry of Construction of P.R. China in 2002. After five years of construction, the informatization level of Dongying has reached an advanced degree. To advance the building of Digital Dongying and to realize geospatial data sharing, geographic information sharing standards were first drawn up and put into practice. Second, the Digital Dongying Geographic Information Sharing Platform was constructed and developed as a highly integrated platform combining WebGIS, 3S (GIS, GPS, RS), object-oriented RDBMS, Internet and DCOM technologies. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of "Digital Dongying" geospatial data have come into practice and good results have been obtained. However, a sound leadership group remains necessary for data sharing and interoperation.
NASA Astrophysics Data System (ADS)
Suzuki, Takeyasu
For the purpose of reducing disaster damage by applying information sharing technologies, "the research on disaster reduction using crisis-adaptive information sharing technologies" was carried out from July 2004 through March 2007 as a three-year joint project involving a government office and agency, national research institutes, universities, lifeline corporations, an NPO and a private company. In this project, a disaster mitigation information sharing platform, effective for the disaster response activities of local governments in particular, was developed as a framework that enables information sharing during disasters. A prototype of the platform was built by integrating individual systems and tools. It was then applied to actual local governments and proved to be effective for disaster response. This paper summarizes the research project. It first defines the platform as a framework of both information contents and information systems, and then describes the information sharing technologies developed for utilization of the platform. It also introduces field tests in which a prototype of the platform was applied to local governments.
ClearedLeavesDB: an online database of cleared plant leaf images.
Das, Abhiram; Bucksch, Alexander; Price, Charles A; Weitz, Joshua S
2014-03-28
Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. The Cleared Leaf Image Database (ClearedLeavesDB), is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org.
NASA Astrophysics Data System (ADS)
Hugo, Wim
2013-04-01
Over the past 3 years, SAEON has worked with a number of stakeholders and funders to establish a shared platform for the management and dissemination of E&EO research outputs, data sets, and services. This platform is strongly aligned with GEO principles and architecture, allowing direct integration with the GEOSS Broker. The platform has two important characteristics: 1. It reduces the cost and lead time of provision of similar infrastructure for future initiatives. 2. The platform is domain-agnostic to some degree, and can be used for non-E&EO applications. Projects to achieve this are under way at present. The paper describes the application of the platform for a variety of user communities and initiatives (SAEON Data Portal, South African Earth Observation System, Risk and Vulnerability Atlas, BioEnergy Atlas, National Spatial Information Framework, ICSU World Data System Components, and many more), and demonstrates use cases utilising a distributed, service-oriented architecture. Significant improvements have been made to the interoperability functions available to end users and content providers, and these are demonstrated and discussed in detail. Functions include: • Creation and persistence of composite maps, as well as time series or scatter charts, supporting a variety of standardized data sources. • Extended search facilities that allow analysis and filtering of primary search results and can deal with large meta-data collections. • Data sources, data listings, news items, images, search results, and other platform content that can, with increasing flexibility, be accessed as standardized services and processed in standardized clients, allowing creation of a rich user interface and permitting the inclusion of platform functionality into external websites and resources. This shift to an explicit service-oriented, peer-to-peer architecture is a preparation for increased distributed processing and content composition, and will support the concept of virtualization of 'science gateways' based on the platform, in support of a growing number of domains and initiatives.
Design of the Hospital Integrated Information Management System Based on Cloud Platform.
Aijing, L; Jin, Y
2015-12-01
At present, the outdated information management style cannot meet the needs of hospital management, and has become the bottleneck of hospital's management and development. In order to improve the integrated management of information, hospitals have increased their investment in integrated information management systems. On account of the lack of reasonable and scientific design, some hospital integrated information management systems have common problems, such as unfriendly interface, poor portability and maintainability, low security and efficiency, lack of interactivity and information sharing. To solve the problem, this paper carries out the research and design of a hospital information management system based on cloud platform, which can realize the optimized integration of hospital information resources and save money.
A Novel Market-Oriented Dynamic Collaborative Cloud Service Platform
NASA Astrophysics Data System (ADS)
Hassan, Mohammad Mehedi; Huh, Eui-Nam
In today's world, the emerging cloud computing paradigm (Weiss, 2007) offers a new computing model in which resources such as computing power, storage, online applications and networking infrastructures can be shared as "services" over the internet. Cloud providers (CPs) are incentivized by the profits to be made by charging consumers for accessing these services. Consumers, such as enterprises, are attracted by the opportunity to reduce or eliminate costs associated with "in-house" provision of these services.
Reproducing Epidemiologic Research and Ensuring Transparency.
Coughlin, Steven S
2017-08-15
Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers.
ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment.
Frank, Tobias; Krieger, Axel; Leonard, Simon; Patel, Niravkumar A; Tokuda, Junichi
2017-08-01
With the growing interest in advanced image-guidance for surgical robot systems, rapid integration and testing of robotic devices and medical image computing software are becoming essential in the research and development. Maximizing the use of existing engineering resources built on widely accepted platforms in different fields, such as robot operating system (ROS) in robotics and 3D Slicer in medical image computing could simplify these tasks. We propose a new open network bridge interface integrated in ROS to ensure seamless cross-platform data sharing. A ROS node named ROS-IGTL-Bridge was implemented. It establishes a TCP/IP network connection between the ROS environment and external medical image computing software using the OpenIGTLink protocol. The node exports ROS messages to the external software over the network and vice versa simultaneously, allowing seamless and transparent data sharing between the ROS-based devices and the medical image computing platforms. Performance tests demonstrated that the bridge could stream transforms, strings, points, and images at 30 fps in both directions successfully. The data transfer latency was <1.2 ms for transforms, strings and points, and 25.2 ms for color VGA images. A separate test also demonstrated that the bridge could achieve 900 fps for transforms. Additionally, the bridge was demonstrated in two representative systems: a mock image-guided surgical robot setup consisting of 3D slicer, and Lego Mindstorms with ROS as a prototyping and educational platform for IGT research; and the smart tissue autonomous robot surgical setup with 3D Slicer. The study demonstrated that the bridge enabled cross-platform data sharing between ROS and medical image computing software. This will allow rapid and seamless integration of advanced image-based planning/navigation offered by the medical image computing software such as 3D Slicer into ROS-based surgical robot systems.
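To make the publish side of such a bridge concrete, the following is a minimal rospy sketch that streams transforms on a topic at roughly the frame rate reported above. It is not the ROS-IGTL-Bridge code itself: the topic name and the use of geometry_msgs/TransformStamped in place of the bridge's own message types are assumptions for illustration only.

```python
# Sketch: stream transforms from a ROS node at ~30 fps (illustrative only;
# the topic name and message type stand in for ROS-IGTL-Bridge's own interface).
import rospy
from geometry_msgs.msg import TransformStamped

def main():
    rospy.init_node("transform_streamer")
    pub = rospy.Publisher("/igtl_transform_out", TransformStamped, queue_size=10)
    rate = rospy.Rate(30)  # the paper reports ~30 fps streaming in both directions
    while not rospy.is_shutdown():
        msg = TransformStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "base"
        msg.child_frame_id = "tool_tip"
        msg.transform.rotation.w = 1.0  # identity rotation; real values come from the robot
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    main()
```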
NASA Astrophysics Data System (ADS)
Wang, Tusheng; Yang, Yuanyuan; Zhang, Jianguo
2013-03-01
In order to enable medical researchers, clinical physicians and biomedical engineers from multiple disciplines to work together in a secure, efficient, and transparent cooperative environment, we designed an e-Science platform for biomedical imaging research and application across multiple academic institutions and hospitals in Shanghai, using a grid-based or cloud-based distributed architecture, and presented this work at the SPIE Medical Imaging conference held in San Diego in 2012. However, as the platform integrates more and more nodes over different networks, the first challenge is how to monitor and maintain all the hosts and services operating across the participating institutions and hospitals, such as DICOM and web-based image communication services, messaging services and XDS ITI transaction services. In this presentation, we describe the design and implementation of an intelligent monitoring and management system which collects the system resource status of every node in real time, raises alerts when a node or service failure occurs, and thereby improves the robustness, reliability and service continuity of this e-Science platform.
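As a rough illustration of the kind of per-node health check such a monitoring service performs, the sketch below polls a list of hypothetical service endpoints over HTTP and logs an alert on failure; the URLs, poll interval, and alerting mechanism are assumptions, not details taken from the paper.

```python
# Sketch: poll service endpoints and alert on failure (illustrative only;
# endpoint URLs and the alert mechanism are assumptions).
import logging
import time
import requests

SERVICES = {
    "dicom-gateway": "http://node1.example.org:8080/health",
    "xds-registry": "http://node2.example.org:8080/health",
    "messaging": "http://node3.example.org:8080/health",
}

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

def check_once() -> None:
    for name, url in SERVICES.items():
        try:
            resp = requests.get(url, timeout=5)
            if resp.status_code == 200:
                logging.info("%s OK", name)
            else:
                logging.error("ALERT: %s returned HTTP %d", name, resp.status_code)
        except requests.RequestException as exc:
            logging.error("ALERT: %s unreachable (%s)", name, exc)

if __name__ == "__main__":
    while True:
        check_once()
        time.sleep(60)  # assumed polling interval
```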
NASA Astrophysics Data System (ADS)
Qun, Zeng; Xiaocheng, Zhong
Knowledge sharing means that individuals, teams and organizations share their knowledge with other members of the organization through various means in the course of their activities. This paper analyzes the obstacles to knowledge sharing from a technical point of view, and chooses Blog technology to build a platform for improving knowledge sharing between individuals. The construction of the platform is an important foundation for information literacy education, and it can also be used to deliver online information literacy education. Finally, the paper gives a detailed analysis of the platform's functions, advantages and disadvantages.
Data sharing platforms for de-identified data from human clinical trials.
Huser, Vojtech; Shmueli-Blumberg, Dikla
2018-04-01
Data sharing of de-identified individual participant data is being adopted by an increasing number of sponsors of human clinical trials. In addition to standardizing data syntax for shared trial data, semantic integration of various data elements is the focus of several initiatives that define research common data elements. This perspective article, in the first part, compares several data sharing platforms for de-identified clinical research data in terms of their size, policies and supported features. In the second part, we use a case study approach to describe in greater detail one data sharing platform (Data Share from the National Institute on Drug Abuse). We present data on the past use of the platform, data formats offered, data de-identification approaches and its use of research common data elements. We conclude with a summary of current and expected future trends that facilitate secondary research use of data from completed human clinical trials.
A Novel College Network Resource Management Method using Cloud Computing
NASA Astrophysics Data System (ADS)
Lin, Chen
At present, the information construction of colleges mainly consists of building college networks and management information systems, and many problems arise during this informatization process. Cloud computing is an evolution of distributed processing, parallel processing and grid computing: data are stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of devices. This article introduces cloud computing and its functions, then analyzes the existing problems of college network resource management, and finally applies cloud computing technology and methods to the construction of a college information sharing platform.
Development and evaluation of task-specific NLP framework in China.
Ge, Caixia; Zhang, Yinsheng; Huang, Zhenzhen; Jia, Zheng; Ju, Meizhi; Duan, Huilong; Li, Haomin
2015-01-01
Natural language processing (NLP) has been designed to convert narrative text into structured data. Although some general NLP architectures have been developed, a task-specific NLP framework that facilitates the effective use of data is still a challenge in lexical-resource-limited regions such as China. The purpose of this study is to design and develop a task-specific NLP framework to extract targeted information from particular documents by adopting dedicated algorithms on the currently limited lexical resources. In this framework, a shared and evolving ontology mechanism was designed. The results show that such a free-text-driven platform will accelerate the acceptance of NLP technology in China.
CILogon: An Integrated Identity and Access Management Platform for Science
NASA Astrophysics Data System (ADS)
Basney, J.
2016-12-01
When scientists work together, they use web sites and other software to share their ideas and data. To ensure the integrity of their work, these systems require the scientists to log in and verify that they are part of the team working on a particular science problem. Too often, the identity and access verification process is a stumbling block for the scientists. Scientific research projects are forced to invest time and effort into developing and supporting Identity and Access Management (IAM) services, distracting them from the core goals of their research collaboration. CILogon provides an IAM platform that enables scientists to work together to meet their IAM needs more effectively so they can allocate more time and effort to their core mission of scientific research. The CILogon platform enables federated identity management and collaborative organization management. Federated identity management enables researchers to use their home organization identities to access cyberinfrastructure, rather than requiring yet another username and password to log on. Collaborative organization management enables research projects to define user groups for authorization to collaboration platforms (e.g., wikis, mailing lists, and domain applications). CILogon's IAM platform serves the unique needs of research collaborations, namely the need to dynamically form collaboration groups across organizations and countries, sharing access to data, instruments, compute clusters, and other resources to enable scientific discovery. CILogon provides a software-as-a-service platform to ease integration with cyberinfrastructure, while making all software components publicly available under open source licenses to enable re-use. Figure 1 illustrates the components and interfaces of this platform. CILogon has been operational since 2010 and has been used by over 7,000 researchers from more than 170 identity providers to access cyberinfrastructure including Globus, LIGO, Open Science Grid, SeedMe, and XSEDE. The "CILogon 2.0" platform, launched in 2016, adds support for virtual organization (VO) membership management, identity linking, international collaborations, and standard integration protocols, through integration with the Internet2 COmanage collaboration software.
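As a rough illustration of the standards-based integration such a platform offers, the sketch below performs the final step of an OAuth2 authorization-code flow with the Python requests library. The token endpoint URL, client credentials, and redirect URI are placeholders, not values taken from CILogon's documentation.

```python
# Sketch: exchange an OAuth2 authorization code for tokens (generic flow;
# the token endpoint, client credentials, and redirect URI are placeholders).
import requests

TOKEN_ENDPOINT = "https://cilogon.example.org/oauth2/token"  # placeholder URL

def exchange_code(auth_code: str) -> dict:
    resp = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": auth_code,
            "redirect_uri": "https://myscienceportal.example.org/callback",
            "client_id": "my-registered-client-id",
            "client_secret": "my-client-secret",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # typically contains access_token, plus id_token for OpenID Connect

if __name__ == "__main__":
    tokens = exchange_code("code-received-on-the-redirect")
    print(sorted(tokens.keys()))
```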
GLIDE: a grid-based light-weight infrastructure for data-intensive environments
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Malek, Sam; Beckman, Nels; Mikic-Rakic, Marija; Medvidovic, Nenad; Chrichton, Daniel J.
2005-01-01
The promise of the grid is that it will enable public access and sharing of immense amounts of computational and data resources among dynamic coalitions of individuals and institutions. However, the current grid solutions make several limiting assumptions that curtail their widespread adoption. To address these limitations, we present GLIDE, a prototype light-weight, data-intensive middleware infrastructure that enables access to the robust data and computational power of the grid on DREAM platforms.
Security Broker—A Complementary Tool for SOA Security
NASA Astrophysics Data System (ADS)
Kamatchi, R.; Rakshit, Atanu
2011-09-01
The Service Oriented Architecture, along with web services, provides a new dimension to the world of reusability and resource sharing. Services developed by a creator can be used by any service consumer from anywhere, regardless of the platform used. This open nature of the SOA architecture also raises security issues at various levels of usage. This paper discusses the implementation benefits of a service broker within the Service Oriented Architecture.
Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N
2017-03-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
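A Jenkins-CI job in such a pipeline typically boils down to a scripted, headless invocation of CellProfiler on a batch of images. The sketch below shows what that wrapped step might look like in Python; the command-line flags follow CellProfiler's commonly documented headless usage, but they, along with the file paths, should be treated as assumptions rather than the authors' exact configuration.

```python
# Sketch: headless CellProfiler invocation as it might be wrapped inside a Jenkins-CI job
# (paths and flags are assumptions; verify against your CellProfiler version).
import subprocess
from pathlib import Path

PIPELINE = Path("pipelines/nuclei_count.cppipe")   # pipeline exported from the desktop client
IMAGE_DIR = Path("/data/hcs/plate_001/images")
OUTPUT_DIR = Path("/data/hcs/plate_001/results")

def run_cellprofiler() -> None:
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    cmd = [
        "cellprofiler",
        "-c",              # run without GUI
        "-r",              # run the pipeline immediately
        "-p", str(PIPELINE),
        "-i", str(IMAGE_DIR),
        "-o", str(OUTPUT_DIR),
    ]
    subprocess.run(cmd, check=True)  # a non-zero exit code marks the Jenkins build as failed

if __name__ == "__main__":
    run_cellprofiler()
```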
Bensman, Rachel S; Slusher, Tina M; Butteris, Sabrina M; Pitt, Michael B; On Behalf Of The Sugar Pearls Investigators; Becker, Amanda; Desai, Brinda; George, Alisha; Hagen, Scott; Kiragu, Andrew; Johannsen, Ron; Miller, Kathleen; Rule, Amy; Webber, Sarah
2017-11-01
The authors describe a multiinstitutional collaborative project to address a gap in global health training by creating a free online platform to share a curriculum for performing procedures in resource-limited settings. This curriculum called PEARLS (Procedural Education for Adaptation to Resource-Limited Settings) consists of peer-reviewed instructional and demonstration videos describing modifications for performing common pediatric procedures in resource-limited settings. Adaptations range from the creation of a low-cost spacer for inhaled medications to a suction chamber for continued evacuation of a chest tube. By describing the collaborative process, we provide a model for educators in other fields to collate and disseminate procedural modifications adapted for their own specialty and location, ideally expanding this crowd-sourced curriculum to reach a wide audience of trainees and providers in global health.
An Open Data Platform in the framework of the EGI-LifeWatch Competence Center
NASA Astrophysics Data System (ADS)
Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Yaiza Rodríguez Marrero, Ana
2016-04-01
The working pilot of an Open Data Platform supporting the full data cycle in research is presented. It aims to preserve knowledge explicitly, starting with the description of the case studies, and integrating data and software management and preservation on an equal basis. The uninterrupted support chain starts at the data acquisition level and extends to support for reuse and publication in an open framework, providing integrity and provenance controls. The LifeWatch Open Science Framework is a pilot web portal, developed in collaboration with different commercial companies, that aims to enrich and integrate different data lifecycle-related tools in order to address the management of the different steps: data planning, gathering, storing, curation, preservation, sharing, discovery, etc. To achieve this goal, the platform includes the following features: - Data Management Planning: a tool to set up a structure for the data, including what data will be generated and how it will be exploited, re-used, curated, preserved, etc. It takes a semantic approach, referencing ontologies to express what data will be gathered. - Close to instrumentation: the portal includes a distributed storage system that can be used both for storing data from instruments and for output data from analyses. All of that data can be shared. - Analysis: resources from the EGI Federated Cloud are accessible within the portal, so that users can exploit computing resources to perform analyses and other processes, including workflows. - Preservation: data can be preserved in different systems, and DOIs can be minted not only for datasets but also for software, DMPs, etc. The presentation will show the different components of the framework as well as how it can be extrapolated to other communities.
Globus Identity, Access, and Data Management: Platform Services for Collaborative Science
NASA Astrophysics Data System (ADS)
Ananthakrishnan, R.; Foster, I.; Wagner, R.
2016-12-01
Globus is software-as-a-service for research data management, developed at, and operated by, the University of Chicago. Globus, accessible at www.globus.org, provides high speed, secure file transfer; file sharing directly from existing storage systems; and data publication to institutional repositories. 40,000 registered users have used Globus to transfer tens of billions of files totaling hundreds of petabytes between more than 10,000 storage systems within campuses and national laboratories in the US and internationally. Web, command line, and REST interfaces support both interactive use and integration into applications and infrastructures. An important component of the Globus system is its foundational identity and access management (IAM) platform service, Globus Auth. Both Globus research data management and other applications use Globus Auth for brokering authentication and authorization interactions between end-users, identity providers, resource servers (services), and a range of clients, including web, mobile, and desktop applications, and other services. Compliant with important standards such as OAuth, OpenID, and SAML, Globus Auth provides mechanisms required for an extensible, integrated ecosystem of services and clients for the research and education community. It underpins projects such as the US National Science Foundation's XSEDE system, NCAR's Research Data Archive, and the DOE Systems Biology Knowledge Base. Current work is extending Globus services to be compliant with FEDRAMP standards for security assessment, authorization, and monitoring for cloud services. We will present Globus IAM solutions and give examples of Globus use in various projects for federated access to resources. We will also describe how Globus Auth and Globus research data management capabilities enable rapid development and low-cost operations of secure data sharing platforms that leverage Globus services and integrate them with local policy and security.
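The programmatic interfaces mentioned above are typically exercised through the Globus Python SDK. The sketch below shows a native-app login followed by submission of a directory transfer, written against the globus-sdk interface as commonly documented; the client ID, endpoint UUIDs, and paths are placeholders, so treat it as an outline rather than production code.

```python
# Sketch: submit a Globus file transfer with the Globus Python SDK
# (client ID, endpoint UUIDs, and paths are placeholders).
import globus_sdk

CLIENT_ID = "YOUR-NATIVE-APP-CLIENT-ID"
SRC_ENDPOINT = "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb"
DST_ENDPOINT = "cccccccc-4444-5555-6666-dddddddddddd"

# Globus Auth: native-app OAuth2 flow brokered by Globus Auth
auth_client = globus_sdk.NativeAppAuthClient(CLIENT_ID)
auth_client.oauth2_start_flow()
print("Log in at:", auth_client.oauth2_get_authorize_url())
code = input("Paste the authorization code: ").strip()
tokens = auth_client.oauth2_exchange_code_for_tokens(code)
transfer_token = tokens.by_resource_server["transfer.api.globus.org"]["access_token"]

# Transfer service: move a directory between two endpoints
tc = globus_sdk.TransferClient(authorizer=globus_sdk.AccessTokenAuthorizer(transfer_token))
task = globus_sdk.TransferData(tc, SRC_ENDPOINT, DST_ENDPOINT, label="example transfer")
task.add_item("/campus/storage/run42/", "/project/archive/run42/", recursive=True)
result = tc.submit_transfer(task)
print("Submitted transfer task:", result["task_id"])
```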
Perspectives of the optical coherence tomography community on code and data sharing
NASA Astrophysics Data System (ADS)
Lurie, Kristen L.; Mistree, Behram F. T.; Ellerbee, Audrey K.
2015-03-01
As optical coherence tomography (OCT) grows to be a mature and successful field, it is important for the research community to develop a stronger practice of sharing code and data. A prolific culture of sharing can enable new and emerging laboratories to enter the field, allow research groups to gain new exposure and notoriety, and enable benchmarking of new algorithms and methods. Our long-term vision is to build tools to facilitate a stronger practice of sharing within this community. In line with this goal, our first aim was to understand the perceptions and practices of the community with respect to sharing research contributions (i.e., as code and data). We surveyed 52 members of the OCT community using an online polling system. Our main findings indicate that while researchers infrequently share their code and data, they are willing to contribute their research resources to a shared repository, and they believe that such a repository would benefit both their research and the OCT community at large. We plan to use the results of this survey to design a platform targeted to the OCT research community - an effort that ultimately aims to facilitate a more prolific culture of sharing.
DIMP: an interoperable solution for software integration and product data exchange
NASA Astrophysics Data System (ADS)
Wang, Xi Vincent; Xu, Xun William
2012-08-01
Today, globalisation has become one of the main trends in manufacturing business, leading to a world-wide decentralisation of resources not only amongst individual departments within one company but also amongst business partners. However, despite the development and improvement of the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integration solutions are reviewed and discussed. The second part introduces a collaborative environment called the distributed interoperable manufacturing platform, which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.
Guidelines for ethical and professional use of social media in a hand surgery practice.
Lifchez, Scott D; McKee, Desirae M; Raven, Raymond B; Shafritz, Adam B; Tueting, Jonathan L
2012-12-01
In growing numbers, patients are using social media platforms as resources to obtain health information and report their experiences in the health care setting. More physicians are making use of these platforms as a means to reach prospective and existing patients, to share information with each other, and to educate the public. In this ever-expanding online dialogue, questions have arisen regarding appropriate conduct of the physician during these interactions. The purpose of this article is to review the laws that govern online communication as they pertain to physician presence in this forum and to discuss appropriate ethical and professional behavior in this setting.
Design of the Hospital Integrated Information Management System Based on Cloud Platform
Aijing, L; Jin, Y
2015-01-01
At present, the outdated information management style cannot meet the needs of hospital management, and has become the bottleneck of hospital management and development. In order to improve the integrated management of information, hospitals have increased their investment in integrated information management systems. On account of the lack of reasonable and scientific design, some hospital integrated information management systems have common problems, such as an unfriendly interface, poor portability and maintainability, low security and efficiency, and a lack of interactivity and information sharing. To solve these problems, this paper presents the research and design of a hospital information management system based on a cloud platform, which can realize the optimized integration of hospital information resources and save money. PMID:27399033
Sharing Health Big Data for Research - A Design by Use Cases: The INSHARE Platform Approach.
Bouzillé, Guillaume; Westerlynck, Richard; Defossez, Gautier; Bouslimi, Dalel; Bayat, Sahar; Riou, Christine; Busnel, Yann; Le Guillou, Clara; Cauvin, Jean-Michel; Jacquelinet, Christian; Pladys, Patrick; Oger, Emmanuel; Stindel, Eric; Ingrand, Pierre; Coatrieux, Gouenou; Cuggia, Marc
2017-01-01
Sharing and exploiting Health Big Data (HBD) allows two main challenges to be tackled: data protection and governance, which must take into account legal, ethical, and deontological aspects to enable a trustful, transparent, and win-win relationship between researchers, citizens, and data providers; and the lack of interoperability, since data are compartmentalized and syntactically and semantically heterogeneous. The INSHARE project explores, through an experimental proof of concept, how recent technologies can overcome such issues. Involving six data providers, the platform was designed in three steps: (1) analyze use cases, needs, and requirements; (2) define the data sharing governance and secure access to the platform; and (3) define the platform specifications. Three use cases, drawn from 5 studies and 11 data sources, were analyzed for the platform design. The governance, derived from the SCANNER model, was adapted to data sharing. The platform architecture integrates a data repository and hosting, semantic integration services, data processing, aggregate computing, data quality and integrity monitoring, ID linking, a multisource query builder, visualization and data export services, data governance, a study management service, and security including data watermarking.
NASA Astrophysics Data System (ADS)
Sushko, Iurii; Novotarskyi, Sergii; Körner, Robert; Pandey, Anil Kumar; Rupp, Matthias; Teetz, Wolfram; Brandmaier, Stefan; Abdelaziz, Ahmed; Prokopenko, Volodymyr V.; Tanchuk, Vsevolod Y.; Todeschini, Roberto; Varnek, Alexandre; Marcou, Gilles; Ertl, Peter; Potemkin, Vladimir; Grishina, Maria; Gasteiger, Johann; Schwab, Christof; Baskin, Igor I.; Palyulin, Vladimir A.; Radchenko, Eugene V.; Welsh, William J.; Kholodovych, Vladyslav; Chekmarev, Dmitriy; Cherkasov, Artem; Aires-de-Sousa, Joao; Zhang, Qing-You; Bender, Andreas; Nigsch, Florian; Patiny, Luc; Williams, Antony; Tkachenko, Valery; Tetko, Igor V.
2011-06-01
The Online Chemical Modeling Environment is a web-based platform that aims to automate and simplify the typical steps required for QSAR modeling. The platform consists of two major subsystems: the database of experimental measurements and the modeling framework. The user-contributed database contains a set of tools for easy input, search and modification of thousands of records. The OCHEM database is based on the wiki principle and focuses primarily on the quality and verifiability of the data. The database is tightly integrated with the modeling framework, which supports all the steps required to create a predictive model: data search, calculation and selection of a vast variety of molecular descriptors, application of machine learning methods, validation, analysis of the model and assessment of the applicability domain. Compared to other similar systems, OCHEM is not intended to re-implement existing tools or models but rather to invite the original authors to contribute their results, make them publicly available, share them with other users and become members of the growing research community. Our intention is to make OCHEM a widely used platform for performing QSPR/QSAR studies online and sharing them with other users on the Web. The ultimate goal of OCHEM is to collect all possible chemoinformatics tools within one simple, reliable and user-friendly resource. OCHEM is free for web users and is available online at http://www.ochem.eu.
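The model-building steps enumerated above (descriptor selection, machine learning, validation) can be sketched generically with scikit-learn. This is an illustration of a typical QSAR-style workflow, not OCHEM's own code, and the descriptor matrix and endpoint below are synthetic stand-ins.

```python
# Generic QSAR-style workflow sketch (not OCHEM code): descriptors -> model -> validation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import VarianceThreshold
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))          # stand-in for calculated molecular descriptors
y = 2.0 * X[:, 0] - X[:, 3] + rng.normal(scale=0.3, size=200)  # stand-in endpoint

model = make_pipeline(
    VarianceThreshold(threshold=0.0),    # drop constant descriptors (descriptor selection)
    RandomForestRegressor(n_estimators=200, random_state=0),
)

# Cross-validation approximates the validation step that precedes applicability-domain analysis.
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
```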
Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets
Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L
2014-01-01
Background As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze it. Methods Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Results Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Conclusions Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. PMID:24464852
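The per-sample figures quoted above (eight CPUs for about 12 hours of alignment, 5-10 GB per BAM file) make it straightforward to estimate aggregate demand for a cohort; the cohort size in the sketch below is a made-up example, not a number from the project.

```python
# Back-of-the-envelope capacity estimate from the per-sample figures in the abstract.
# The cohort size is a hypothetical example.
samples = 500
cpu_hours = samples * 8 * 12                  # 8 CPUs x ~12 h alignment per sample
bam_storage_gb = (samples * 5, samples * 10)  # BAM files range from 5 GB to 10 GB each

print(f"alignment compute: {cpu_hours:,} CPU-hours")
print(f"BAM storage: {bam_storage_gb[0]:,}-{bam_storage_gb[1]:,} GB")
```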
Staccini, Pascal; Dufour, Jean-Charles; Raps, Hervé; Fieschi, Marius
2005-01-01
Making educational material available on a network cannot be reduced to merely implementing hypermedia and interactive resources on a server. A pedagogical schema has to be defined to guide students in their learning and to provide teachers with guidelines to prepare valuable and upgradeable resources. Components of a learning environment, as well as interactions between students and other roles such as author, tutor and manager, can be deduced from cognitive foundations of learning, such as the constructivist approach. Scripting the way a student will navigate among information nodes and interact with tools to build his/her own knowledge is a good way of deducing the features of the graphic interface related to the management of these objects. We defined a typology of pedagogical resources, their data model and their logic of use. We implemented a generic, web-based authoring and publishing platform (called J@LON, for Join And Learn On the Net) within an object-oriented, open-source programming environment (called Zope) embedding a content management system (called Plone). Workflow features are used to mark the progress of students and to trace the life cycle of resources shared by the teaching staff. The platform integrates advanced online authoring features to create interactive exercises and to support live course diffusion. The platform engine has been generalized to the whole curriculum of medical studies in our faculty; it also supports an international master's degree in health care risk management and will be extended to all other continuing education diplomas.
Ethical sharing of health data in online platforms - which values should be considered?
Riso, Brígida; Tupasela, Aaro; Vears, Danya F; Felzmann, Heike; Cockbain, Julian; Loi, Michele; Kongsholm, Nana C H; Zullo, Silvia; Rakic, Vojin
2017-08-21
Intensified and extensive data production and data storage are characteristics of contemporary western societies. Health data sharing is increasing with the growth of Information and Communication Technology (ICT) platforms devoted to the collection of personal health and genomic data. However, the sensitive and personal nature of health data poses ethical challenges when data is disclosed and shared, even if for scientific research purposes. With this in mind, the Science and Values Working Group of the COST Action CHIP ME 'Citizen's Health through public-private Initiatives: Public health, Market and Ethical perspectives' (IS 1303) identified six core values they considered to be essential for the ethical sharing of health data using ICT platforms. We believe that using this ethical framework will promote respectful scientific practices in order to maintain individuals' trust in research. We use these values to analyse five ICT platforms and explore how emerging data sharing platforms are reconfiguring the data sharing experience from a range of perspectives. We discuss which types of values, rights and responsibilities they entail and enshrine within their philosophy or outlook on what it means to share personal health information. Through this discussion we address issues of the design and the development process of personal health data and patient-oriented infrastructures, as well as new forms of technologically-mediated empowerment.
DREAM: Distributed Resources for the Earth System Grid Federation (ESGF) Advanced Management
NASA Astrophysics Data System (ADS)
Williams, D. N.
2015-12-01
The data associated with climate research is often generated, accessed, stored, and analyzed on a mix of unique platforms. The volume, variety, velocity, and veracity of this data creates unique challenges as climate research attempts to move beyond stand-alone platforms to a system that truly integrates dispersed resources. Today, sharing data across multiple facilities is often a challenge due to the large variance in supporting infrastructures. This results in data being accessed and downloaded many times, which requires significant amounts of resources, places a heavy analytic development burden on the end users, and leads to mismanaged resources. Working across U.S. federal agencies, international agencies, and multiple worldwide data centers, and spanning seven international network organizations, the Earth System Grid Federation (ESGF) has begun to solve this problem. Its architecture employs a system of geographically distributed peer nodes that are independently administered yet united by common federation protocols and application programming interfaces. However, significant challenges remain, including workflow provenance, modular and flexible deployment, scalability of a diverse set of computational resources, and more. Expanding on the existing ESGF, the Distributed Resources for the Earth System Grid Federation Advanced Management (DREAM) will ensure that the access, storage, movement, and analysis of the large quantities of data that are processed and produced by diverse science projects can be dynamically distributed with proper resource management. This system will enable data from a virtually unlimited number of diverse sources to be organized and accessed from anywhere on any device (including mobile platforms). The approach offers a powerful roadmap for the creation and integration of a unified knowledge base of an entire ecosystem, including its many geophysical, geographical, social, political, agricultural, energy, transportation, and cyber aspects. The resulting aggregation of data combined with analytics services has the potential to generate an informational universe and knowledge system of unprecedented size and value to the scientific community, downstream applications, decision makers, and the public.
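Federated search across ESGF's distributed peer nodes is commonly scripted with the esgf-pyclient package. The sketch below assumes that package's documented SearchConnection interface; the index-node URL and facet values are examples that may need adjusting, and none of it is taken from the DREAM abstract itself.

```python
# Sketch of a federated ESGF search, assuming the esgf-pyclient package (pyesgf).
# The index-node URL and facet values are examples, not DREAM-specific endpoints.
from pyesgf.search import SearchConnection

conn = SearchConnection("https://esgf-node.llnl.gov/esg-search", distrib=True)

# distrib=True asks the index node to fan the query out across federated peers.
ctx = conn.new_context(project="CMIP6", variable="tas", frequency="mon")
print("matching datasets across the federation:", ctx.hit_count)

# Drill into the first dataset and list a few downloadable files.
results = ctx.search()
files = results[0].file_context().search()
for f in list(files)[:3]:
    print(f.download_url)
```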
Exploring Characterizations of Learning Object Repositories Using Data Mining Techniques
NASA Astrophysics Data System (ADS)
Segura, Alejandra; Vidal, Christian; Menendez, Victor; Zapata, Alfredo; Prieto, Manuel
Learning object repositories provide a platform for the sharing of Web-based educational resources. As these repositories evolve independently, it is difficult for users to have a clear picture of the kind of contents they give access to. Metadata can be used to automatically extract a characterization of these resources by using machine learning techniques. This paper presents an exploratory study carried out in the contents of four public repositories that uses clustering and association rule mining algorithms to extract characterizations of repository contents. The results of the analysis include potential relationships between different attributes of learning objects that may be useful to gain an understanding of the kind of resources available and eventually develop search mechanisms that consider repository descriptions as a criteria in federated search.
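A minimal version of the clustering side of such an analysis is sketched below, assuming learning-object metadata has already been harvested into categorical attribute records; the records shown are invented for illustration and are not drawn from the four repositories studied.

```python
# Cluster learning-object metadata records by their categorical attributes (toy data).
import pandas as pd
from sklearn.cluster import KMeans

records = pd.DataFrame([
    {"format": "video", "level": "higher_ed", "interactivity": "low"},
    {"format": "quiz",  "level": "school",    "interactivity": "high"},
    {"format": "video", "level": "higher_ed", "interactivity": "medium"},
    {"format": "text",  "level": "school",    "interactivity": "low"},
])

X = pd.get_dummies(records)       # one-hot encode LOM-style categorical metadata
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(records.assign(cluster=labels))
```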
Neuroimaging Data Sharing on the Neuroinformatics Database Platform
Book, Gregory A; Stevens, Michael; Assaf, Michal; Glahn, David; Pearlson, Godfrey D
2015-01-01
We describe the Neuroinformatics Database (NiDB), an open-source database platform for archiving, analysis, and sharing of neuroimaging data. Data from the multi-site projects Autism Brain Imaging Data Exchange (ABIDE), Bipolar-Schizophrenia Network on Intermediate Phenotypes parts one and two (B-SNIP1, B-SNIP2), and Monetary Incentive Delay task (MID) are available for download from the public instance of NiDB, with more projects sharing data as it becomes available. As demonstrated by making several large datasets available, NiDB is an extensible platform appropriately suited to archive and distribute shared neuroimaging data. PMID:25888923
Chase, Katherine J.; Bock, Andrew R.; Sando, Roy
2017-01-05
This report provides an overview of current (2016) U.S. Geological Survey policies and practices related to publishing data on ScienceBase, and an example interactive mapping application to display those data. ScienceBase is an integrated data sharing platform managed by the U.S. Geological Survey. This report describes resources that U.S. Geological Survey scientists can use for writing data management plans, formatting data, and creating metadata, as well as for data and metadata review, uploading data and metadata to ScienceBase, and sharing metadata through the U.S. Geological Survey Science Data Catalog. Because data publishing policies and practices are evolving, scientists should consult the resources cited in this report for definitive policy information. An example is provided where, using the content of a published ScienceBase data release that is associated with an interpretive product, a simple user interface is constructed to demonstrate how the open source capabilities of the R programming language and environment can interact with the properties and objects of the ScienceBase item and be used to generate interactive maps.
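The report demonstrates this interaction in R; as a language-neutral illustration, the content of a public ScienceBase item can also be read through the catalog's JSON interface. The endpoint pattern and response fields in the sketch below are assumptions about that public API, and the item ID is a placeholder.

```python
# Sketch: fetch a public ScienceBase item and list its attached files.
# The endpoint pattern and response fields are assumptions about the public
# catalog JSON interface; ITEM_ID is a placeholder.
import requests

ITEM_ID = "0123456789abcdef01234567"   # hypothetical ScienceBase item ID
resp = requests.get(
    f"https://www.sciencebase.gov/catalog/item/{ITEM_ID}",
    params={"format": "json"},
    timeout=30,
)
resp.raise_for_status()
item = resp.json()

print(item.get("title"))
for f in item.get("files", []):
    print(f.get("name"), f.get("url"))
```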
Invisible Mars: New Visuals for Communicating MAVEN's Story
NASA Astrophysics Data System (ADS)
Shupla, C. B.; Ali, N. A.; Jones, A. P.; Mason, T.; Schneider, N. M.; Brain, D. A.; Blackwell, J.
2016-12-01
Invisible Mars tells the story of Mars' evolving atmosphere, through a script and a series of visuals as a live presentation. Created for Science-On-A-Sphere, the presentation has also been made available to planetariums, and is being expanded to other platforms. The script has been updated to include results from the Mars Atmosphere and Volatile Evolution Mission (MAVEN), and additional visuals have been produced. This poster will share the current Invisible Mars resources available and the plans to further disseminate this presentation.
Algers, Anne; Silva-Fletcher, Ayona; Gregory, Neville; Hunt, Melvin
2013-11-01
Design science research was used for the generation, use and evaluation of a model for knowledge sharing in the user community through open educational resources (OER). The focus of interest was on the development process of a model for knowledge sharing that emphasizes the characteristics and the needs of the user community; the empowerment and democratic issues of openness; the collaboration between institutions and dialog with society; and the consideration of quality and sustainability issues. Initially, the community needs were analyzed through surveys and workshops, and the findings were used, through negotiations, to formulate the development process. An open-training platform served as an infrastructure and included a repository with OER, a wiki and a discussion forum. The purpose of this article is an attempt to provide universities with a plan and template for integrated knowledge sharing that responds to societal needs. Usability and usefulness have not been evaluated. Copyright © 2013 Elsevier Ltd. All rights reserved.
Tamara Heartsill Scalley; Saara DeWalt; François Korysko; Guy Van Laere; Kasey Jacobs; Seth Panka; Joseph Torres
2016-01-01
We presented a new information-sharing platform at the 16th Caribbean Foresters Meeting in August 2013 to facilitate and promote collaboration among Caribbean foresters. The platform can be accessed through the Caribbean Foresters website where information and data on forest research sites can be shared. There is a special focus on identifying potential collaborations...
Raising Virtual Laboratories in Australia onto global platforms
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Barker, M.; Fraser, R.; Evans, B. J. K.; Moloney, G.; Proctor, R.; Moise, A. F.; Hamish, H.
2016-12-01
Across the globe, Virtual Laboratories (VLs), Science Gateways (SGs), and Virtual Research Environments (VREs) are being developed that enable users who are not co-located to actively work together at various scales to share data, models, tools, software, workflows, best practices, etc. Outcomes range from enabling 'long tail' researchers to more easily access specific data collections, to facilitating complex workflows on powerful supercomputers. In Australia, government funding has facilitated the development of a range of VLs through the National eResearch Collaborative Tools and Resources (NeCTAR) program. The VLs provide highly collaborative, research-domain oriented, integrated software infrastructures that meet user community needs. Twelve VLs have been funded since 2012, including the Virtual Geophysics Laboratory (VGL); Virtual Hazards, Impact and Risk Laboratory (VHIRL); Climate and Weather Science Laboratory (CWSLab); Marine Virtual Laboratory (MarVL); and Biodiversity and Climate Change Virtual Laboratory (BCCVL). These VLs share similar technical challenges, with common issues emerging on the integration of tools and applications and on access to data collections via both cloud-based environments and other distributed resources. While each VL began with a focus on a specific research domain, communities of practice have now formed across the VLs around common issues, and these facilitate the identification of best-practice case studies and new standards. As a result, tools are now being shared, and the VLs access data via data services using international standards such as ISO, OGC, and W3C. The sharing of these approaches is starting to facilitate re-usability of infrastructure and is a step towards supporting interdisciplinary research. While the focus of the VLs is Australia-centric, the use of these standards means the environments can be extended to analyses of other international datasets. Many VL datasets are subsets of global datasets, so extension to global coverage is a small (and often requested) step. Similarly, most of the tools, software, and other technologies could be shared across infrastructures globally. Therefore, it is now time to better connect the Australian VLs with similar initiatives elsewhere to create international platforms that can contribute to global research challenges.
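Because the VLs expose data through OGC-standard services, a client anywhere can consume them with a generic library such as OWSLib. The sketch below is an illustration of that idea only: the service URL is a placeholder, not one of the VLs' actual endpoints.

```python
# Sketch: consume an OGC Web Map Service with OWSLib; the URL is a placeholder,
# not an actual Virtual Laboratory endpoint.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/geoserver/wms", version="1.3.0")
print(sorted(wms.contents)[:5])          # layer names advertised by GetCapabilities

layer = next(iter(wms.contents))
img = wms.getmap(
    layers=[layer],
    srs="EPSG:4326",
    bbox=(110.0, -45.0, 155.0, -10.0),   # roughly Australia, in lon/lat
    size=(600, 400),
    format="image/png",
)
with open("map.png", "wb") as fh:
    fh.write(img.read())
```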
Shared Medical Imaging Repositories.
Lebre, Rui; Bastião, Luís; Costa, Carlos
2018-01-01
This article describes the implementation of a solution for the integration of ownership concept and access control over medical imaging resources, making possible the centralization of multiple instances of repositories. The proposed architecture allows the association of permissions to repository resources and delegation of rights to third entities. It includes a programmatic interface for management of proposed services, made available through web services, with the ability to create, read, update and remove all components resulting from the architecture. The resulting work is a role-based access control mechanism that was integrated with Dicoogle Open-Source Project. The solution has several application scenarios like, for instance, collaborative platforms for research and tele-radiology services deployed at Cloud.
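The ownership and delegation model described above can be illustrated with a minimal role-based access check in Python; this is a conceptual sketch, not Dicoogle's actual implementation, and the entity names are invented.

```python
# Conceptual ownership/delegation sketch (not Dicoogle's implementation).
from dataclasses import dataclass, field

@dataclass
class Repository:
    owner: str
    # permission -> set of users the owner has delegated that right to
    grants: dict = field(default_factory=lambda: {"read": set(), "write": set()})

    def allow(self, user: str, permission: str) -> None:
        self.grants[permission].add(user)       # delegation of rights to a third entity

    def check(self, user: str, permission: str) -> bool:
        return user == self.owner or user in self.grants.get(permission, set())

repo = Repository(owner="hospital_a")
repo.allow("research_group_b", "read")
print(repo.check("research_group_b", "read"))   # True
print(repo.check("research_group_b", "write"))  # False
```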
Pienaar, Rudolph; Rannou, Nicolas; Bernal, Jorge; Hahn, Daniel; Grant, P Ellen
2015-01-01
The utility of web browsers for general-purpose computing, long anticipated, is only now coming to fruition. In this paper we present a web-based medical image data and information management software platform called ChRIS ([Boston] Children's Research Integration System). ChRIS' deep functionality allows for easy retrieval of medical image data from resources typically found in hospitals, organizes and presents information in a modern feed-like interface, provides access to a growing library of plugins that process these data (typically on a connected High Performance Compute cluster), allows for easy data sharing between users and instances of ChRIS, and provides powerful 3D visualization and real-time collaboration.
EFEHR - the European Facilities for Earthquake Hazard and Risk: beyond the web-platform
NASA Astrophysics Data System (ADS)
Danciu, Laurentiu; Wiemer, Stefan; Haslinger, Florian; Kastli, Philipp; Giardini, Domenico
2017-04-01
European Facilities for Earthquake Hazard and Risk (EFEHR) represents the sustainable community resource for seismic hazard and risk in Europe. The EFEHR web platform is the main gateway to access data, models and tools, as well as to provide expertise relevant for the assessment of seismic hazard and risk. The main services (databases and web platform) are hosted at ETH Zurich and operated by the Swiss Seismological Service (Schweizerischer Erdbebendienst, SED). The EFEHR web portal (www.efehr.org) collects and displays (i) harmonized datasets necessary for hazard and risk modeling, e.g. seismic catalogues, fault compilations, site amplifications, vulnerabilities, inventories; (ii) extensive seismic hazard products, namely hazard curves, uniform hazard spectra and maps for national and regional assessments; (iii) standardized configuration files for re-computing the regional seismic hazard models; and (iv) relevant documentation of harmonized datasets, models and web services. Today, EFEHR distributes the full output of the 2013 European Seismic Hazard Model, ESHM13, as developed within the SHARE project (http://www.share-eu.org/); the latest results of the 2014 Earthquake Model of the Middle East (EMME14), derived within the EMME Project (www.emme-gem.org); the 2001 Global Seismic Hazard Assessment Project (GSHAP) results; and the 2015 updates of the Swiss Seismic Hazard. New datasets related to either seismic hazard or risk will be incorporated as they become available. We present the current status of the EFEHR platform, with focus on the challenges, summaries of the up-to-date datasets, user experience and feedback, as well as the roadmap to future technological innovation beyond the web-platform development. We also show the new services foreseen to fully integrate with the seismological core services of the European Plate Observing System (EPOS).
E-learning: controlling costs and increasing value.
Walsh, Kieran
2015-04-01
E-learning now accounts for a substantial proportion of medical education provision. This progress has required significant investment, and this investment has in turn come under increasing scrutiny so that the costs of e-learning may be controlled and its returns maximised. There are multiple methods by which the costs of e-learning can be controlled and its returns maximised. This short paper reviews some of those methods that are likely to be most effective and that are likely to save costs without compromising quality. Methods might include accessing free or low-cost resources from elsewhere; creating short learning resources that will work on multiple devices; using open source platforms to host content; using in-house faculty to create content; sharing resources between institutions; and promoting resources to ensure high usage. Whatever methods are used to control costs or increase value, it is most important to evaluate the impact of these methods.
Design and implementation of a telecare information platform.
Li, Shing-Han; Wang, Ching-Yao; Lu, Wen-Hui; Lin, Yuan-Yuan; Yen, David C
2012-06-01
For the aging population and for people with dominant chronic diseases, countries all over the world are promoting an "Aging in Place" program with its primary focus on the implementation of telecare. In 2009, Taiwan held a "Health Care Value-Added Platinum Program" with the goal of promoting the development of "Telecare" services by integrating medical treatment, healthcare, information communication, medical equipment and materials and by linking related cross-discipline professions to enable people to familiarize themselves with preventive healthcare services offered in their household and community environments. In addition, this program can be utilized to effectively provide diversified healthcare service benefitting society as a whole. This study aims to promote a diversified telecare service network in Taiwan's household and community environments, establish telecare information platforms, build an internal network of various healthcare service modes, standardize externally interfacing telecare information networks, effectively utilize related healthcare service resources, and complete reasonable service resource links forming an up-to-date health information exchange network. To this end, the telecare information platform based on service oriented architecture (SOA) is designed to promote an open telecare information interface and sharing environment to assist in such tasks as developing healthcare information exchange services, integrating service resources among various different healthcare service modes, accessing externally complex community affairs information, supporting remote physiological information transmissions, and providing diversified remote innovative services. Information system architecture and system monitoring indices of various types of healthcare service modes are used for system integrations for future development and/or expansions.
NASA Astrophysics Data System (ADS)
Yu, Bailang; Wu, Jianping
2006-10-01
Spatial Information Grid (SIG) is an infrastructure with the ability to provide spatial information services according to users' needs by collecting, sharing, organizing and processing massive, distributed spatial information resources. This paper presents the architecture, technologies and implementation of the Shanghai City Spatial Information Application and Service System, a SIG-based platform, which is an integrated platform serving the administration, planning, construction and development of the city. In the System, there are ten categories of spatial information resources, including city planning, land-use, real estate, river system, transportation, municipal facility construction, environment protection, sanitation, urban afforestation and basic geographic information data. In addition, spatial information processing services are offered as GIS Web Services. The resources and services are all distributed across different web-based nodes. A single database is created to store the metadata of all the spatial information. A portal site is published as the main user interface of the System. There are three main functions in the portal site. First, users can search the metadata and consequently acquire the distributed data by using the search results. Second, some spatial processing web applications developed with GIS Web Services, such as file format conversion, spatial coordinate transfer, cartographic generalization and spatial analysis, are offered for use. Third, GIS Web Services currently available in the System can be searched and new ones can be registered. The System has been working efficiently in the Shanghai Government Network since 2005.
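One of the processing services listed, spatial coordinate transfer, reduces locally to a projection transform. The sketch below is a stand-alone illustration of that operation with pyproj, not the System's actual web service; the coordinates are an approximate point in Shanghai chosen for the example.

```python
# Local illustration of a coordinate-transfer operation with pyproj
# (the System exposes this kind of processing as a GIS Web Service).
from pyproj import Transformer

# WGS84 lon/lat -> UTM zone 51N, the zone covering Shanghai.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:32651", always_xy=True)
lon, lat = 121.47, 31.23                 # approximate centre of Shanghai
easting, northing = transformer.transform(lon, lat)
print(f"{easting:.1f} E, {northing:.1f} N")
```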
Johnson, Jane; Collins, Teresa; Degeling, Christopher; Fawcett, Anne; Fisher, Andrew D; Freire, Rafael; Hazel, Susan J; Hood, Jennifer; Lloyd, Janice; Phillips, Clive J C; Stafford, Kevin; Tzioumis, Vicky; McGreevy, Paul D
2015-05-29
The need for undergraduate teaching of Animal Welfare and Ethics (AWE) in Australian and New Zealand veterinary courses reflects increasing community concerns and expectations about AWE; global pressures regarding food security and sustainability; the demands of veterinary accreditation; and fears that, unless students encounter AWE as part of their formal education, as veterinarians they will be relatively unaware of the discipline of animal welfare science. To address this need we are developing online resources to ensure Australian and New Zealand veterinary graduates have the knowledge, and the research, communication and critical reasoning skills, to fulfill the AWE role demanded of them by contemporary society. To prioritize development of these resources we assembled leaders in the field of AWE education from the eight veterinary schools in Australia and New Zealand and used modified deliberative polling. This paper describes the role of the poll in developing the first shared online curriculum resource for veterinary undergraduate learning and teaching in AWE in Australia and New Zealand. The learning and teaching strategies that ranked highest in the exercise were: scenario-based learning; a quality of animal life assessment tool; the so-called 'Human Continuum' discussion platform; and a negotiated curriculum.
NASA Astrophysics Data System (ADS)
Seber, D.; Baru, C.
2007-05-01
The Geosciences Network (GEON) project is a collaboration among multiple institutions to develop a cyberinfrastructure (CI) platform in support of integrative geoscience research activities. Taking advantage of state-of-the-art information technology resources, GEON researchers are building a cyberinfrastructure designed to enable data sharing, resource discovery, semantic data integration, high-end computations and 4D visualization in an easy-to-use web-based environment. The cyberinfrastructure in GEON is required to support an inherently distributed system, since the scientists, who are users as well as providers of resources, are themselves distributed. International collaborations are a natural extension of GEON; geoscience research requires strong international collaborations. The goals of the iGEON activities are to collaborate with international partners and jointly build a cyberinfrastructure for the geosciences to enable collaborative work environments. International partners can participate in GEON efforts, establish GEON nodes at their universities, institutes, or agencies and also contribute data and tools to the network. Via jointly run cyberinfrastructure workshops, the GEON team also introduces students, scientists, and research professionals to the concepts of IT-based geoscience research and education. Currently, joint activities are underway with the Chinese Academy of Sciences in China, the GEO Grid project at AIST in Japan, and the University of Hyderabad in India (where the activity is funded by the Indo-US Science and Technology Forum). Several other potential international partnerships are under consideration. iGEON is open to all international partners who are interested in working towards the goal of data sharing, management and integration via IT-based platforms. Information about GEON and its international activities can be found at http://www.geongrid.org/
NASA Astrophysics Data System (ADS)
Rose, K.; Rowan, C.; Rager, D.; Dehlin, M.; Baker, D. V.; McIntyre, D.
2015-12-01
Multi-organizational research teams working jointly on projects often encounter problems with discovery, access to relevant existing resources, and data sharing due to large file sizes, inappropriate file formats, or other inefficient options that make collaboration difficult. The Energy Data eXchange (EDX) from the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is an evolving online research environment designed to overcome these challenges in support of DOE's fossil energy goals, while offering improved access to data-driven products of fossil energy R&D such as datasets, tools, and web applications. Development of NETL's Energy Data eXchange (EDX) was initiated in 2011; it offers i) a means of better preserving NETL's research and development products for future access and re-use, ii) efficient, discoverable access to authoritative, relevant external resources, and iii) an improved approach and tools to support secure, private collaboration and coordination among multi-organizational teams to meet DOE missions and goals. EDX presently supports fossil energy and SubTER Crosscut research activities, with an ever-growing user base. EDX is built on a heavily customized instance of the open source platform, Comprehensive Knowledge Archive Network (CKAN). EDX connects users to externally relevant data and tools by linking to external data repositories built on different platforms as well as other CKAN-based platforms (e.g., Data.gov). EDX does not download and repost data or tools that already have an online presence, since doing so would introduce redundancy and even error. If a relevant resource is already hosted by another online entity, EDX points users to that external host using web services, inventoried URLs, and other methods. EDX offers users the ability to leverage private, secure capabilities custom-built into the system. The team is presently working on version 3 of EDX, which will incorporate big data analytical capabilities amongst other advanced features.
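Since EDX is built on CKAN, its catalog can in principle be searched through CKAN's standard action API. The sketch below uses the generic package_search action against a placeholder site URL, because EDX's own endpoints and access rules are not given in the abstract.

```python
# Sketch: search a CKAN-based catalog via the standard action API.
# SITE is a placeholder; EDX's exact endpoints are not stated in the abstract.
import requests

SITE = "https://demo.ckan.org"      # placeholder CKAN instance
resp = requests.get(
    f"{SITE}/api/3/action/package_search",
    params={"q": "carbon storage", "rows": 5},
    timeout=30,
)
resp.raise_for_status()
for ds in resp.json()["result"]["results"]:
    print(ds["name"], "-", ds.get("title"))
```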
World Water Online (WWO) Status and Prospects
NASA Astrophysics Data System (ADS)
Arctur, David; Maidment, David
2013-04-01
Water resources, weather, and natural disasters are not constrained by local, regional or national boundaries. Effective research, planning, and response to major events call for improved coordination and data sharing among many organizations, which requires improved interoperability among the organizations' diverse information systems. Just for the historical time series records of surface freshwater resources data compiled by U.S. national agencies, there are over 23 million distributed datasets available today. Cataloguing and searching efficiently for specific content from this many datasets presents a challenge to current standards and practices for digital geospatial catalogues. This presentation summarizes a new global platform for water resource information discovery and sharing, that provides coordinated, interactive access to water resource metadata for the complete holdings of the Global Runoff Data Centre, the U.S. Geological Survey, and other primary sources. In cases where the data holdings are not restricted by national policy, this interface enables direct access to the water resource data, hydrographs, and other derived products. This capability represents a framework in which any number of other services can be integrated in user-accessible workflows, such as to perform watershed delineation from any point on the stream network. World Water Online web services for mapping and metadata have been registered with GEOSS. In addition to summarizing the architecture and capabilities of World Water Online, future plans for integration with GEOSS and EarthCube will be presented.
CIAN - Cell Imaging and Analysis Network at the Biology Department of McGill University
Lacoste, J.; Lesage, G.; Bunnell, S.; Han, H.; Küster-Schöck, E.
2010-01-01
The Cell Imaging and Analysis Network (CIAN) provides services and tools to researchers in the field of cell biology from within or outside Montreal's McGill University community. CIAN is composed of six scientific platforms: Cell Imaging (confocal and fluorescence microscopy), Proteomics (2-D protein gel electrophoresis and DiGE, fluorescent protein analysis), Automation and High-throughput Screening (pinning robot and liquid handler), Protein Expression for Antibody Production, Genomics (real-time PCR), and Data Storage and Analysis (cluster, server, and workstations). Users submit project proposals, and can obtain training and consultation in any aspect of the facility, or initiate projects with the full-service platforms. CIAN is designed to facilitate training, enhance interactions, as well as share and maintain resources and expertise.
Virtual Patients on the Semantic Web: A Proof-of-Application Study
Dafli, Eleni; Antoniou, Panagiotis; Ioannidis, Lazaros; Dombros, Nicholas; Topps, David
2015-01-01
Background Virtual patients are interactive computer simulations that are increasingly used as learning activities in modern health care education, especially in teaching clinical decision making. A key challenge is how to retrieve and repurpose virtual patients as unique types of educational resources between different platforms because of the lack of standardized content-retrieving and repurposing mechanisms. Semantic Web technologies provide the capability, through structured information, for easy retrieval, reuse, repurposing, and exchange of virtual patients between different systems. Objective An attempt to address this challenge has been made through the mEducator Best Practice Network, which provisioned frameworks for the discovery, retrieval, sharing, and reuse of medical educational resources. We have extended the OpenLabyrinth virtual patient authoring and deployment platform to facilitate the repurposing and retrieval of existing virtual patient material. Methods A standalone Web distribution and Web interface, which contains an extension for the OpenLabyrinth virtual patient authoring system, was implemented. This extension was designed to semantically annotate virtual patients to facilitate intelligent searches, complex queries, and easy exchange between institutions. The OpenLabyrinth extension enables OpenLabyrinth authors to integrate and share virtual patient case metadata within the mEducator3.0 network. Evaluation included 3 successive steps: (1) expert reviews; (2) evaluation of the ability of health care professionals and medical students to create, share, and exchange virtual patients through specific scenarios in extended OpenLabyrinth (OLabX); and (3) evaluation of the repurposed learning objects that emerged from the procedure. Results We evaluated 30 repurposed virtual patient cases. The evaluation, with a total of 98 participants, demonstrated the system’s main strength: the core repurposing capacity. The extensive metadata schema presentation facilitated user exploration and filtering of resources. Usability weaknesses were primarily related to standard computer applications’ ease of use provisions. Most evaluators provided positive feedback regarding educational experiences on both content and system usability. Evaluation results replicated across several independent evaluation events. Conclusions The OpenLabyrinth extension, as part of the semantic mEducator3.0 approach, is a virtual patient sharing approach that builds on a collection of Semantic Web services and federates existing sources of clinical and educational data. It is an effective sharing tool for virtual patients and has been merged into the next version of the app (OpenLabyrinth 3.3). Such tool extensions may enhance the medical education arsenal with capacities of creating simulation/game-based learning episodes, massive open online courses, curricular transformations, and a future robust infrastructure for enabling mobile learning. PMID:25616272
Virtual patients on the semantic Web: a proof-of-application study.
Dafli, Eleni; Antoniou, Panagiotis; Ioannidis, Lazaros; Dombros, Nicholas; Topps, David; Bamidis, Panagiotis D
2015-01-22
Virtual patients are interactive computer simulations that are increasingly used as learning activities in modern health care education, especially in teaching clinical decision making. A key challenge is how to retrieve and repurpose virtual patients as unique types of educational resources between different platforms because of the lack of standardized content-retrieving and repurposing mechanisms. Semantic Web technologies provide the capability, through structured information, for easy retrieval, reuse, repurposing, and exchange of virtual patients between different systems. An attempt to address this challenge has been made through the mEducator Best Practice Network, which provisioned frameworks for the discovery, retrieval, sharing, and reuse of medical educational resources. We have extended the OpenLabyrinth virtual patient authoring and deployment platform to facilitate the repurposing and retrieval of existing virtual patient material. A standalone Web distribution and Web interface, which contains an extension for the OpenLabyrinth virtual patient authoring system, was implemented. This extension was designed to semantically annotate virtual patients to facilitate intelligent searches, complex queries, and easy exchange between institutions. The OpenLabyrinth extension enables OpenLabyrinth authors to integrate and share virtual patient case metadata within the mEducator3.0 network. Evaluation included 3 successive steps: (1) expert reviews; (2) evaluation of the ability of health care professionals and medical students to create, share, and exchange virtual patients through specific scenarios in extended OpenLabyrinth (OLabX); and (3) evaluation of the repurposed learning objects that emerged from the procedure. We evaluated 30 repurposed virtual patient cases. The evaluation, with a total of 98 participants, demonstrated the system's main strength: the core repurposing capacity. The extensive metadata schema presentation facilitated user exploration and filtering of resources. Usability weaknesses were primarily related to standard computer applications' ease of use provisions. Most evaluators provided positive feedback regarding educational experiences on both content and system usability. Evaluation results replicated across several independent evaluation events. The OpenLabyrinth extension, as part of the semantic mEducator3.0 approach, is a virtual patient sharing approach that builds on a collection of Semantic Web services and federates existing sources of clinical and educational data. It is an effective sharing tool for virtual patients and has been merged into the next version of the app (OpenLabyrinth 3.3). Such tool extensions may enhance the medical education arsenal with capacities of creating simulation/game-based learning episodes, massive open online courses, curricular transformations, and a future robust infrastructure for enabling mobile learning.
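The kind of semantic annotation described in the two records above, attaching structured metadata to a virtual patient so it can be queried and exchanged between systems, can be sketched with rdflib. The namespace and property names below are invented placeholders rather than the actual mEducator schema.

```python
# Sketch of semantic annotation of a virtual patient with rdflib.
# The namespace and property names are placeholders, not the mEducator schema.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

EX = Namespace("http://example.org/vp#")
g = Graph()

vp = URIRef("http://example.org/vp/case-042")
g.add((vp, RDF.type, EX.VirtualPatient))
g.add((vp, DCTERMS.title, Literal("Acute chest pain scenario")))
g.add((vp, DCTERMS.language, Literal("en")))
g.add((vp, EX.discipline, Literal("cardiology")))

# A SPARQL query of the kind a repurposing search might issue.
q = """SELECT ?case ?title WHERE {
         ?case a <http://example.org/vp#VirtualPatient> ;
               <http://purl.org/dc/terms/title> ?title .
       }"""
for row in g.query(q):
    print(row.case, row.title)
```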
Daniels, Felicity M; Khanyile, Thembisile D
2013-09-01
A fundamental purpose of mergers between higher education institutions (HEIs) in 2002 was to enable sharing of scarce resources between more advanced universities and those historically disadvantaged by the apartheid system of the South African Government. A common teaching platform for undergraduate nursing education in the Western Cape was established in 2005, in line with the transformation of the higher education system, as a collaborative initiative between three universities. In order to evaluate the common teaching platform, Stufflebeam's context, input, process, product (CIPP) research model was employed. A sample of 108 participants was selected through stratified purposive sampling, and included three deputy vice-chancellors, three deans, three heads of department, 18 lecturers and 81 students. Semi-structured interviews were held with the staff members, whilst the students participated in focus group interviews. Open-ended questions informed by literature and the CIPP evaluation model were developed and used to guide the interviews. This enabled the researcher to obtain a rich description of the participants' experiences. The data were analysed inductively. The results revealed that the main purpose of collaboration was not achieved due to the lack of a common understanding of the concept of collaboration and its purpose; a lack of readiness to collaborate; and a lack of sharing of resources. A framework for effective collaboration was developed based on the results. Copyright © 2012 Elsevier Ltd. All rights reserved.
Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets.
Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L
2014-01-01
As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze it. Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Consolidation of cloud computing in ATLAS
NASA Astrophysics Data System (ADS)
Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration
2017-10-01
Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.
Knowledge Acquisition and Management for the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Nemani, R. R.
2013-12-01
NASA Earth Exchange (NEX) is a data, computing and knowledge collaboratory that houses NASA satellite, climate and ancillary data, where a focused community can come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform with access to large supercomputing resources. As more and more projects are executed on NEX, we are increasingly focusing on capturing the knowledge of the NEX users and on providing mechanisms for sharing it with the community in order to facilitate reuse and accelerate research. There are many possible knowledge contributions to NEX: a wiki entry on the NEX portal contributed by a developer, information extracted from a publication in an automated way, or a workflow captured during code execution on the supercomputing platform. The goal of the NEX knowledge platform is to capture and organize this information and make it easily accessible to the NEX community and beyond. The knowledge acquisition process consists of three main facets - data and metadata, workflows and processes, and web-based information. Once the knowledge is acquired, it is processed in a number of ways, ranging from custom metadata parsers to entity extraction using natural language processing techniques. The processed information is linked with existing taxonomies and aligned with an internal ontology (which heavily reuses a number of external ontologies). This forms a knowledge graph that can then be used to improve users' search query results as well as provide additional analytics capabilities to the NEX system. Such a knowledge graph will be an important building block in creating a dynamic knowledge base for the NEX community where knowledge is both generated and easily shared.
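The linking of extracted entities into a queryable knowledge graph can be illustrated with a small directed graph; the entities and relations below are invented examples, not NEX's actual ontology or data.

```python
# Toy knowledge-graph sketch (invented entities, not the NEX ontology).
import networkx as nx

kg = nx.MultiDiGraph()
kg.add_edge("MOD13Q1 NDVI", "vegetation index", relation="is_a")
kg.add_edge("MOD13Q1 NDVI", "MODIS", relation="derived_from")
kg.add_edge("drought analysis workflow", "MOD13Q1 NDVI", relation="uses")

# A search-time expansion: what else relates to the dataset a user queried?
for _, target, data in kg.out_edges("MOD13Q1 NDVI", data=True):
    print(f"MOD13Q1 NDVI --{data['relation']}--> {target}")
for source, _, data in kg.in_edges("MOD13Q1 NDVI", data=True):
    print(f"{source} --{data['relation']}--> MOD13Q1 NDVI")
```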
Support for Taverna workflows in the VPH-Share cloud platform.
Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F
2017-07-01
To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical & bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. 1) Seamless integration of VPH-Share with other components and systems. 2) Extended range of different tools for workflows. 3) Successful integration of scientific workflows from other VPH projects. 4) Execution speed improvement for medical applications. The presented workflow integration provides VPH-Share users with a wide range of different possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere engenders the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, we acknowledge that further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.
Partnership For Edge Physics Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parashar, Manish
In this effort, we will extend our prior work as part of CPES (i.e., DART and DataSpaces) to support in-situ tight coupling between application codes that exploits data locality and core-level parallelism to maximize on-chip data exchange and reuse. This will be accomplished by mapping coupled simulations so that the data exchanges are more localized within the nodes. Coupled simulation workflows can more effectively utilize the resources available on emerging HEC platforms if they can be mapped and executed to exploit data locality as well as the communication patterns between application components. Scheduling and running such workflows requires an extended framework that provides a unified hybrid abstraction to enable coordination and data sharing across computation tasks that run on heterogeneous multi-core-based systems, along with a data-locality-based dynamic task scheduling approach to increase on-chip or intra-node data exchanges and in-situ execution. This effort will extend our prior work as part of CPES (i.e., DART and DataSpaces), which provided a simple virtual shared-space abstraction hosted at the staging nodes, to support application coordination, data sharing and active data processing services. Moreover, it will transparently manage the low-level operations associated with inter-application data exchange, such as data redistributions, and will enable running coupled simulation workflows on multi-core computing platforms.
Zens, Martin; Grotejohann, Birgit; Tassoni, Adrian; Duttenhoefer, Fabian; Südkamp, Norbert P; Niemeyer, Philipp
2017-05-23
Observational studies have proven to be a valuable resource in medical research, especially when performed on a large scale. Recently, mobile device-based observational studies have been discovered by an increasing number of researchers as a promising new source of information. However, the development and deployment of app-based studies is not trivial and requires profound programming skills. The aim of this project was to develop a modular online research platform that allows researchers to create medical studies for mobile devices without extensive programming skills. The modular research platform consists of three major components. A Web-based platform forms the researchers' main workplace. This platform communicates via a shared database with a platform-independent mobile app. Furthermore, a separate Web-based login platform for physicians and other health care professionals is outlined and completes the concept. A prototype of the research platform has been developed and is currently in beta testing. Simple questionnaire studies can be created within minutes and published for testing purposes. Screenshots of an example study are provided, and the general working principle is presented. In this project, we have created the basis for a novel research platform. The necessity and implications of a modular approach were discussed and an outline for future development given. International researchers are invited and encouraged to participate in this ongoing project. ©Martin Zens, Birgit Grotejohann, Adrian Tassoni, Fabian Duttenhoefer, Norbert P Südkamp, Philipp Niemeyer. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 23.05.2017.
Navigating the changing learning landscape: perspective from bioinformatics.ca
Ouellette, B. F. Francis
2013-01-01
With the advent of YouTube channels in bioinformatics, open platforms for problem solving in bioinformatics, active web forums in computing analyses and online resources for learning to code or use a bioinformatics tool, the more traditional continuing education bioinformatics training programs have had to adapt. Bioinformatics training programs that solely rely on traditional didactic methods are being superseded by these newer resources. Yet such face-to-face instruction is still invaluable in the learning continuum. Bioinformatics.ca, which hosts the Canadian Bioinformatics Workshops, has blended more traditional learning styles with current online and social learning styles. Here we share our growing experiences over the past 12 years and look toward what the future holds for bioinformatics training programs. PMID:23515468
Castillo, Andreina I; Nelson, Andrew D L; Haug-Baltzell, Asher K; Lyons, Eric
2018-01-01
Abstract Integrated platforms for storage, management, analysis and sharing of large quantities of omics data have become fundamental to comparative genomics. CoGe (https://genomevolution.org/coge/) is an online platform designed to manage and study genomic data, enabling both data- and hypothesis-driven comparative genomics. CoGe’s tools and resources can be used to organize and analyse both publicly available and private genomic data from any species. Here, we demonstrate the capabilities of CoGe through three example workflows using 17 Plasmodium genomes as a model. Plasmodium genomes present unique challenges for comparative genomics due to their rapidly evolving and highly variable genomic AT/GC content. These example workflows are intended to serve as templates to help guide researchers who would like to use CoGe to examine diverse aspects of genome evolution. In the first workflow, trends in genome composition and amino acid usage are explored. In the second, changes in genome structure and the distribution of synonymous (Ks) and non-synonymous (Kn) substitution values are evaluated across species with different levels of evolutionary relatedness. In the third workflow, microsyntenic analyses of multigene families’ genomic organization are conducted using two Plasmodium-specific gene families—serine repeat antigen, and cytoadherence-linked asexual gene—as models. In general, these example workflows show how to achieve quick, reproducible and shareable results using the CoGe platform. We were able to replicate previously published results, as well as leverage CoGe’s tools and resources to gain additional insight into various aspects of Plasmodium genome evolution. Our results highlight the usefulness of the CoGe platform, particularly in understanding complex features of genome evolution. Database URL: https://genomevolution.org/coge/
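The first example workflow (trends in genome composition) can be reproduced in spirit offline with a short script that computes AT content in fixed-size windows of a genome sequence. The sketch below is only illustrative; the input file name is hypothetical, and CoGe itself runs such analyses through its web interface.

```python
# Offline illustration of the genome-composition workflow: AT content in windows
# of a FASTA sequence. The input file name is a placeholder.
def read_fasta_sequence(path):
    """Concatenate all sequence lines of a (single- or multi-record) FASTA file."""
    seq = []
    with open(path) as handle:
        for line in handle:
            if not line.startswith(">"):
                seq.append(line.strip().upper())
    return "".join(seq)


def at_content_by_window(seq, window=100_000):
    """Yield (window start, AT fraction) pairs over the sequence."""
    for start in range(0, len(seq), window):
        chunk = seq[start:start + window]
        if chunk:
            at = sum(chunk.count(base) for base in "AT")
            yield start, at / len(chunk)


if __name__ == "__main__":
    sequence = read_fasta_sequence("plasmodium_chr1.fasta")  # hypothetical input
    for start, frac in at_content_by_window(sequence):
        print(f"{start}\t{frac:.3f}")
```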
Challenges to complete and useful data sharing.
Mbuagbaw, Lawrence; Foster, Gary; Cheng, Ji; Thabane, Lehana
2017-02-14
Data sharing from clinical trials is one way of promoting fair and transparent conduct of clinical trials. It would maximise the use of data and permit the exploration of additional hypotheses. On the other hand, the quality of secondary analyses cannot always be ascertained, and it may be unfair to investigators who have expended resources to collect data to bear the additional burden of sharing. As the discussion on the best modalities of sharing data evolves, some of the practical issues that may arise need to be addressed. In this paper, we discuss issues which impede the use of data even when sharing should be possible: (1) multicentre studies requiring consent from all the investigators in each centre; (2) remote access platforms with software limitations and Internet requirements; (3) on-site data analysis when data cannot be moved; (4) governing bodies for data generated in one jurisdiction and analysed in another; (5) using programmatic data collected as part of routine care; (6) data collected in multiple languages; (7) poor data quality. We believe these issues apply to all primary data and cause undue difficulties in conducting analysis even when there is some willingness to share. They can be avoided by anticipating the possibility of sharing any clinical data and pre-emptively removing or addressing restrictions that limit complete sharing. These issues should be part of the data sharing discussion.
A picture tells a thousand words: A content analysis of concussion-related images online.
Ahmed, Osman H; Lee, Hopin; Struik, Laura L
2016-09-01
Recently image-sharing social media platforms have become a popular medium for sharing health-related images and associated information. However within the field of sports medicine, and more specifically sports related concussion, the content of images and meta-data shared through these popular platforms have not been investigated. The aim of this study was to analyse the content of concussion-related images and its accompanying meta-data on image-sharing social media platforms. We retrieved 300 images from Pinterest, Instagram and Flickr by using a standardised search strategy. All images were screened and duplicate images were removed. We excluded images if they were: non-static images; illustrations; animations; or screenshots. The content and characteristics of each image was evaluated using a customised coding scheme to determine major content themes, and images were referenced to the current international concussion management guidelines. From 300 potentially relevant images, 176 images were included for analysis; 70 from Pinterest, 63 from Flickr, and 43 from Instagram. Most images were of another person or a scene (64%), with the primary content depicting injured individuals (39%). The primary purposes of the images were to share a concussion-related incident (33%) and to dispense education (19%). For those images where it could be evaluated, the majority (91%) were found to reflect the Sports Concussion Assessment Tool 3 (SCAT3) guidelines. The ability to rapidly disseminate rich information though photos, images, and infographics to a wide-reaching audience suggests that image-sharing social media platforms could be used as an effective communication tool for sports concussion. Public health strategies could direct educative content to targeted populations via the use of image-sharing platforms. Further research is required to understand how image-sharing platforms can be used to effectively relay evidence-based information to patients and sports medicine clinicians. Copyright © 2016 Elsevier Ltd. All rights reserved.
Web Platform for Sharing Modeling Software in the Field of Nonlinear Optics
NASA Astrophysics Data System (ADS)
Dubenskaya, Julia; Kryukov, Alexander; Demichev, Andrey
2018-02-01
We describe the prototype of a Web platform intended for sharing software programs for computer modeling in the rapidly developing field of nonlinear optics phenomena. The suggested platform is built on top of the HUBZero open-source middleware. In addition to the basic HUBZero installation, we added to our platform the capability to run Docker containers via an external application server and to send calculation programs to those containers for execution. The presented web platform provides a wide range of features and might be of benefit to nonlinear optics researchers.
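The container-execution step described above can be sketched with the Docker SDK for Python. The image name, command, and mounted job directory below are placeholders, and the actual platform dispatches jobs through an external application server rather than directly from the web tier.

```python
# Hedged sketch of running a user calculation inside a container with the Docker
# SDK for Python. Image name, command and paths are placeholders, not the
# platform's real configuration.
import docker

client = docker.from_env()

# Run a (hypothetical) nonlinear-optics solver image with a user-supplied script
# mounted read-only into the container, capturing its standard output.
output = client.containers.run(
    image="nlo-solver:latest",                       # placeholder image name
    command=["python", "/work/run_model.py"],
    volumes={"/srv/jobs/job42": {"bind": "/work", "mode": "ro"}},  # placeholder path
    remove=True,                                     # clean up the container afterwards
)
print(output.decode())
```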
Linking earth science informatics resources into uninterrupted digital value chains
NASA Astrophysics Data System (ADS)
Woodcock, Robert; Angreani, Rini; Cox, Simon; Fraser, Ryan; Golodoniuc, Pavel; Klump, Jens; Rankine, Terry; Robertson, Jess; Vote, Josh
2015-04-01
The CSIRO Mineral Resources Flagship was established to tackle medium- to long-term challenges facing the Australian mineral industry across the value chain from exploration and mining through mineral processing within the framework of an economically, environmentally and socially sustainable minerals industry. This broad portfolio demands collaboration and data exchange with a broad range of participants and data providers across government, research and industry. It is an ideal environment to link geoscience informatics platforms to application across the resource extraction industry and to unlock the value of data integration between traditionally discrete parts of the minerals digital value chain. Despite the potential benefits, data integration remains an elusive goal within research and industry. Many projects use only a subset of available data types in an integrated manner, often maintaining the traditional discipline-based data 'silos'. Integrating data across the entire minerals digital value chain is an expensive proposition involving multiple disciplines and, significantly, multiple data sources both internal and external to any single organisation. Differing vocabularies and data formats, along with access regimes to appropriate analysis software and equipment all hamper the sharing and exchange of information. AuScope has addressed the challenge of data exchange across organisations nationally, and established a national geosciences information infrastructure using open standards-based web services. Federated across a wide variety of organisations, the resulting infrastructure contains a wide variety of live and updated data types. The community data standards and infrastructure platforms that underpin AuScope provide important new datasets and multi-agency links independent of software and hardware differences. AuScope has thus created an infrastructure, a platform of technologies and the opportunity for new ways of working with and integrating disparate data at much lower cost. An early example of this approach is the value generated by combining geological and metallurgical data sets as part of the rapidly growing field of geometallurgy. This not only provides a far better understanding of the impact of geological variability on ore processing but also leads to new thinking on the types and characteristics of data sets collected at various stages of the exploration and mining process. The Minerals Resources Flagship is linking its research activities to the AuScope infrastructure, exploiting the technology internally to create a platform for integrated research across the minerals value chain and improved interaction with industry. Referred to as the 'Early Access Virtual Lab', the system will be fully interoperable with AuScope and international infrastructures using open standards like GeosciML. Secured access is provided to allow confidential collaboration with industry when required. This presentation will discuss how the CSIRO Mineral Resources Flagship is building on the AuScope infrastructure to transform the way that data and data products are identified, shared, integrated, and reused, to unlock the benefits of true integration of research efforts across the minerals digital value chain.
Ren, Shenghan; Chen, Xueli; Wang, Hailong; Qu, Xiaochao; Wang, Ge; Liang, Jimin; Tian, Jie
2013-01-01
The study of light propagation in turbid media has attracted extensive attention in the field of biomedical optical molecular imaging. In this paper, we present a software platform for the simulation of light propagation in turbid media named the “Molecular Optical Simulation Environment (MOSE)”. Based on the gold standard of the Monte Carlo method, MOSE simulates light propagation both in tissues with complicated structures and through free-space. In particular, MOSE synthesizes realistic data for bioluminescence tomography (BLT), fluorescence molecular tomography (FMT), and diffuse optical tomography (DOT). The user-friendly interface and powerful visualization tools facilitate data analysis and system evaluation. As a major measure for resource sharing and reproducible research, MOSE aims to provide freeware for research and educational institutions, which can be downloaded at http://www.mosetm.net. PMID:23577215
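The core Monte Carlo idea behind such simulators can be illustrated in a few lines of Python: photon packets take exponentially distributed free paths, deposit a fraction of their weight at each interaction, and scatter into a new direction. The sketch below uses isotropic scattering in an infinite homogeneous medium with assumed optical coefficients, a deliberate simplification of what MOSE actually models (heterogeneous tissue, Henyey-Greenstein scattering, free-space propagation).

```python
# Toy Monte Carlo photon transport in a homogeneous turbid medium, purely for
# illustration of the method class; coefficients and scattering model are assumed.
import numpy as np

rng = np.random.default_rng(0)

mu_a, mu_s = 0.1, 10.0          # assumed absorption / scattering coefficients (1/mm)
mu_t = mu_a + mu_s
albedo = mu_s / mu_t


def propagate_photon(max_steps=1000, weight_cutoff=1e-4):
    """Return the absorption-weighted mean depth of one photon packet's walk."""
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])    # launched along +z
    weight = 1.0
    weighted_depth = 0.0
    for _ in range(max_steps):
        step = -np.log(rng.random()) / mu_t   # free path from exponential distribution
        pos = pos + step * direction
        absorbed = weight * (1 - albedo)      # weight deposited at this interaction
        weighted_depth += absorbed * pos[2]
        weight *= albedo
        if weight < weight_cutoff:
            break
        # isotropic scattering: new direction drawn uniformly on the unit sphere
        cos_t = 2 * rng.random() - 1
        phi = 2 * np.pi * rng.random()
        sin_t = np.sqrt(1 - cos_t ** 2)
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return weighted_depth


depths = [propagate_photon() for _ in range(2000)]
print(f"mean absorption-weighted depth: {np.mean(depths):.2f} mm")
```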
Cloud BioLinux: pre-configured and on-demand bioinformatics computing for the genomics community.
Krampis, Konstantinos; Booth, Tim; Chapman, Brad; Tiwari, Bela; Bicak, Mesude; Field, Dawn; Nelson, Karen E
2012-03-19
A steep drop in the cost of next-generation sequencing during recent years has made the technology affordable to the majority of researchers, but downstream bioinformatic analysis still poses a resource bottleneck for smaller laboratories and institutes that do not have access to substantial computational resources. Sequencing instruments are typically bundled with only the minimal processing and storage capacity required for data capture during sequencing runs. Given the scale of sequence datasets, scientific value cannot be obtained from acquiring a sequencer unless it is accompanied by an equal investment in informatics infrastructure. Cloud BioLinux is a publicly accessible Virtual Machine (VM) that enables scientists to quickly provision on-demand infrastructures for high-performance bioinformatics computing using cloud platforms. Users have instant access to a range of pre-configured command line and graphical software applications, including a full-featured desktop interface, documentation and over 135 bioinformatics packages for applications including sequence alignment, clustering, assembly, display, editing, and phylogeny. Each tool's functionality is fully described in the documentation directly accessible from the graphical interface of the VM. Besides the Amazon EC2 cloud, we have started instances of Cloud BioLinux on a private Eucalyptus cloud installed at the J. Craig Venter Institute, and demonstrated access to the bioinformatic tools interface through a remote connection to EC2 instances from a local desktop computer. Documentation for using Cloud BioLinux on EC2 is available from our project website, while a Eucalyptus cloud image and VirtualBox Appliance is also publicly available for download and use by researchers with access to private clouds. Cloud BioLinux provides a platform for developing bioinformatics infrastructures on the cloud. An automated and configurable process builds Virtual Machines, allowing the development of highly customized versions from a shared code base. This shared community toolkit enables application specific analysis platforms on the cloud by minimizing the effort required to prepare and maintain them.
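Provisioning an on-demand instance of a machine image such as Cloud BioLinux on Amazon EC2 can be sketched with boto3. The AMI ID, key-pair name, and instance type below are placeholders, since the project's published image identifiers change between releases.

```python
# Hedged sketch of on-demand provisioning with boto3; substitute a real
# Cloud BioLinux AMI ID and your own key pair before use.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: a Cloud BioLinux image ID
    InstanceType="m5.xlarge",          # sized for alignment/assembly workloads
    KeyName="my-keypair",              # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"launched {instance_id}; connect over SSH or a remote desktop to use the tools")
```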
NASA Technical Reports Server (NTRS)
Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.
1992-01-01
Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.
Mougin, Christian; Azam, Didier; Caquet, Thierry; Cheviron, Nathalie; Dequiedt, Samuel; Le Galliard, Jean-François; Guillaume, Olivier; Houot, Sabine; Lacroix, Gérard; Lafolie, François; Maron, Pierre-Alain; Michniewicz, Radika; Pichot, Christian; Ranjard, Lionel; Roy, Jacques; Zeller, Bernd; Clobert, Jean; Chanzy, André
2015-10-01
The infrastructure for Analysis and Experimentation on Ecosystems (AnaEE-France) is an integrated network of the major French experimental, analytical, and modeling platforms dedicated to the biological study of continental ecosystems (aquatic and terrestrial). This infrastructure aims at understanding and predicting ecosystem dynamics under global change. AnaEE-France comprises complementary nodes offering access to the best experimental facilities and associated biological resources and data: Ecotrons, seminatural experimental platforms to manipulate terrestrial and aquatic ecosystems, in natura sites equipped for large-scale and long-term experiments. AnaEE-France also provides shared instruments and analytical platforms dedicated to environmental (micro) biology. Finally, AnaEE-France provides users with data bases and modeling tools designed to represent ecosystem dynamics and to go further in coupling ecological, agronomical, and evolutionary approaches. In particular, AnaEE-France offers adequate services to tackle the new challenges of research in ecotoxicology, positioning its various types of platforms in an ecologically advanced ecotoxicology approach. AnaEE-France is a leading international infrastructure, and it is pioneering the construction of AnaEE (Europe) infrastructure in the field of ecosystem research. AnaEE-France infrastructure is already open to the international community of scientists in the field of continental ecotoxicology.
NASA Astrophysics Data System (ADS)
Podger, G. M.; Cuddy, S. M.; Peeters, L.; Smith, T.; Bark, R. H.; Black, D. C.; Wallbrink, P.
2014-09-01
Water jurisdictions in Australia are required to prepare and implement water resource plans. In developing these plans the common goal is realising the best possible use of the water resources - maximising outcomes while minimising negative impacts. This requires managing the risks associated with assessing and balancing cultural, industrial, agricultural, social and environmental demands for water within a competitive and resource-limited environment. Recognising this, conformance to international risk management principles (ISO 31000:2009) have been embedded within the Murray-Darling Basin Plan. Yet, to date, there has been little strategic investment by water jurisdictions in bridging the gap between principle and practice. The ISO 31000 principles and the risk management framework that embodies them align well with an adaptive management paradigm within which to conduct water resource planning. They also provide an integrative framework for the development of workflows that link risk analysis with risk evaluation and mitigation (adaptation) scenarios, providing a transparent, repeatable and robust platform. This study, through a demonstration use case and a series of workflows, demonstrates to policy makers how these principles can be used to support the development of the next generation of water sharing plans in 2019. The workflows consider the uncertainty associated with climate and flow inputs, and model parameters on irrigation and hydropower production, meeting environmental flow objectives and recreational use of the water resource. The results provide insights to the risks associated with meeting a range of different objectives.
A quantitative model of application slow-down in multi-resource shared systems
Lim, Seung-Hwan; Kim, Youngjae
2016-12-26
Scheduling multiple jobs onto a platform enhances system utilization by sharing resources. The benefits from higher resource utilization include reduced cost to construct, operate, and maintain a system, including energy consumption. Maximizing these benefits comes at a price: resource contention among jobs increases job completion time. In this study, we analyze slow-downs of jobs due to contention for multiple resources in a system, referred to as the dilation factor. We observe that multiple-resource contention creates non-linear dilation factors of jobs. From this observation, we establish a general quantitative model for dilation factors of jobs in multi-resource systems. A job is characterized by vector-valued loading statistics, and the dilation factors of a job set are given by a quadratic function of their loading vectors. We demonstrate how to systematically characterize a job, maintain the data structure to calculate the dilation factor (loading matrix), and calculate the dilation factor of each job. We validate the accuracy of the model with multiple processes running on a native Linux server, virtualized servers, and with multiple MapReduce workloads co-scheduled in a cluster. Evaluation with measured data shows that the D-factor model has an error margin of less than 16%. We extended the D-factor model to capture the slow-down of applications when multiple identical resources exist, such as multi-core environments and multi-disk environments. Finally, validation results of the extended D-factor model with HPC checkpoint applications on parallel file systems show that D-factor accurately captures the slow-down of concurrent applications in such environments.
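The central modelling idea, that co-scheduled jobs dilate one another's completion time as a quadratic function of their loading vectors, can be illustrated numerically. In the sketch below the loading vectors, the interaction matrix, and the "1 + contention" form are assumptions chosen only to show the mechanism; the paper derives and validates its own loading-matrix formulation.

```python
# Illustrative quadratic contention model; all numbers and the exact functional
# form are assumptions, not the validated D-factor formulation from the paper.
import numpy as np

# loading vectors over three resources, e.g. (CPU, disk bandwidth, memory bandwidth)
jobs = {
    "cpu_bound": np.array([0.9, 0.1, 0.2]),
    "io_bound":  np.array([0.2, 0.8, 0.3]),
    "mem_bound": np.array([0.3, 0.2, 0.9]),
}

# assumed symmetric interaction matrix: how contention on each resource pair
# inflates completion time
M = np.array([
    [0.5, 0.1, 0.2],
    [0.1, 0.9, 0.1],
    [0.2, 0.1, 0.7],
])


def dilation_factor(job, co_runners):
    """Dilation >= 1: predicted completion time relative to running alone."""
    contention = sum(job @ M @ other for other in co_runners)
    return 1.0 + contention


for name, vec in jobs.items():
    others = [v for n, v in jobs.items() if n != name]
    print(f"{name}: predicted dilation {dilation_factor(vec, others):.2f}x")
```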
EcoliWiki: a wiki-based community resource for Escherichia coli
McIntosh, Brenley K.; Renfro, Daniel P.; Knapp, Gwendowlyn S.; Lairikyengbam, Chanchala R.; Liles, Nathan M.; Niu, Lili; Supak, Amanda M.; Venkatraman, Anand; Zweifel, Adrienne E.; Siegele, Deborah A.; Hu, James C.
2012-01-01
EcoliWiki is the community annotation component of the PortEco (http://porteco.org; formerly EcoliHub) project, an online data resource that integrates information on laboratory strains of Escherichia coli, its phages, plasmids and mobile genetic elements. As one of the early adopters of the wiki approach to model organism databases, EcoliWiki was designed to not only facilitate community-driven sharing of biological knowledge about E. coli as a model organism, but also to be interoperable with other data resources. EcoliWiki content currently covers genes from five laboratory E. coli strains, 21 bacteriophage genomes, F plasmid and eight transposons. EcoliWiki integrates the Mediawiki wiki platform with other open-source software tools and in-house software development to extend how wikis can be used for model organism databases. EcoliWiki can be accessed online at http://ecoliwiki.net. PMID:22064863
Johnson, Jane; Collins, Teresa; Degeling, Christopher; Fawcett, Anne; Fisher, Andrew D.; Freire, Rafael; Hazel, Susan J.; Hood, Jennifer; Lloyd, Janice; Phillips, Clive J. C.; Stafford, Kevin; Tzioumis, Vicky; McGreevy, Paul D.
2015-01-01
Simple Summary There is a need for teaching Animal Welfare and Ethics in veterinary schools and we are developing online resources to meet this need. In this paper we describe how we prioritized the development of these resources by polling experts in the field. Abstract The need for undergraduate teaching of Animal Welfare and Ethics (AWE) in Australian and New Zealand veterinary courses reflects increasing community concerns and expectations about AWE; global pressures regarding food security and sustainability; the demands of veterinary accreditation; and fears that, unless students encounter AWE as part of their formal education, as veterinarians they will be relatively unaware of the discipline of animal welfare science. To address this need we are developing online resources to ensure Australian and New Zealand veterinary graduates have the knowledge, and the research, communication and critical reasoning skills, to fulfill the AWE role demanded of them by contemporary society. To prioritize development of these resources we assembled leaders in the field of AWE education from the eight veterinary schools in Australia and New Zealand and used modified deliberative polling. This paper describes the role of the poll in developing the first shared online curriculum resource for veterinary undergraduate learning and teaching in AWE in Australia and New Zealand. The learning and teaching strategies that ranked highest in the exercise were: scenario-based learning; a quality of animal life assessment tool; the so-called ‘Human Continuum’ discussion platform; and a negotiated curriculum. PMID:26479241
Virtual Reality as a Story Telling Platform for Geoscience Communication
NASA Astrophysics Data System (ADS)
Lazar, K.; Moysey, S. M.
2017-12-01
Capturing the attention of students and the public is a critical step for increasing societal interest and literacy in earth science issues. Virtual reality (VR) provides a means for geoscience engagement that is well suited to place-based learning through exciting and immersive experiences. One approach is to create fully-immersive virtual gaming environments where players interact with physical objects, such as rock samples and outcrops, to pursue geoscience learning goals. Developing an experience like this, however, can require substantial programming expertise and resources. At the other end of the development spectrum, it is possible for anyone to create immersive virtual experiences with 360-degree imagery, which can be made interactive using easy to use VR editing software to embed videos, audio, images, and other content within the 360-degree image. Accessible editing tools like these make the creation of VR experiences something that anyone can tackle. Using the VR editor ThingLink and imagery from Google Maps, for example, we were able to create an interactive tour of the Grand Canyon, complete with embedded assessments, in a matter of hours. The true power of such platforms, however, comes from the potential to engage students as content authors to create and share stories of place that explore geoscience issues from their personal perspective. For example, we have used combinations of 360-degree images with interactive mapping and web platforms to enable students with no programming experience to create complex web apps as highly engaging story telling platforms. We highlight here examples of how we have implemented such story telling approaches with students to assess learning in courses, to share geoscience research outcomes, and to communicate issues of societal importance.
NASA Astrophysics Data System (ADS)
Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.
2009-12-01
As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, “archivable” and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by relieving users of redundantly retrieving and integrating data sets and building modeling analysis codes. The EMC platform also offers users the possibility of receiving indirect assistance from experts through prefabricated compute environments, potentially reducing study “ramp up” times.
Political economies and environmental futures for the sharing economy.
Frenken, Koen
2017-06-13
The sudden rise of the sharing economy has sparked an intense public debate about its definition, its effects and its future regulation. Here, I attempt to provide analytical guidance by defining the sharing economy as the practice that consumers grant each other temporary access to their under-utilized physical assets. Using this definition, the rise of the sharing economy can be understood as occurring at the intersection of three salient economic trends: peer-to-peer exchange, access over ownership and circular business models. I shortly discuss some of the environmental impacts of online sharing platforms and then articulate three possible futures of the sharing economy: a capitalist future cumulating in monopolistic super-platforms allowing for seamless services, a state-led future that shifts taxation from labour to capital and redistributes the gains of sharing from winners to losers, and a citizen-led future based on cooperatively owned platforms under democratic control. The nature and size of the social and environmental impacts are expected to differ greatly in each of the three scenarios.This article is part of the themed issue 'Material demand reduction'. © 2017 The Authors.
ERIC Educational Resources Information Center
Young, Mei-Lien
2012-01-01
In this research, we explore the impact on teachers of implementation of the Faculty Student Knowledge-Sharing Platform (FSKSP) in their college. Specifically, we focus on the effect on those teachers of the need to share publicly their knowledge and teaching material as the result of FSKSP implementation. In addition, we report the experience and…
U.S. Army Public Affairs Officers and Social Media Training Requirements
2016-06-10
[Fragmentary excerpt; recoverable content: the social media platforms discussed include Facebook, Twitter, LinkedIn, Google, YouTube, Pinterest, Instagram, Slideshare, Tumblr, and private messaging, together with survey ratings of tactics such as using two or more social or sharing media platforms in one campaign, print magazines, and crowdsourcing.]
A Qualitative Investigation on Patient Empowerment in Prostate Cancer
Renzi, Chiara; Fioretti, Chiara; Oliveri, Serena; Mazzocco, Ketti; Zerini, Dario; Alessandro, Ombretta; Rojas, Damaris P.; Jereczek-Fossa, Barbara A.; Pravettoni, Gabriella
2017-01-01
Purpose: Men with prostate cancer often describe low levels of empowerment. eHealth interventions may represent useful tools to deliver care and education and to meet patients' needs within an empowerment framework. In order to design a platform for cancer patients' empowerment within the H2020 iManageCancer project, the perspective of the target population for the platform was assessed. The present study aims to assess the qualitative experience of prostate cancer patients during treatment in order to provide insights for clinical practice with a particular focus on the design of a web platform to promote cancer patients' empowerment. Methods: Ten patients undergoing radiation therapy treatment took part in a semi-structured interview to explore different aspects of patient empowerment. Four main thematic areas were addressed: patient-healthcare providers' communication, decision-making, needs, and resources. A qualitative approach using thematic analysis was followed. Results: Half of the patients reported little to no possibility to share information and questions with healthcare providers. With regards to decision-making, the role of healthcare providers was perceived as directive/informative, but half of the patients perceived to assume an active role in at least one interaction. Difficulties and needs included the choice of the specialist or of the structure after diagnosis, clinicians' support in self-management, surgical consequences, and side effects, preparation for radiation therapy. Resources included family and social support both from a practical and from an emotional perspective, coping style, and work schedule management. Conclusions: These results suggest that relations with healthcare providers should be supported, especially immediately after diagnosis and after surgery. Support to self-management after surgery and at the beginning of radiation therapy treatment also constitutes a priority. The adoption of a personalized approach from the beginning of prostate cancer care flow may promote patient empowerment, overcoming the aforementioned needs and mobilizing resources. The social network represents an important resource that could be integrated in interventions. These considerations will be taken into account in the design of a cancer self-management platform aiming to increase patients' empowerment. PMID:28798701
Biomedical Informatics on the Cloud: A Treasure Hunt for Advancing Cardiovascular Medicine.
Ping, Peipei; Hermjakob, Henning; Polson, Jennifer S; Benos, Panagiotis V; Wang, Wei
2018-04-27
In the digital age of cardiovascular medicine, the rate of biomedical discovery can be greatly accelerated by the guidance and resources required to unearth potential collections of knowledge. A unified computational platform leverages metadata to not only provide direction but also empower researchers to mine a wealth of biomedical information and forge novel mechanistic insights. This review takes the opportunity to present an overview of the cloud-based computational environment, including the functional roles of metadata, the architecture schema of indexing and search, and the practical scenarios of machine learning-supported molecular signature extraction. By introducing several established resources and state-of-the-art workflows, we share with our readers a broadly defined informatics framework to phenotype cardiovascular health and disease. © 2018 American Heart Association, Inc.
Sharing Rare Attitudes Attracts.
Alves, Hans
2018-04-01
People like others who share their attitudes. Online dating platforms as well as other social media platforms regularly rely on the social bonding power of their users' shared attitudes. However, little is known about moderating variables. In the present work, I argue that sharing rare compared with sharing common attitudes should evoke stronger interpersonal attraction among people. In five studies, I tested this prediction for the case of shared interests from different domains. I found converging evidence that people's rare compared with their common interests are especially potent to elicit interpersonal attraction. I discuss the current framework's theoretical implications for impression formation and impression management as well as its practical implications for improving online dating services.
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 2 2012-07-01 2012-07-01 false If my platform is subject to the Platform Verification Program, what must I do? 250.911 Section 250.911 Mineral Resources BUREAU OF SAFETY AND... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.911 If my platform is subject...
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 2 2013-07-01 2013-07-01 false If my platform is subject to the Platform Verification Program, what must I do? 250.911 Section 250.911 Mineral Resources BUREAU OF SAFETY AND... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.911 If my platform is subject...
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 2 2014-07-01 2014-07-01 false If my platform is subject to the Platform Verification Program, what must I do? 250.911 Section 250.911 Mineral Resources BUREAU OF SAFETY AND... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.911 If my platform is subject...
NASA Astrophysics Data System (ADS)
Gil, Y.; Duffy, C.
2015-12-01
This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a pdf, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future landuse or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a-priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.
Autonomous self-organizing resource manager for multiple networked platforms
NASA Astrophysics Data System (ADS)
Smith, James F., III
2002-08-01
A fuzzy logic based expert system for resource management has been developed that automatically allocates electronic attack (EA) resources in real-time over many dissimilar autonomous naval platforms defending their group against attackers. The platforms can be very general, e.g., ships, planes, robots, land based facilities, etc. Potential foes the platforms deal with can also be general. This paper provides an overview of the resource manager including the four fuzzy decision trees that make up the resource manager; the fuzzy EA model; genetic algorithm based optimization; co-evolutionary data mining through gaming; and mathematical, computational and hardware based validation. Methods of automatically designing new multi-platform EA techniques are considered. The expert system runs on each defending platform rendering it an autonomous system requiring no human intervention. There is no commanding platform. Instead the platforms work cooperatively as a function of battlespace geometry; sensor data such as range, bearing, ID, uncertainty measures for sensor output; intelligence reports; etc. Computational experiments will show the defending networked platform's ability to self- organize. The platforms' ability to self-organize is illustrated through the output of the scenario generator, a software package that automates the underlying data mining problem and creates a computer movie of the platforms' interaction for evaluation.
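One ingredient of such a system, grading an input by fuzzy membership functions and combining the grades with a fuzzy AND, can be shown in a few lines. The variables, breakpoints, and min-combination rule below are assumptions for illustration and are not the resource manager's actual decision trees.

```python
# Toy fuzzy-logic grading; membership breakpoints and the min-combination are
# assumed values chosen only to demonstrate the mechanism.
def ramp_down(x, a, b):
    """Membership 1 below a, falling linearly to 0 at b."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)


def ramp_up(x, a, b):
    """Membership 0 below a, rising linearly to 1 at b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)


def threat_grade(range_km, closing_speed_ms):
    close_range = ramp_down(range_km, 10.0, 40.0)          # assumed "close" breakpoints
    fast_closing = ramp_up(closing_speed_ms, 50.0, 200.0)  # assumed "fast" breakpoints
    return min(close_range, fast_closing)                  # fuzzy AND as minimum


print(f"threat grade: {threat_grade(range_km=15.0, closing_speed_ms=180.0):.2f}")
```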
Resource integration and shared outcomes at the watershed scale
Eleanor S. Towns
2000-01-01
Shared resources are universal resources that are vital for sustaining communities, enhancing our quality of life and preserving ecosystem health. We have a shared responsibility to conserve shared resources and preserve their integrity for future generations. Resource integration is accomplished through ecosystem management, often at a watershed scale. The shared...
Applications integration in a hybrid cloud computing environment: modelling and platform
NASA Astrophysics Data System (ADS)
Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang
2013-08-01
With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds as well as their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds with intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.
Bioinformatics and Microarray Data Analysis on the Cloud.
Calabrese, Barbara; Cannataro, Mario
2016-01-01
High-throughput platforms such as microarray, mass spectrometry, and next-generation sequencing are producing an increasing volume of omics data that requires large data storage and computing power. Cloud computing offers massively scalable computing and storage, data sharing, and on-demand anytime and anywhere access to resources and applications, and thus it may represent the key technology for facing those issues. In fact, in recent years it has been adopted for the deployment of different bioinformatics solutions and services both in academia and in industry. Despite this, cloud computing presents several issues regarding the security and privacy of data that are particularly important when analyzing patient data, such as in personalized medicine. This chapter reviews the main academic and industrial cloud-based bioinformatics solutions, with a special focus on microarray data analysis, and underlines the main issues and problems related to the use of such platforms for the storage and analysis of patient data.
Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric
2014-01-29
Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.
Creating and Sharing Understanding: GEOSS and ArcGIS Online
NASA Astrophysics Data System (ADS)
White, C. E.; Hogeweg, M.; Foust, J.
2014-12-01
The GEOSS program brokers various forms of earth observation data and information via its online platform, the Discovery and Access Broker (DAB). The platform connects relevant information systems and infrastructures throughout the world. Esri and the National Research Council of Italy Institute of Atmospheric Pollution Research (CNR-IIA) are building two-way technology between the DAB framework and ArcGIS Online using the ArcGIS Online API. Developers will engineer Esri and DAB interfaces and build interoperable web services that connect the two systems. This collaboration makes GEOSS earth observation data and services available to the ArcGIS Online community, and ArcGIS Online a significant part of the GEOSS DAB infrastructure. ArcGIS Online subscribers can discover and access the resources published by GEOSS, use GEOSS data services, and build applications. Making GEOSS content available in ArcGIS Online increases opportunities for scientists in other communities to visualize information in greater context. Moreover, because the platform supports authoritative and crowd-sourced information, GEOSS members can build networks into other disciplines. This talk will discuss the power of interoperable service architectures that make such a collaboration possible, and the results thus far.
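Content shared publicly to ArcGIS Online can be discovered through its REST search endpoint, which gives a feel for the kind of two-way exchange described. The query string below is illustrative; whether a given GEOSS-brokered item appears depends on how it was published and shared.

```python
# Small sketch of item discovery via the public ArcGIS Online search REST
# endpoint; the query text is illustrative only.
import requests

resp = requests.get(
    "https://www.arcgis.com/sharing/rest/search",
    params={"q": "GEOSS earth observation", "f": "json", "num": 5},
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("results", []):
    print(f"{item.get('title')}  ({item.get('type')})")
```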
PsyGeNET: a knowledge platform on psychiatric disorders and their genes.
Gutiérrez-Sacristán, Alba; Grosdidier, Solène; Valverde, Olga; Torrens, Marta; Bravo, Àlex; Piñero, Janet; Sanz, Ferran; Furlong, Laura I
2015-09-15
PsyGeNET (Psychiatric disorders and Genes association NETwork) is a knowledge platform for the exploratory analysis of psychiatric diseases and their associated genes. PsyGeNET is composed of a database and a web interface supporting data search, visualization, filtering and sharing. PsyGeNET integrates information from DisGeNET and data extracted from the literature by text mining, which has been curated by domain experts. It currently contains 2642 associations between 1271 genes and 37 psychiatric disease concepts. In its first release, PsyGeNET is focused on three psychiatric disorders: major depression, alcohol and cocaine use disorders. PsyGeNET represents a comprehensive, open access resource for the analysis of the molecular mechanisms underpinning psychiatric disorders and their comorbidities. The PysGeNET platform is freely available at http://www.psygenet.org/. The PsyGeNET database is made available under the Open Database License (http://opendatacommons.org/licenses/odbl/1.0/). lfurlong@imim.es Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
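Exploring gene-disease associations exported from such a resource is straightforward with pandas. The file name and column names in the sketch below are assumptions for illustration; the actual export format should be checked on the PsyGeNET website.

```python
# Hedged sketch of summarizing an exported association table; the file name and
# the column names Disease_Name / Gene_Symbol are assumptions.
import pandas as pd

df = pd.read_csv("psygenet_associations.tsv", sep="\t")   # hypothetical export

# Count how many distinct genes are associated with each psychiatric disorder
counts = (
    df.groupby("Disease_Name")["Gene_Symbol"]              # assumed column names
      .nunique()
      .sort_values(ascending=False)
)
print(counts.head(10))
```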
ARMOUR - A Rice miRNA: mRNA Interaction Resource.
Sanan-Mishra, Neeti; Tripathi, Anita; Goswami, Kavita; Shukla, Rohit N; Vasudevan, Madavan; Goswami, Hitesh
2018-01-01
ARMOUR was developed as A Rice miRNA:mRNA interaction resource. This informative and interactive database includes the experimentally validated expression profiles of miRNAs under different developmental and abiotic stress conditions across seven Indian rice cultivars. This comprehensive database covers 689 known and 1664 predicted novel miRNAs and their expression profiles in more than 38 different tissues or conditions along with their predicted/known target transcripts. The understanding of miRNA:mRNA interactome in regulation of functional cellular machinery is supported by the sequence information of the mature and hairpin structures. ARMOUR provides flexibility to users in querying the database using multiple ways like known gene identifiers, gene ontology identifiers, KEGG identifiers and also allows on the fly fold change analysis and sequence search query with inbuilt BLAST algorithm. ARMOUR database provides a cohesive platform for novel and mature miRNAs and their expression in different experimental conditions and allows searching for their interacting mRNA targets, GO annotation and their involvement in various biological pathways. The ARMOUR database includes a provision for adding more experimental data from users, with an aim to develop it as a platform for sharing and comparing experimental data contributed by research groups working on rice.
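The "on the fly" fold-change analysis mentioned above boils down to a log2 ratio between conditions. The miRNA names and read counts in the sketch below are invented for illustration.

```python
# Illustrative fold-change calculation; the miRNA identifiers and normalized read
# counts are invented example values.
import math

control = {"osa-miR156a": 250.0, "osa-miR396e": 40.0}
stress  = {"osa-miR156a": 610.0, "osa-miR396e": 12.0}

for mirna in control:
    fc = math.log2(stress[mirna] / control[mirna])
    direction = "up" if fc > 0 else "down"
    print(f"{mirna}: log2 fold change {fc:+.2f} ({direction} under stress)")
```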
P2P proteomics -- data sharing for enhanced protein identification
2012-01-01
Background In order to tackle the important and challenging problem in proteomics of identifying known and new protein sequences using high-throughput methods, we propose a data-sharing platform that uses fully distributed P2P technologies to share specifications of peer-interaction protocols and service components. By using such a platform, information to be searched is no longer centralised in a few repositories but gathered from experiments in peer proteomics laboratories, which can subsequently be searched by fellow researchers. Methods The system distributively runs a data-sharing protocol specified in the Lightweight Communication Calculus underlying the system through which researchers interact via message passing. For this, researchers interact with the system through particular components that link to database querying systems based on BLAST and/or OMSSA and GUI-based visualisation environments. We have tested the proposed platform with data drawn from preexisting MS/MS data reservoirs from the 2006 ABRF (Association of Biomolecular Resource Facilities) test sample, which was extensively tested during the ABRF Proteomics Standards Research Group 2006 worldwide survey. In particular we have taken the data available from a subset of proteomics laboratories of Spain's National Institute for Proteomics, ProteoRed, a network for the coordination, integration and development of the Spanish proteomics facilities. Results and Discussion We performed queries against nine databases including seven ProteoRed proteomics laboratories, the NCBI Swiss-Prot database and the local database of the CSIC/UAB Proteomics Laboratory. A detailed analysis of the results indicated the presence of a protein that was supported by other NCBI matches and highly scored matches in several proteomics labs. The analysis clearly indicated that the protein was a relatively high concentrated contaminant that could be present in the ABRF sample. This fact is evident from the information that could be derived from the proposed P2P proteomics system, however it is not straightforward to arrive to the same conclusion by conventional means as it is difficult to discard organic contamination of samples. The actual presence of this contaminant was only stated after the ABRF study of all the identifications reported by the laboratories. PMID:22293032
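The fan-out/merge shape of the data-sharing protocol can be sketched as follows; the peer names, scores, and in-memory "databases" are invented purely to show the interaction pattern, whereas the real system exchanges messages that trigger BLAST/OMSSA searches at the remote laboratories.

```python
# Toy fan-out/merge sketch of the peer-interaction pattern; all data are invented.
from collections import defaultdict

# Each peer's "database" maps protein accessions to an identification score.
peer_databases = {
    "lab_A": {"P02768": 120.5, "P01024": 33.0},
    "lab_B": {"P02768": 98.2},
    "lab_C": {"P01024": 41.7, "Q9Y6K9": 15.3},
}


def query_peers(peers):
    """Fan a query out to every peer and merge the matches they return."""
    merged = defaultdict(list)
    for peer, database in peers.items():
        # In the real protocol this step is a message exchange that triggers a
        # search at the remote laboratory rather than a local dictionary scan.
        for accession, score in database.items():
            merged[accession].append((peer, score))
    return merged


matches = query_peers(peer_databases)
for accession, hits in sorted(matches.items(), key=lambda kv: -len(kv[1])):
    labs = ", ".join(peer for peer, _ in hits)
    print(f"{accession}: reported by {labs}")
```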
Allarakhia, Minna
2013-01-01
Repurposing has the objective of targeting existing drugs and failed, abandoned, or yet-to-be-pursued clinical candidates to new disease areas. The open-source model permits the sharing of data, resources, compounds, clinical molecules, small libraries, and screening platforms to cost-effectively advance old drugs and/or candidates into clinical re-development. Clearly, at the core of drug-repurposing activities is collaboration, in many cases progressing beyond the open sharing of resources, technology, and intellectual property, to the sharing of facilities and joint program development to foster drug-repurposing human-capacity development. A variety of initiatives under way for drug repurposing, including those targeting rare and neglected diseases, are discussed in this review and provide insight into the stakeholders engaged in drug-repurposing discovery, the models of collaboration used, the intellectual property-management policies crafted, and the human capacity developed. In the case of neglected tropical diseases, it is suggested that the development of human capital be a central aspect of drug-repurposing programs. Open-source models can support human-capital development through collaborative data generation, open compound access, open and collaborative screening, and preclinical and possibly clinical studies. Given the urgency of drug development for neglected tropical diseases, the review suggests that elements from current repurposing programs be extended to the neglected tropical diseases arena. PMID:23966771
NASA Astrophysics Data System (ADS)
Asirin, Asirin; Azhari, Danang
2018-05-01
The growth of population and the urban economy has increased the need for human mobility to support everyday activities. At the same time, online Information and Communication Technology (ICT) is growing rapidly and becoming more affordable. Within a few years, a number of sharing-economy businesses have formed around online platforms. This has brought about the emergence of ride-sharing business models that use online platforms and can be beneficial to sustainability. This research aims to explore one such ride-sharing business model and its impact on sustainability. It follows a case study method with a single case, Nebengers, and its scope is limited to several conceptual frameworks: the sharing-economy business model, the four elements of a business model for sustainability (BMfS), the Social Construction of Technology (SCoT), sustainable mobility and agency theory. Nebengers is a sharing-economy business built on an online platform whose history can be explained using SCoT theory. There are conflicts between the Nebengers entrepreneurs and the city government: Nebengers disrupts traditional and formal public transportation services managed by the government, yet it also contributes to the city government's goal of developing sustainable mobility. The future challenge is how to arrange a collaborative-governance ride-sharing business model for sustainability in Indonesian cities.
Fast and Furious (At Publishers): The Motivations behind Crowdsourced Research Sharing
ERIC Educational Resources Information Center
Gardner, Carolyn Caffrey; Gardner, Gabriel J.
2017-01-01
Crowdsourced research sharing takes place across social media platforms including Twitter hashtags such as #icanhazpdf, Reddit Scholar, and Facebook. This study surveys users of these peer-to-peer exchanges on demographic information, frequency of use, and their motivations in both providing and obtaining scholarly information on these platforms.…
A Shared Platform for Studying Second Language Acquisition
ERIC Educational Resources Information Center
MacWhinney, Brian
2017-01-01
The study of second language acquisition (SLA) can benefit from the same process of data sharing that has proven effective in areas such as first language acquisition and aphasiology. Researchers can work together to construct a shared platform that combines data from spoken and written corpora, online tutors, and Web-based experimentation. Many of…
76 FR 47469 - Structure and Practices of the Video Relay Service Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-05
... requirements that have not been approved by the Office of Management and Budget (OMB). The Federal... use, sharing of the ACD platform, or sharing the management of the ACD platform may give providers an... require certified iTRS providers to append to their annual reports any documentary evidence required for...
aGEM: an integrative system for analyzing spatial-temporal gene-expression information
Jiménez-Lozano, Natalia; Segura, Joan; Macías, José Ramón; Vega, Juanjo; Carazo, José María
2009-01-01
Motivation: The work presented here describes the ‘anatomical Gene-Expression Mapping (aGEM)’ Platform, a development conceived to integrate phenotypic information with the spatial and temporal distributions of genes expressed in the mouse. The aGEM Platform has been built by extending the Distributed Annotation System (DAS) protocol, which was originally designed to share genome annotations over the WWW. DAS is a client-server system in which a single client integrates information from multiple distributed servers. Results: The aGEM Platform provides information to answer three main questions. (i) Which genes are expressed in a given mouse anatomical component? (ii) In which mouse anatomical structures are a given gene or set of genes expressed? And (iii) is there any correlation among these findings? Currently, this Platform includes several well-known mouse resources (EMAGE, GXD and GENSAT), hosting gene-expression data mostly obtained from in situ techniques together with a broad set of image-derived annotations. Availability: The Platform is optimized for Firefox 3.0 and it is accessed through a friendly and intuitive display: http://agem.cnb.csic.es Contact: natalia@cnb.csic.es Supplementary information: Supplementary data are available at http://bioweb.cnb.csic.es/VisualOmics/aGEM/home.html and http://bioweb.cnb.csic.es/VisualOmics/index_VO.html and Bioinformatics online. PMID:19592395
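Because aGEM extends the DAS client-server protocol, a client-side query for the genes expressed in an anatomical component can be pictured as an HTTP request returning DAS-style FEATURE elements. The sketch below is a hedged illustration only: the endpoint path, query parameter, and response layout are assumptions patterned on generic DAS services, not the published aGEM API.

    import requests
    import xml.etree.ElementTree as ET

    # Hypothetical DAS-style request; the real aGEM endpoint and parameters may differ.
    BASE = "http://agem.cnb.csic.es/das/emage/features"
    resp = requests.get(BASE, params={"segment": "telencephalon"}, timeout=30)
    resp.raise_for_status()

    # Assumes a DAS-like response in which each annotation is a FEATURE element.
    root = ET.fromstring(resp.text)
    for feature in root.iter("FEATURE"):
        print(feature.get("id"), feature.get("label"))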
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model-sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support the sharing and integration of geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
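In a WPS-based sharing service of this kind, a client first discovers which model processes a server offers and what inputs they require before invoking them. The sketch below issues the standard OGC WPS 1.0.0 key-value requests with Python's requests library; the server URL and the process identifier WetlandHydrologyModel are hypothetical placeholders, not services from the study.

    import requests
    import xml.etree.ElementTree as ET

    # Hypothetical WPS endpoint; any OGC WPS 1.0.0 server accepts these KVP requests.
    WPS_URL = "http://example.org/wps"

    caps = requests.get(WPS_URL, params={
        "service": "WPS", "version": "1.0.0", "request": "GetCapabilities"}, timeout=30)
    caps.raise_for_status()

    ns = {"wps": "http://www.opengis.net/wps/1.0.0",
          "ows": "http://www.opengis.net/ows/1.1"}
    root = ET.fromstring(caps.content)
    for proc in root.findall(".//wps:Process", ns):
        ident = proc.find("ows:Identifier", ns)
        print("offered process:", ident.text if ident is not None else "?")

    # A DescribeProcess request then lists the inputs a shared model expects.
    desc = requests.get(WPS_URL, params={
        "service": "WPS", "version": "1.0.0",
        "request": "DescribeProcess", "identifier": "WetlandHydrologyModel"}, timeout=30)
    print(desc.status_code)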
NASA Astrophysics Data System (ADS)
Rajib, M. A.; Merwade, V.; Song, C.; Zhao, L.; Kim, I. L.; Zhe, S.
2014-12-01
Setting up any hydrologic model requires a large amount of effort, including compiling the data, creating input files, and performing calibration and validation. Given the effort involved, it is possible that models for a watershed are created multiple times by multiple groups or organizations to accomplish different research, educational or policy goals. To reduce this duplication of effort and enable collaboration among different groups or organizations around an already existing hydrology model, a platform is needed where anyone can search for existing models, perform simple scenario analysis and visualize model results. The creator and users of a model on such a platform can then collaborate to accomplish new research or educational objectives. From this perspective, a prototype cyber-infrastructure (CI), called SWATShare, is developed for sharing, running and visualizing Soil and Water Assessment Tool (SWAT) models in an interactive GIS-enabled web environment. Users can utilize SWATShare to publish or upload their own models, search and download existing SWAT models developed by others, and run simulations, including calibration, using high-performance resources provided by XSEDE and the cloud. Besides running and sharing, SWATShare hosts a novel spatio-temporal visualization system for SWAT model outputs. At the temporal scale, the system creates time-series plots for all the hydrology and water quality variables available at the reach as well as at the watershed level. At the spatial scale, the system can dynamically generate sub-basin level thematic maps for any variable at any user-defined date or date range, thereby allowing users to run animations or download the data for subsequent analyses. In addition to research, SWATShare can also be used within a classroom setting as an educational tool for modeling and comparing the hydrologic processes under different geographic and climatic settings. SWATShare is publicly available at https://www.water-hub.org/swatshare.
Fernandes, Pedro; Gevaert, Kris; Rothacker, Julie; Saiyed, Taslimarif; Detwiler, Michelle
2012-01-01
This roundtable will feature four international speakers who will discuss national and international collaborative initiatives and outreach efforts in which they participate. They will share how these efforts have facilitated access to cutting-edge technology, fostered new generations of scientists, and ultimately advanced the progression of global scientific research. Open discussion will follow the presentations! Centre for Cellular and Molecular Platforms, National Centre for Biological Sciences, India: experiences in implementing a national high-end core facility organization with the goals of improving regional technology access and enhancing the quality of research for scientists in academia, biotechnology companies, and the biopharmaceutical industry. Monash University Technology Platforms and Broader Victorian and Australian Networks: Australian initiatives to build global research capabilities and identify means to internationally benchmark regional capabilities to ensure delivery of world class infrastructure. Within the context of the current Australian strategic framework, funding considerations will be discussed, along with expectations for partner facilities to collaborate and be fully accessible to academia and industry. Instituto Gulbenkian de Ciencia, Portugal and beyond: Multiple roles of networking in science and extending outreach while consolidating community integration. Discussion will include achievement of community building and integration using concepts of sharing, training, resource availability, and the value and empowerment gained using acquired skills. The role of networking and institutional visibility will also be discussed. PRIME-XS: This EU-funded consortium provides an infrastructure of proteomics technologies to the European research community. The core is formed by six access facilities through which the consortium provides access to their technologies. Twelve partners work together to develop new resources to aid the community including the development of bioinformatic tools to analyze large-scale proteomics data and novel technologies to analyze protein interaction networks, post-translational modifications and more sensitive ways to detect protein and peptide biomarkers in complex samples.
Rogers Van Katwyk, Susan; Jones, Sara L; Hoffman, Steven J
2018-02-05
Antimicrobial resistance is an important global issue facing society. Healthcare workers need to be engaged in solving this problem, as advocates for rational antimicrobial use, stewards of sustainable effectiveness, and educators of their patients. To fulfill this role, healthcare workers need access to training and educational resources on antimicrobial resistance. To better understand the resources available to healthcare workers, we undertook a global environmental scan of educational programs and resources targeting healthcare workers on the topic of antimicrobial resistance and antimicrobial stewardship. Programs were identified through contact with key experts, web searching, and academic literature searching. We summarized programs in tabular form, including participating organizations, region, and intended audience. We developed a coding system to classify programs by program type and participating organization type, assigning multiple codes as necessary and creating summary charts for program types, organization types, and intended audience to illustrate the breadth of available resources. We identified 94 educational initiatives related to antimicrobial resistance and antimicrobial stewardship, which represent a diverse array of programs including courses, workshops, conferences, guidelines, public outreach materials, and online-resource websites. These resources were developed by a combination of government bodies, professional societies, universities, non-profit and community organizations, hospitals and healthcare centers, and insurance companies and industry. Most programs either targeted healthcare workers collectively or specifically targeted physicians. A smaller number of programs were aimed at other healthcare worker groups including pharmacists, nurses, midwives, and healthcare students. Our environmental scan shows that there are many organizations working to develop and share educational resources for healthcare workers on antimicrobial resistance and antimicrobial stewardship. Governments, hospitals, and professional societies appear to be driving action on this front, sometimes working with other types of organizations. A broad range of resources have been made freely available; however, we have noted several opportunities for action, including increased engagement with students, improvements to pre-service education, recognition of antimicrobial resistance courses as continuing medical education, and better platforms for resource-sharing online.
Halban, P A; Boulton, A J M; Smith, U
2013-03-01
Today, European biomedical and health-related research is insufficiently well funded and is fragmented, with no common vision, less-than-optimal sharing of resources, and inadequate support and training in clinical research. Improvements to the competitiveness of European biomedical research will depend on the creation of new infrastructures that must be dynamic and free of bureaucracy, involve all stakeholders and facilitate faster delivery of new discoveries from bench to bedside. Taking diabetes research as the model, a new paradigm for European biomedical research is presented, which offers improved co-ordination and common resources that will benefit both academic and industrial clinical research. This includes the creation of a European Council for Health Research, first proposed by the Alliance for Biomedical Research in Europe, which will bring together and consult with all health stakeholders to develop strategic and multidisciplinary research programmes addressing the full innovation cycle. A European Platform for Clinical Research in Diabetes is proposed by the Alliance for European Diabetes Research (EURADIA) in response to the special challenges and opportunities presented by research across the European region, with the need for common standards and shared expertise and data.
MIRASS: medical informatics research activity support system using information mashup network.
Kiah, M L M; Zaidan, B B; Zaidan, A A; Nabi, Mohamed; Ibraheem, Rabiu
2014-04-01
The advancement of information technology has facilitated the automation and feasibility of online information sharing. The second generation of the World Wide Web (Web 2.0) enables the collaboration and sharing of online information through Web-serving applications. Data mashup, which is considered a Web 2.0 platform, plays an important role in information and communication technology applications. However, few of these ideas have been carried over into the education and research domains, particularly medical informatics. The creation of a friendly environment for medical informatics research requires the removal of certain obstacles in terms of search time, resource credibility, and search result accuracy. This paper considers three glitches that researchers encounter in medical informatics research: the quality of papers obtained from scientific search engines (particularly Web of Science and Science Direct), the quality of articles from the indices of these search engines, and the customizability and flexibility of these search engines. A customizable search engine for trusted resources of medical informatics was developed and implemented through data mashup. Results show that the proposed search engine improves the usability of scientific search engines for medical informatics. The Pipe search engine was found to be more efficient than the other engines.
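As an illustration of the data-mashup idea, a customizable search layer can aggregate feeds from trusted sources and filter them against a researcher's keywords before presenting results. The sketch below does this with the third-party feedparser library; the feed URLs and keywords are hypothetical, and the actual MIRASS pipeline differs from this simplification.

    import feedparser  # third-party package: pip install feedparser

    # Hypothetical feeds from trusted medical-informatics sources; real URLs will differ.
    FEEDS = [
        "https://example.org/webofscience/medical-informatics.rss",
        "https://example.org/sciencedirect/health-informatics.rss",
    ]
    KEYWORDS = ("ontology", "interoperability", "EHR")

    def mashup(feeds, keywords):
        # Collect entries whose titles match any keyword, across all trusted feeds.
        hits = []
        for url in feeds:
            parsed = feedparser.parse(url)
            for entry in parsed.entries:
                title = entry.get("title", "")
                if any(k.lower() in title.lower() for k in keywords):
                    hits.append((title, entry.get("link", "")))
        return hits

    for title, link in mashup(FEEDS, KEYWORDS):
        print(title, "->", link)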
Implementation of a Web-Based Collaborative Process Planning System
NASA Astrophysics Data System (ADS)
Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi
Under the networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining and assembling, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be finished collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, the shared resources provided by cooperating enterprises in the course of collaboration are classified into seven classes. A reconfigurable and extendable resource object model is then built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.
NASA Astrophysics Data System (ADS)
Johnson, R. M.; Herrold, A.; Holzer, M. A.; Passow, M. J.
2010-12-01
The geoscience research and education community is interested in developing scalable and effective user-friendly strategies for reaching the public, students and educators with information about the Earth and space sciences. Based on experience developed over the past decade with education and outreach programs seeking to reach these populations, there is a growing consensus that this will be best achieved through collaboration, leveraging the resources and networks already in existence. While it is clear that gifted researchers and developers can create wonderful online educational resources, many programs have been stymied by the difficulty of attracting an audience to these resources. The National Earth Science Teachers Association (NESTA) has undertaken an exciting new project, with support from the William and Flora Hewlett Foundation, that provides a new platform for the geoscience education and research community to share their research, resources, programs, products and services with a wider audience. In April 2010, the Windows to the Universe project (http://windows2universe.org) moved from the University Corporation for Atmospheric Research to NESTA. Windows to the Universe, which started in 1995 at the University of Michigan, is one of the most popular Earth and space science education websites globally, with over 16 million visits annually. The objective of this move is to develop a suite of new opportunities and capabilities on the website that will allow it to become a sustainable education and outreach platform for the geoscience research and education community hosting open educational resources. This presentation will provide an update on our progress, highlighting our new strategies, synergies with community needs, and opportunities for collaboration.
30 CFR 250.609 - Well-workover structures on fixed platforms.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Well-workover structures on fixed platforms. 250.609 Section 250.609 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR... consideration the corrosion protection, age of the platform, and previous stresses to the platform. ...
Vest, Joshua R; Shah, Gulzar H
2012-11-01
Resource sharing, arrangements between local health departments (LHDs) for joint programs or to share staff, is a growing occurrence. The post-9/11 influx of federal funding and new public health preparedness responsibilities dramatically increased the occurrence of these inter-LHD relationships, and several states have pursued more intrastate collaboration. This article describes the current state of resource sharing among LHDs and identifies the factors associated with resource sharing. Using the National Association of County & City Health Officials' 2010 Profile Survey, we determined the self-reported number of shared programmatic activities and the number of shared organizational functions for a sample of LHDs. Negative binomial regression models described the relationships between factors suggested by interorganizational theory and the counts of sharing activities. We examined the extent of resource sharing using 2 different count variables: (1) number of shared programmatic activities and (2) number of shared organizational functions. About one-half of all LHDs are engaged in resource sharing. The extent of sharing was lower for those serving larger populations, with city jurisdictions, or of larger size. Sharing was more extensive for state-governed LHDs, those covering multiple jurisdictions, states with centralized governance, and in instances of financial constraint. Many LHDs engage in resource sharing to a greater extent than others. Leaders of LHDs can work within the context of these factors to leverage resource sharing to meet their organizational needs.
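A count outcome such as the number of shared programmatic activities is commonly modeled with negative binomial regression. The sketch below shows one way to fit such a model with statsmodels; the tiny data frame is a hypothetical stand-in for the Profile Survey variables, not the study data.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical LHD records; column names mirror the factors described above.
    df = pd.DataFrame({
        "shared_programs":    [0, 2, 5, 1, 7, 3, 0, 4],   # count outcome
        "log_population":     [9.2, 10.1, 11.5, 9.8, 12.0, 10.7, 9.5, 11.1],
        "state_governed":     [0, 1, 1, 0, 1, 0, 0, 1],
        "multi_jurisdiction": [0, 0, 1, 0, 1, 1, 0, 1],
    })
    X = sm.add_constant(df[["log_population", "state_governed", "multi_jurisdiction"]])
    model = sm.GLM(df["shared_programs"], X, family=sm.families.NegativeBinomial())
    result = model.fit()
    print(result.summary())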
An Interprofessional Web-Based Resource for Health Professions Preceptors
McLeod, Elizabeth; Kwong, Mona; Tidball, Glynnis; Collins, John; Neufeld, Lois; Drynan, Donna
2012-01-01
Objective. To develop a Web-based preceptor education resource for healthcare professionals and evaluate its usefulness. Methods. Using an open source platform, 8 online modules called “E-tips for Practice Education” (E-tips) were developed that focused on topics identified as relevant across healthcare disciplines. A cross-sectional survey design was used to evaluate the online resource. Ninety preceptors from 10 health disciplines affiliated with the University of British Columbia evaluated the E-tips. Results. The modules were well received by preceptors, with all participants indicating that they would recommend these modules to their colleagues, over 80% indicating the modules were very to extremely applicable, and over 60% indicating that E-tips had increased their confidence in their ability to teach. Conclusion. Participants reported E-tips to be highly applicable to their teaching role as preceptors. Given their multidisciplinary focus, these modules address a shared language and ideas about clinical teaching among those working in multi-disciplinary settings. PMID:23193332
NASA Astrophysics Data System (ADS)
Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.
2015-12-01
Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al, J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage and enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development and redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source licensing to facilitate community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub - TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven datasets, e.g. digital elevation models of topography, satellite imagery, and field observations of deposits. These data are often maintained in private repositories that are privately shared by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adopted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for uses such as uncertainty quantification for hazard analysis using physical models.
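Uncertainty quantification with large ensembles, as described above, typically starts from a designed sample of uncertain model inputs, with one model run per sample point. The sketch below draws a simple random ensemble with NumPy; the parameter names and ranges (flow volume, basal friction) are assumed values chosen only to illustrate the pattern for a TITAN2D-style study, not values from the project.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical uncertain inputs for a geophysical flow model:
    # flow volume in m^3 and basal friction angle in degrees, with assumed ranges.
    n_runs = 256
    volumes = rng.uniform(1e5, 1e7, size=n_runs)
    frictions = rng.uniform(10.0, 25.0, size=n_runs)

    ensemble = np.column_stack([volumes, frictions])
    # Each row would be written to an input file and submitted as one job in the workflow.
    print(ensemble[:3])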
30 CFR 250.509 - Well-completion structures on fixed platforms.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Well-completion structures on fixed platforms. 250.509 Section 250.509 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR... consideration the corrosion protection, age of platform, and previous stresses to the platform. [53 FR 10690...
Intelligent support of e-management for consumer-focused virtual enterprises
NASA Astrophysics Data System (ADS)
Chandra, Charu; Smirnov, Alexander V.
2000-10-01
Interest in the decision-making problem of consumer-focused virtual enterprises (VE) is growing fast. The purpose of this type of enterprise is to transform incomplete information about customer orders and available resources into co-ordinated plans for production and replenishment of goods and services in the temporal network formed by collaborating units. This implies that information in the consumer-focused VE can be shared via Internet, Intranet, and Extranet for business-to-consumer (B2C), business-to-business service (B2B-S), and business-to-business goods (B2B-G) transactions. One of the goals of Internet-based management (e-management) is to facilitate the transfer and sharing of data and knowledge in the context of enterprise collaboration. This paper discusses a generic framework of e-management that integrates intelligent information support, group decision making, and agreement modeling for a VE network. It offers a platform for the design and modeling of diverse implementation strategies related to the type of agreement, optimization policies, decision-making strategies, organization structures, information-sharing strategies and mechanisms, and business policies for the VE.
Advancing the Implementation of Hydrologic Models as Web-based Applications
NASA Astrophysics Data System (ADS)
Dahal, P.; Tarboton, D. G.; Castronova, A. M.
2017-12-01
Advanced computer simulations are required to understand hydrologic phenomena such as rainfall-runoff response, groundwater hydrology, and snow hydrology. Building a hydrologic model instance to simulate a watershed requires investment in data (diverse geospatial datasets such as terrain and soil) and computer resources, typically demands a wide skill set from the analyst, and involves a workflow that is often difficult to reproduce. This work introduces a prototype infrastructure in the form of a web application that provides researchers with easy-to-use access to complete hydrological modeling functionality. This includes creating the necessary geospatial and forcing data, preparing input files for a model by applying complex data preprocessing, running the model for a user-defined watershed, and saving the results to a web repository. The open source Tethys Platform was used to develop the web app front-end Graphical User Interface (GUI). We used HydroDS, a web service that provides data preparation and processing capabilities to support the back-end computations used by the app. Results are saved in HydroShare, a hydrologic information system that supports the sharing of hydrologic data, models and analysis tools. The TOPographic Kinematic APproximation and Integration (TOPKAPI) model served as the example for which we developed a complete hydrologic modeling service to demonstrate the approach. The final product is a complete modeling system, accessible through the web, to create input files and run the TOPKAPI hydrologic model for a watershed of interest. We are investigating similar functionality for the preparation of input to the Regional Hydro-Ecological Simulation System (RHESSys). Key Words: hydrologic modeling, web services, hydrologic information system, HydroShare, HydroDS, Tethys Platform
Halewood, Michael; Chiurugwi, Tinashe; Sackville Hamilton, Ruaraidh; Kurtz, Brad; Marden, Emily; Welch, Eric; Michiels, Frank; Mozafari, Javad; Sabran, Muhamad; Patron, Nicola; Kersey, Paul; Bastow, Ruth; Dorius, Shawn; Dias, Sonia; McCouch, Susan; Powell, Wayne
2018-03-01
SUMMARY: Over the last decade, there has been an ongoing revolution in the exploration, manipulation and synthesis of biological systems, through the development of new technologies that generate, analyse and exploit big data. Users of Plant Genetic Resources (PGR) can potentially leverage these capacities to significantly increase the efficiency and effectiveness of their efforts to conserve, discover and utilise novel qualities in PGR, and help achieve the Sustainable Development Goals (SDGs). This review advances the discussion on these emerging opportunities and discusses how taking advantage of them will require data integration and synthesis across disciplinary, organisational and international boundaries, and the formation of multi-disciplinary, international partnerships. We explore some of the institutional and policy challenges that these efforts will face, particularly how these new technologies may influence the structure and role of research for sustainable development, ownership of resources, and access and benefit sharing. We discuss potential responses to political and institutional challenges, ranging from options for enhanced structure and governance of research discovery platforms to internationally brokered benefit-sharing agreements, and identify a set of broad principles that could guide the global community as it seeks or considers solutions.
Shah, Gulzar H; Badana, Adrian N S; Robb, Claire; Livingood, William C
2016-01-01
Local health departments (LHDs) are striving to meet public health needs within their jurisdictions amidst fiscal restraints and a complex, dynamic environment. Resource sharing across jurisdictions is a critical opportunity for LHDs to continue to enhance effectiveness and increase efficiency. This research examines the extent of cross-jurisdictional resource sharing among LHDs, the programmatic areas and organizational functions for which LHDs share resources, and the LHD characteristics associated with resource sharing. Data from the National Association of County & City Health Officials' 2013 National Profile of LHDs were used. Descriptive statistics and multinomial logistic regression were performed for the 5 implementation-oriented outcome variables of interest, each with 3 levels of implementation. More than 54% of LHDs shared resources such as funding, staff, or equipment with 1 or more other LHDs on a continuous, recurring basis. Results from the multinomial regression analysis indicate that economies of scale (population size and metropolitan status) had significant positive influences (at P ≤ .05) on resource sharing. Engagement in accreditation, community health assessment, community health improvement planning, quality improvement, and use of the Community Guide were associated with lower levels of engagement in resource sharing. A doctoral degree held by the top executive and having 1 or more local boards of health carried a positive influence on resource sharing. Cross-jurisdictional resource sharing is a viable and commonly used process for overcoming the challenges of new and emerging public health problems within the constraints of restricted budgets. LHDs, particularly smaller LHDs with limited resources, should consider increased resource sharing to address emerging challenges.
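A multinomial logistic regression of a 3-level sharing outcome on LHD characteristics can be set up in Python as sketched below; the simulated data and variable names are hypothetical stand-ins for the Profile study variables and are intended only to show the model setup with statsmodels.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 300
    # Hypothetical LHD covariates (names mirror the factors discussed above).
    log_population = rng.normal(10.5, 1.0, n)
    metropolitan = rng.integers(0, 2, n)
    accredited = rng.integers(0, 2, n)
    # Simulated 3-level outcome: 0 = no sharing, 1 = some sharing, 2 = continuous sharing.
    score = 0.8 * (log_population - 10.5) - 0.5 * metropolitan + rng.normal(0, 1, n)
    sharing_level = np.digitize(score, [-0.5, 0.5])

    X = sm.add_constant(pd.DataFrame({
        "log_population": log_population,
        "metropolitan": metropolitan,
        "accredited": accredited,
    }))
    result = sm.MNLogit(sharing_level, X).fit(disp=False)
    print(result.summary())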
NASA Astrophysics Data System (ADS)
Changyong, Dou; Huadong, Guo; Chunming, Han; Ming, Liu
2014-03-01
With more and more Earth observation data available to the community, how to manage and share these valuable remote sensing datasets is becoming an urgent issue. Web-based Geographical Information System (GIS) technology provides a convenient way for users in different locations to share and make use of the same dataset. In order to efficiently use the airborne Synthetic Aperture Radar (SAR) remote sensing data acquired by the Airborne Remote Sensing Center of the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), a Web-GIS based platform for airborne SAR data management, distribution and sharing was designed and developed. The major features of the system include a map-based navigation and search interface, full-resolution imagery displayed as an overlay on the map, and the exclusive use of Open Source Software (OSS). The functions of the platform include browsing imagery in the map-based navigation interface, ordering and downloading data online, and image dataset and user management. At present, the system is under testing at RADI and will enter regular operation soon.
Virtual Labs (Science Gateways) as platforms for Free and Open Source Science
NASA Astrophysics Data System (ADS)
Lescinsky, David; Car, Nicholas; Fraser, Ryan; Friedrich, Carsten; Kemp, Carina; Squire, Geoffrey
2016-04-01
The Free and Open Source Software (FOSS) movement promotes community engagement in software development, as well as provides access to a range of sophisticated technologies that would be prohibitively expensive if obtained commercially. However, as geoinformatics and eResearch tools and services become more dispersed, it becomes more complicated to identify and interface between the many required components. Virtual Laboratories (VLs, also known as Science Gateways) simplify the management and coordination of these components by providing a platform linking many, if not all, of the steps in particular scientific processes. These enable scientists to focus on their science, rather than the underlying supporting technologies. We describe a modular, open source, VL infrastructure that can be reconfigured to create VLs for a wide range of disciplines. Development of this infrastructure has been led by CSIRO in collaboration with Geoscience Australia and the National Computational Infrastructure (NCI) with support from the National eResearch Collaboration Tools and Resources (NeCTAR) and the Australian National Data Service (ANDS). Initially, the infrastructure was developed to support the Virtual Geophysical Laboratory (VGL), and has subsequently been repurposed to create the Virtual Hazards Impact and Risk Laboratory (VHIRL) and the reconfigured Australian National Virtual Geophysics Laboratory (ANVGL). During each step of development, new capabilities and services have been added and/or enhanced. We plan on continuing to follow this model using a shared, community code base. The VL platform facilitates transparent and reproducible science by providing access to both the data and methodologies used during scientific investigations. This is further enhanced by the ability to set up and run investigations using computational resources accessed through the VL. Data is accessed using registries pointing to catalogues within public data repositories (notably including the NCI National Environmental Research Data Interoperability Platform), or by uploading data directly from user supplied addresses or files. Similarly, scientific software is accessed through registries pointing to software repositories (e.g., GitHub). Runs are configured by using or modifying default templates designed by subject matter experts. After the appropriate computational resources are identified by the user, Virtual Machines (VMs) are spun up and jobs are submitted to service providers (currently the NeCTAR public cloud or Amazon Web Services). Following completion of the jobs the results can be reviewed and downloaded if desired. By providing a unified platform for science, the VL infrastructure enables sophisticated provenance capture and management. The source of input data (including both collection and queries), user information, software information (version and configuration details) and output information are all captured and managed as a VL resource which can be linked to output data sets. This provenance resource provides a mechanism for publication and citation for Free and Open Source Science.
NASA Astrophysics Data System (ADS)
Colomo-Palacios, Ricardo; Jiménez-López, Diego; García-Crespo, Ángel; Blanco-Iglesias, Borja
eLearning processes are a challenge for educational institutions and education professionals. In an environment in which learning resources are being produced, catalogued and stored in innovative ways, SOLE provides a platform in which exam questions can be produced with the support of Web 2.0 tools, catalogued and labeled via the semantic web, and stored and distributed using eLearning standards. This paper presents SOLE, a social network for exam-question sharing particularized for the Software Engineering domain, based on semantics and built using semantic web and eLearning standards such as the IMS Question and Test Interoperability specification 2.1.
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.
The Microbial Resource Research Infrastructure MIRRI: Strength through Coordination
Stackebrandt, Erko; Schüngel, Manuela; Martin, Dunja; Smith, David
2015-01-01
Microbial resources have been recognized as essential raw materials for the advancement of health and later for biotechnology, agriculture, food technology and for research in the life sciences, as their enormous abundance and diversity offer an unparalleled source of unexplored solutions. Microbial domain biological resource centres (mBRC) provide live cultures and associated data to foster and support the development of basic and applied science in countries worldwide and especially in Europe, where the density of highly advanced mBRCs is high. The not-for-profit and distributed project MIRRI (Microbial Resource Research Infrastructure) aims to coordinate access to hitherto individually managed resources by developing a pan-European platform which takes the interoperability and accessibility of resources and data to a higher level. Providing a wealth of additional information and linking to datasets such as literature, environmental data, sequences and chemistry will enable researchers to select organisms suitable for their research and enable innovative solutions to be developed. The current independent policies and managed processes will be adapted by partner mBRCs to harmonize holdings, services, training, and accession policy and to share expertise. The infrastructure will improve access to enhanced quality microorganisms in an appropriate legal framework and to resource-associated data in a more interoperable way. PMID:27682123
NASA Astrophysics Data System (ADS)
Klump, Jens; Fraser, Ryan; Wyborn, Lesley; Friedrich, Carsten; Squire, Geoffrey; Barker, Michelle; Moloney, Glenn
2017-04-01
The researcher of today is likely to be part of a team distributed over multiple sites that will access data from an external repository and then process the data on a public or private cloud, or even on a large centralised supercomputer. They are increasingly likely to use a mixture of their own code, third-party software and libraries, or even global community codes. These components will be connected into Virtual Research Environments (VREs) that enable members of the research team who are not co-located to actively work together at various scales to share data, models, tools, software, workflows, best practices, infrastructures, etc. Many VREs are built in isolation: designed to meet a specific research program, with components tightly coupled and not capable of being repurposed for other use cases - they are becoming 'stovepipes'. The limited number of users of some VREs also means that the cost of maintenance per researcher can be unacceptably high. The alternative is to develop service-oriented Science Platforms that enable multiple communities to develop specialised solutions for specific research programs. The platforms can offer access to data, software tools and processing infrastructures (cloud, supercomputers) through globally distributed, interconnected modules. In Australia, the Virtual Geophysics Laboratory (VGL) was initially built to give a specific set of researchers in government agencies access to specific data sets and a limited number of tools; it is now rapidly evolving into a multi-purpose Earth science platform with access to an increased variety of data, a broader range of tools, users from more sectors and a diversity of computational infrastructures. The expansion has been relatively easy because of the architecture, whereby data, tools and compute resources are loosely coupled via interfaces that are built on international standards and accessed as services wherever possible. In recent years, investments in the discoverability and accessibility of data via online services in Australia mean that data resources can be easily added to the virtual environments as and when required. Another key to increasing the reusability and uptake of a VRE is the capability to capture workflows so that they can be reused and repurposed both within and beyond the community that defined the original use case. Unfortunately, Software-as-a-Service in the research sector is not yet mature. In response, we developed a Scientific Software Solutions Center (SSSC) that enables researchers to discover, deploy and then share computational codes, code snippets or processes in both a human- and machine-readable manner. Growth has come not only from within the Earth science community but also from the Australian Virtual Laboratory community, which is building VREs for a diversity of communities such as astronomy, genomics, environment, humanities and climate. Components such as access control, provenance, visualisation and accounting are common to all scientific domains, and sharing these across multiple domains reduces costs but, more importantly, increases the ability to undertake interdisciplinary science. These efforts are transitioning VREs to more sustainable service-oriented Science Platforms that can be delivered in an agile, adaptable manner for broader community interests.
NASA Astrophysics Data System (ADS)
Sarkar, A.; Koohikamali, M.; Pick, J. B.
2017-10-01
In recent years, disruptive innovation by peer-to-peer platforms in a variety of industries, notably transportation and hospitality, has altered the way individuals consume everyday essential services. With growth in sharing economy platforms such as Uber for ridesharing and Airbnb for short-term accommodations, interest in examining spatiotemporal patterns of participation in the sharing economy by suppliers and consumers is increasing. This research is motivated by key questions: who are the sharing economy workers, where are they located, and does their location influence their participation in the sharing economy? This paper is the first systematic effort to analyze spatiotemporal patterns of participation by hosts in the shared-accommodation economy. Using three different kinds of shared accommodations listed over a 3-year period on the popular short-term accommodation platform Airbnb, we examine spatiotemporal dimensions of host participation in a major U.S. market, Los Angeles, CA. The paper also develops a conceptual model that relates demographic, socioeconomic, occupational, and social capital attributes of hosts, along with their attitudes toward trust and greener consumption, to hosts' participation in a shared accommodation market. Results confirm that host participation is influenced by the young dependency ratio, the potential for supplemental income, the sustainability potential of collaborative consumption, and employment in finance, insurance, and real estate occupations, but not so much by trust in our overall study area. These results add new insights to the limited prior knowledge about sharing economy workers and have policy implications.
Sharing health data in Belgium: A home care case study using the Vitalink platform.
De Backere, Femke; Bonte, Pieter; Verstichel, Stijn; Ongenae, Femke; De Turck, Filip
2018-01-01
In 2013, the Flemish Government launched the Vitalink platform. This initiative focuses on the sharing of health and welfare data to support primary healthcare. In this paper, the objectives and mission of the Vitalink initiative are discussed. Security and privacy measures are reviewed, and the technical implementation of the Vitalink platform is presented. Through a case study, the possibility of interaction with cloud solutions for healthcare is also investigated; this was initially not the focus of Vitalink. The Vitalink initiative provides support for secure data sharing in primary healthcare, which in the long term will improve the efficiency of care and decrease costs. The case study showed that Vitalink allowed cloud solutions or applications that do not provide end-to-end security to use its system. The most important lesson learned during this research was the need for firm regulations and stipulations for cloud solutions interacting with the Vitalink platform; these are currently still vague.
Infrastructure-Less Communication Platform for Off-The-Shelf Android Smartphones.
Oide, Takuma; Abe, Toru; Suganuma, Takuo
2018-03-04
As smartphones and other small portable devices become more sophisticated and popular, opportunities for communication and information sharing among such device users have increased. In particular, since it is known that infrastructure-less device-to-device (D2D) communication platforms consisting only of such devices are excellent in terms of, for example, bandwidth efficiency, efforts are being made to merge their information sharing capabilities with conventional infrastructure. However, efficient multi-hop communication is difficult with the D2D communication protocol, and many conventional D2D communication platforms require modifications of the protocol and terminal operating systems (OSs). In response to these issues, this paper reports on a proposed tree-structured D2D communication platform for Android devices that combines Wi-Fi Direct and Wi-Fi functions. The proposed platform, which is expected to be used with general Android 4.0 (or higher) OS equipped terminals, makes it possible to construct an ad hoc network instantaneously without sharing prior knowledge among participating devices. We will show the feasibility of our proposed platform through its design and demonstrate the implementation of a prototype using real devices. In addition, we will report on our investigation into communication delays and stability based on the number of hops and on terminal performance through confirmation experiments.
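In a tree-structured D2D topology of the kind proposed above, every device can reach every other by relaying messages up toward a common ancestor and back down, so the hop count between two devices follows directly from their positions in the tree. The sketch below is a simplified, hypothetical model of that routing (device names and topology invented for illustration; it is not the platform's Android implementation).

    # Minimal sketch of hop counting in a tree-structured D2D topology: each node
    # knows only its parent (as a group member knows its group owner); a message
    # travels up to the first common ancestor and back down.
    parents = {
        "root_owner": None,
        "phone_A": "root_owner",
        "phone_B": "root_owner",
        "sub_owner": "phone_B",   # bridges a second Wi-Fi Direct group over legacy Wi-Fi
        "phone_C": "sub_owner",
    }

    def path_to_root(node):
        path = []
        while node is not None:
            path.append(node)
            node = parents[node]
        return path

    def hop_count(src, dst):
        src_path, dst_path = path_to_root(src), path_to_root(dst)
        common = set(src_path) & set(dst_path)
        # First shared ancestor encountered on the way up from the source.
        ancestor = next(n for n in src_path if n in common)
        return src_path.index(ancestor) + dst_path.index(ancestor)

    # 4 hops: phone_A -> root_owner -> phone_B -> sub_owner -> phone_C
    print(hop_count("phone_A", "phone_C"))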
U-Compare: share and compare text mining tools with UIMA.
Kano, Yoshinobu; Baumgartner, William A; McCrohon, Luke; Ananiadou, Sophia; Cohen, K Bretonnel; Hunter, Lawrence; Tsujii, Jun'ichi
2009-08-01
Due to the increasing number of text mining resources (tools and corpora) available to biologists, interoperability issues between these resources are becoming significant obstacles to using them effectively. UIMA, the Unstructured Information Management Architecture, is an open framework designed to aid in the construction of more interoperable tools. U-Compare is built on top of the UIMA framework, and provides both a concrete framework for out-of-the-box text mining and a sophisticated evaluation platform allowing users to run specific tools on any target text, generating both detailed statistics and instance-based visualizations of outputs. U-Compare is a joint project, providing the world's largest, and still growing, collection of UIMA-compatible resources. These resources, originally developed by different groups for a variety of domains, include many famous tools and corpora. U-Compare can be launched straight from the web, without needing to be manually installed. All U-Compare components are provided ready-to-use and can be combined easily via a drag-and-drop interface without any programming. External UIMA components can also simply be mixed with U-Compare components, without distinguishing between locally and remotely deployed resources. http://u-compare.org/
30 CFR 250.609 - Well-workover structures on fixed platforms.
Code of Federal Regulations, 2011 CFR
2011-07-01
30 Mineral Resources 2, 2011-07-01: Well-workover structures on fixed platforms. Section 250.609, Mineral Resources, Bureau of Ocean Energy Management, Regulation, and ..., and previous stresses to the platform. ...
Wienecke, Sarah; Ishwarbhai, Alka; Tsipa, Argyro; Aw, Rochelle; Kylilis, Nicolas; Bell, David J.; McClymont, David W.; Jensen, Kirsten; Biedendieck, Rebekka
2018-01-01
Native cell-free transcription–translation systems offer a rapid route to characterize the regulatory elements (promoters, transcription factors) for gene expression from nonmodel microbial hosts, which can be difficult to assess through traditional in vivo approaches. One such host, Bacillus megaterium, is a giant Gram-positive bacterium with potential biotechnology applications, although many of its regulatory elements remain uncharacterized. Here, we have developed a rapid automated platform for measuring and modeling in vitro cell-free reactions and have applied this to B. megaterium to quantify a range of ribosome binding site variants and previously uncharacterized endogenous constitutive and inducible promoters. To provide quantitative models for cell-free systems, we have also applied a Bayesian approach to infer ordinary differential equation model parameters by simultaneously using time-course data from multiple experimental conditions. Using this modeling framework, we were able to infer previously unknown transcription factor binding affinities and quantify the sharing of cell-free transcription–translation resources (energy, ribosomes, RNA polymerases, nucleotides, and amino acids) using a promoter competition experiment. This provides insights into resource-limiting factors in batch-mode cell-free synthesis. Our combined automated and modeling platform allows for the rapid acquisition and model-based analysis of cell-free transcription–translation data from uncharacterized microbial cell hosts, as well as resource competition within cell-free systems, which potentially can be applied to a range of cell-free synthetic biology and biotechnology applications. PMID:29666238
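The paper's full Bayesian inference over multiple conditions is not reproduced here; as a minimal sketch of the model-fitting step it describes, the following Python example fits the parameters of a toy transcription–translation ODE model (mRNA to protein) to one time course by least squares. All rate names and the data values are hypothetical.

```python
# Minimal sketch: fit parameters of a toy transcription-translation ODE model
# (mRNA -> protein) to time-course data. The paper uses Bayesian inference
# across multiple conditions; a simple least-squares fit is shown instead.
# All rates and the "measured" data below are hypothetical.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def model(y, t, k_tx, k_tl, d_m):
    m, p = y                      # mRNA and protein concentrations
    return [k_tx - d_m * m,       # transcription minus mRNA decay
            k_tl * m]             # translation (protein assumed stable)

t = np.linspace(0, 180, 10)                               # minutes
data = np.array([0, 2, 7, 15, 25, 36, 48, 60, 72, 85])    # fake protein signal

def residuals(theta):
    k_tx, k_tl, d_m = theta
    sol = odeint(model, [0.0, 0.0], t, args=(k_tx, k_tl, d_m))
    return sol[:, 1] - data

fit = least_squares(residuals, x0=[1.0, 0.5, 0.05], bounds=(0, np.inf))
print(dict(zip(["k_tx", "k_tl", "d_m"], fit.x)))
```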
Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses
Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T
2014-01-01
With the coming deluge of genome data, storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and enabling efficient data sharing and retrieval present significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600
An incremental anomaly detection model for virtual machines
Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu
2017-01-01
The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied to anomaly detection due to its capabilities of self-organization and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Besides, cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing characteristics, which leaves the algorithm with low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large scale and highly dynamic features of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms. PMID:29117245
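To make the Weighted Euclidean Distance (WED) matching step concrete, here is a minimal Python sketch of finding the best-matching SOM unit for one virtual-machine performance sample under a weighted metric. The full IISOM model (heuristic initialization, incremental training, neighborhood search) is not shown, and the codebook, feature weights, and anomaly threshold are illustrative values only.

```python
# Sketch of the Weighted Euclidean Distance step used when matching an input
# vector to a Self-Organizing Map unit; feature weights are illustrative.
import numpy as np

def best_matching_unit(x, codebook, feature_w):
    """Return index of, and distance to, the unit closest to x under WED."""
    # codebook: (n_units, n_features); feature_w: (n_features,)
    d2 = np.sum(feature_w * (codebook - x) ** 2, axis=1)
    return int(np.argmin(d2)), float(np.sqrt(d2.min()))

rng = np.random.default_rng(0)
codebook = rng.random((25, 4))               # 5x5 map, 4 monitored metrics
feature_w = np.array([2.0, 1.0, 1.0, 0.5])   # e.g. emphasize CPU over disk I/O

sample = np.array([0.9, 0.2, 0.4, 0.1])      # one VM performance sample
unit, dist = best_matching_unit(sample, codebook, feature_w)
is_anomaly = dist > 0.8                      # hypothetical quantization-error threshold
print(unit, round(dist, 3), is_anomaly)
```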
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is used to manage data connections and task queues for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
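The client side of such a system runs in the browser in JavaScript; the server-side queueing idea mentioned above (a relational database managing tasks for volunteer nodes) can be sketched in Python as follows. The table layout and task payloads are hypothetical, not the actual system's schema.

```python
# Sketch of the server-side idea: a relational database acts as a queue of
# small simulation tasks handed out to volunteer browser nodes.
import sqlite3, json

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE tasks (
    id INTEGER PRIMARY KEY,
    payload TEXT,
    status TEXT DEFAULT 'pending')""")

# Split a model run into small spatial tiles and enqueue them.
for tile in range(4):
    db.execute("INSERT INTO tasks (payload) VALUES (?)",
               (json.dumps({"tile": tile, "timesteps": 100}),))

def checkout_task(conn):
    """Hand the next pending tile to a volunteer node and mark it assigned."""
    row = conn.execute(
        "SELECT id, payload FROM tasks WHERE status='pending' LIMIT 1").fetchone()
    if row is None:
        return None
    conn.execute("UPDATE tasks SET status='assigned' WHERE id=?", (row[0],))
    return {"id": row[0], **json.loads(row[1])}

print(checkout_task(db))   # e.g. {'id': 1, 'tile': 0, 'timesteps': 100}
```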
Basin Assessment Spatial Planning Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
The tool is intended to facilitate hydropower development and water resource planning by improving synthesis and interpretation of disparate spatial datasets that are considered in development actions (e.g., hydrological characteristics, environmentally and culturally sensitive areas, existing or proposed water power resources, climate-informed forecasts). The tool enables this capability by providing a unique framework for assimilating, relating, summarizing, and visualizing disparate spatial data through the use of spatial aggregation techniques, relational geodatabase platforms, and an interactive web-based Geographic Information Systems (GIS). Data are aggregated and related based on shared intersections with a common spatial unit; in this case, industry-standard hydrologic drainage areas for the U.S. (National Hydrography Dataset) are used as the spatial unit to associate planning data. This process is performed using all available scalar delineations of drainage areas (i.e., region, sub-region, basin, sub-basin, watershed, sub-watershed, catchment) to create spatially hierarchical relationships among planning data and drainages. These entity-relationships are stored in a relational geodatabase that provides back-end structure to the web GIS and its widgets. The full technology stack was built using all open-source software in modern programming languages. Interactive widgets that function within the viewport are also compatible with all modern browsers.
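The hierarchical aggregation described above relies on the fact that coarser hydrologic unit codes are prefixes of finer ones. A minimal Python sketch of that roll-up is shown below; the HUC values and attribute columns are hypothetical examples, not data from the actual tool.

```python
# Sketch of the spatial-aggregation idea: planning-layer values keyed to
# 12-digit hydrologic unit codes (HUC12) roll up to coarser units because
# coarser codes are prefixes of finer ones. Values below are made up.
import pandas as pd

records = pd.DataFrame({
    "huc12": ["060102070101", "060102070102", "060102080101"],
    "stream_km": [42.0, 17.5, 63.2],
    "protected_area_km2": [3.1, 0.0, 12.4],
})

# Roll up to the 8-digit sub-basin level by truncating the code.
records["huc8"] = records["huc12"].str[:8]
subbasin_summary = records.groupby("huc8")[["stream_km", "protected_area_km2"]].sum()
print(subbasin_summary)
```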
Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander
2015-01-01
Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual low-throughput. Here, we present an open-source phenomics platform "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons" enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks. As such it enables plant scientists to spend more time on science rather than on technology. All stored and computed data is easily accessible to the public and broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.
Data Sharing and Cardiology: Platforms and Possibilities.
Dey, Pranammya; Ross, Joseph S; Ritchie, Jessica D; Desai, Nihar R; Bhavnani, Sanjeev P; Krumholz, Harlan M
2017-12-19
Sharing deidentified patient-level research data presents immense opportunities to all stakeholders involved in cardiology research and practice. Sharing data encourages the use of existing data for knowledge generation to improve practice, while also allowing for validation of disseminated research. In this review, we discuss key initiatives and platforms that have helped to accelerate progress toward greater sharing of data. These efforts are being prompted by government, universities, philanthropic sponsors of research, major industry players, and collaborations among some of these entities. As data sharing becomes a more common expectation, policy changes will be required to encourage and assist data generators with the process of sharing the data they create. Patients also will need access to their own data and to be empowered to share those data with researchers. Although medicine still lags behind other fields in achieving data sharing's full potential, cardiology research has the potential to lead the way. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
The OOI Ocean Education Portal: Enabling the Development of Online Data Investigations
NASA Astrophysics Data System (ADS)
Lichtenwalner, C. S.; McDonnell, J. D.; Crowley, M. F.; deCharon, A.; Companion, C. J.; Glenn, S. M.
2016-02-01
The Ocean Observatories Initiative (OOI) was designed to transform ocean science by establishing a long-term, multi-instrument, multi-platform research infrastructure at 7 arrays around the world. This unprecedented investment in ocean observation, funded by the National Science Foundation, provides a rich opportunity to reshape ocean science education as well. As part of the initial construction effort, an online Ocean Education Portal was developed to support the creation and sharing of educational resources by undergraduate faculty at universities and community colleges. The portal includes a suite of tools that enable the development of online activities for use as group or individual projects, which can be used during lectures or as homework assignments. The site includes: 1) a suite of interactive educational data visualization tools that provide simple and targeted interfaces to interact with OOI datasets; 2) a concept map builder that can be used by both educators and students to build networked diagrams of their knowledge; and 3) a "data investigation" builder that allows faculty to assemble resources into coherent learning modules. The site also includes a "vocabulary navigator" that provides a visual way to discover and learn about the OOI's infrastructure and scientific design. The site allows users to browse an ever-growing database of resources created by the community, and likewise, users can share resources they create with others. As the OOI begins its 25-year operational phase, it is our hope that faculty will be able to use the tools and investigations on the Ocean Education Portal to bring real ocean science research to their undergraduate students.
30 CFR 250.921 - How do I analyze my platform for cumulative fatigue?
Code of Federal Regulations, 2010 CFR
2010-07-01
30 Mineral Resources 2, 2010-07-01: How do I analyze my platform for cumulative fatigue? Section 250.921, Mineral Resources, Minerals Management Service, Department of the Interior ... Inspection, Maintenance, and Assessment of Platforms § 250.921 How do I analyze my platform for cumulative...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2013-12-01
The National Renewable Energy Laboratory (NREL) has developed OpenEI.org, a public, open, data-sharing platform where consumers, analysts, industry experts, and energy decision makers can go to boost their energy IQs, search for energy data, share data, and get access to energy applications. The free site blends elements of social media, linked open-data practices, and MediaWiki-based technology to build a collaborative environment for creating and sharing energy data with the world. The result is a powerful platform that is helping government and industry leaders around the world define policy options, make informed investment decisions, and create new businesses.
ERIC Educational Resources Information Center
Mahoney, Brian D.
2000-01-01
States that several states are establishing networks for resource sharing. Florida offers these resources through the Florida Distance Learning Library Initiative, Wisconsin has BadgerLink and WISCAT, TexShare provides library resource sharing in Texas, and Louisiana has LOUIS and LLN. These are some of the states successfully demonstrating…
NASA Astrophysics Data System (ADS)
Juanle, Wang; Shuang, Li; Yunqiang, Zhu
2005-10-01
According to the requirements of the China National Scientific Data Sharing Program (NSDSP), a web-oriented RS Image Publication System (RSIPS) has been researched and developed based on Java Servlet technology. The RSIPS framework is composed of three tiers: a Presentation Tier, an Application Service Tier, and a Data Resource Tier. The Presentation Tier provides the user interface for data query, review, and download; for the convenience of users, a visual spatial query interface is included. Serving as the middle tier, the Application Service Tier controls all actions between users and databases. The Data Resource Tier stores RS images in file systems and relational databases. RSIPS is developed with cross-platform programming based on Java Servlet tools, one of the advanced techniques in the J2EE architecture. A prototype of RSIPS has been developed and applied in geosciences clearinghouse practice, which is among the experimental units of the NSDSP in China.
Design and deployment of an elastic network test-bed in IHEP data center based on SDN
NASA Astrophysics Data System (ADS)
Zeng, Shan; Qi, Fazhi; Chen, Gang
2017-10-01
High energy physics experiments produce huge amounts of raw data, but because the network resources are shared, there is no guarantee of the bandwidth available to each experiment, which may cause link congestion problems. On the other hand, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack that ensures the flexibility of computing and storage resources, and more and more computing applications have been deployed on virtual machines created by OpenStack. However, under the traditional network architecture, network capacity cannot be acquired elastically, which becomes a bottleneck restricting the flexible use of cloud computing. To solve these problems, we propose an elastic cloud data center network architecture based on SDN, and we also design a high-performance controller cluster based on OpenDaylight. Finally, we present our current test results.
Winkler, Sabune J; Cagliero, Enrico; Witte, Elizabeth; Bierer, Barbara E
2014-08-01
The Harvard Clinical and Translational Science Center ("Harvard Catalyst") Research Subject Advocacy (RSA) Program has reengineered subject advocacy, distributing the delivery of advocacy functions through a multi-institutional, central platform rather than vesting these roles and responsibilities in a single individual functioning as a subject advocate. The program is process-oriented and output-driven, drawing on the strengths of participating institutions to engage local stakeholders both in the protection of research subjects and in advocacy for subjects' rights. The program engages stakeholder communities in the collaborative development and distributed delivery of accessible and applicable educational programming and resources. The Harvard Catalyst RSA Program identifies, develops, and supports the sharing and distribution of expertise, education, and resources for the benefit of all institutions, with a particular focus on the frontline: research subjects, researchers, research coordinators, and research nurses. © 2014 Wiley Periodicals, Inc.
Developing a Business Intelligence Process for a Training Module in SharePoint 2010
NASA Technical Reports Server (NTRS)
Schmidtchen, Bryce; Solano, Wanda M.; Albasini, Colby
2015-01-01
Prior to this project, training information for the employees of the National Center for Critical Information Processing and Storage (NCCIPS) was stored in an array of unrelated spreadsheets and SharePoint lists that had to be manually updated. By developing a content management system through a web application platform named SharePoint, this training system is now highly automated and provides a much less labor-intensive method of storing training data and scheduling training courses. This system was developed using SharePoint Designer by laying out the data structure for the interaction between different lists of data about the employees. The automation of data population inside the lists was accomplished by implementing SharePoint workflows, which essentially lay out the logic for how data is connected and calculated between certain lists. The resulting training system is constructed from a combination of five lists of data with a single list acting as the user-friendly interface. This interface is populated with the courses required for each employee and includes past and future information about course requirements. The employees of NCCIPS now have the ability to view, log, and schedule their training information and courses with much more ease. This system will relieve a significant amount of manual input and serve as a powerful informational resource for the employees of NCCIPS in the future.
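For readers unfamiliar with how SharePoint lists are accessed programmatically, a minimal Python sketch of reading list items through the SharePoint REST API is shown below. This is not the NCCIPS system's actual configuration; the site URL, list title, field names, and the credential object are placeholders, and on-premises SharePoint deployments typically require NTLM or OAuth rather than the basic auth shown.

```python
# Sketch of querying a SharePoint list through its REST API (the same kind of
# list the workflows above populate). URL, list title, fields, and credentials
# are placeholders only.
import requests

SITE = "https://sharepoint.example.gov/sites/training"      # placeholder URL
url = f"{SITE}/_api/web/lists/getbytitle('Training Records')/items"

resp = requests.get(
    url,
    headers={"Accept": "application/json;odata=verbose"},
    auth=("DOMAIN\\user", "password"),   # placeholder credentials
)
resp.raise_for_status()

for item in resp.json()["d"]["results"]:
    # Field names depend on how the list was defined; these are examples.
    print(item.get("Title"), item.get("CourseDueDate"))
```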
Middleware for Plug and Play Integration of Heterogeneous Sensor Resources into the Sensor Web
Toma, Daniel M.; Jirka, Simon; Del Río, Joaquín
2017-01-01
The study of global phenomena requires the combination of a considerable amount of data coming from different sources, acquired by different observation platforms and managed by institutions working in different scientific fields. Merging this data to provide extensive and complete data sets to monitor the long-term, global changes of our oceans is a major challenge. The data acquisition and data archival procedures usually vary significantly depending on the acquisition platform. This lack of standardization ultimately leads to information silos, preventing the data from being effectively shared across different scientific communities. In recent years, important steps have been taken to improve both standardization and interoperability, such as the Open Geospatial Consortium's Sensor Web Enablement (SWE) framework. Within this framework, standardized models and interfaces to archive, access and visualize the data from heterogeneous sensor resources have been proposed. However, due to the wide variety of software and hardware architectures presented by marine sensors and marine observation platforms, there is still a lack of uniform procedures to integrate sensors into existing SWE-based data infrastructures. In this work, a framework aimed at enabling sensor plug and play integration into existing SWE-based data infrastructures is presented. First, the operations required to automatically identify, configure, and operate a sensor are analysed. Then, the metadata required for these operations is structured in a standard way. Afterwards, a modular, plug and play, SWE-based acquisition chain is proposed. Finally, different use cases for this framework are presented. PMID:29244732
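To illustrate the kind of standardized access the SWE framework provides, here is a minimal Python sketch of requesting observations from an OGC Sensor Observation Service (SOS) using the key-value-pair binding. The endpoint, offering, and observed-property identifiers are placeholders; actual parameter values depend on the service's capabilities document.

```python
# Sketch of pulling data from an OGC Sensor Observation Service (SOS) using
# the key-value-pair binding; endpoint and identifiers are placeholders.
import requests

SOS_URL = "https://sos.example.org/service"   # placeholder SOS endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "urn:example:offering:ctd-1",                       # placeholder
    "observedProperty": "urn:example:property:sea_water_temperature",
    "temporalFilter": "om:phenomenonTime,2017-01-01/2017-01-02",
}

resp = requests.get(SOS_URL, params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:500])   # O&M XML (or JSON, if the service supports it)
```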
2014-01-01
Background Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. Results To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. Conclusions By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples. PMID:24475911
Collaborative Supercomputing for Global Change Science
NASA Astrophysics Data System (ADS)
Nemani, R.; Votava, P.; Michaelis, A.; Melton, F.; Milesi, C.
2011-03-01
There is increasing pressure on the science community not only to understand how recent and projected changes in climate will affect Earth's global environment and the natural resources on which society depends but also to design solutions to mitigate or cope with the likely impacts. Responding to this multidimensional challenge requires new tools and research frameworks that assist scientists in collaborating to rapidly investigate complex interdisciplinary science questions of critical societal importance. One such collaborative research framework, within the NASA Earth sciences program, is the NASA Earth Exchange (NEX). NEX combines state-of-the-art supercomputing, Earth system modeling, remote sensing data from NASA and other agencies, and a scientific social networking platform to deliver a complete work environment. In this platform, users can explore and analyze large Earth science data sets, run modeling codes, collaborate on new or existing projects, and share results within or among communities (see Figure S1 in the online supplement to this Eos issue (http://www.agu.org/eos_elec)).
Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud’s data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell’s GridControl will focus on 4 elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.
NASA Astrophysics Data System (ADS)
Nandigam, V.; Crosby, C. J.; Baru, C.
2009-04-01
LiDAR (Light Detection And Ranging) topography data offer earth scientists the opportunity to study the earth's surface at very high resolutions. As a result, the popularity of these data is growing dramatically. However, the management, distribution, and analysis of community LiDAR data sets is a challenge due to their massive size (multi-billion point, multi-terabyte). We have also found that many earth science users of these data sets lack the computing resources and expertise required to process these data. We have developed the OpenTopography Portal to democratize access to these large and computationally challenging data sets. The OpenTopography Portal uses cyberinfrastructure technology developed by the GEON project to provide access to LiDAR data in a variety of formats. LiDAR data products available range from simple Google Earth visualizations of LiDAR-derived hillshades to 1 km2 tiles of standard digital elevation model (DEM) products as well as LiDAR point cloud data and user-generated custom DEMs. We have found that the wide spectrum of LiDAR users has variable scientific applications, computing resources and technical experience and thus requires a data system with multiple distribution mechanisms and platforms to serve a broader range of user communities. Because the volume of LiDAR topography data available is rapidly expanding, and data analysis techniques are evolving, there is a need for the user community to be able to communicate and interact to share knowledge and experiences. To address this need, the OpenTopography Portal enables social networking capabilities through a variety of collaboration tools, web 2.0 technologies and customized usage pattern tracking. Fundamentally, these tools offer users the ability to communicate, to access and share documents, to participate in discussions, and to keep up to date on upcoming events and emerging technologies. The OpenTopography portal achieves the social networking capabilities by integrating various software technologies and platforms. These include the Expression Engine Content Management System (CMS) that comes with pre-packaged collaboration tools like blogs and wikis, the Gridsphere portal framework that contains the primary GEON LiDAR System portlet with user job monitoring capabilities, and a Java web-based discussion forum (JForums) application, all seamlessly integrated under one portal. The OpenTopography Portal also provides an integrated authentication mechanism between the various CMS collaboration tools and the core Gridsphere-based portlets. The integration of these various technologies allows for enhanced user interaction capabilities within the portal. By integrating popular collaboration tools like discussion forums and blogs we can promote conversation and openness among users. The ability to ask questions and share expertise in forum discussions allows users to easily find information and interact with users facing similar challenges. The OpenTopography Blog enables our domain experts to post ideas, news items, commentary, and other resources in order to foster discussion and information sharing. The content management capabilities of the portal allow for easy updates to information in the form of publications, documents, and news articles. Access to the most current information fosters better decision-making.
As has become the standard for web 2.0 technologies, the OpenTopography Portal is fully RSS enabled to allow users of the portal to keep track of news items, forum discussions, blog updates, and system outages. We are currently exploring how the information captured by user and job monitoring components of the Gridsphere based GEON LiDAR System can be harnessed to provide a recommender system that will help users to identify appropriate processing parameters and to locate related documents and data. By seamlessly integrating the various platforms and technologies under one single portal, we can take advantage of popular online collaboration tools that are either stand alone or software platform restricted. The availability of these collaboration tools along with the data will foster more community interaction and increase the strength and vibrancy of the LiDAR topography user community.
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2010 CFR
2010-07-01
... a project management timeline (Gantt chart) that depicts when interim and final reports required by... 30 Mineral Resources 2, 2010-07-01: If my platform is subject to the Platform Verification Program, what must I do? Section 250.911, Mineral Resources, Minerals Management Service...
Enabling Discoveries in Earth Sciences Through the Geosciences Network (GEON)
NASA Astrophysics Data System (ADS)
Seber, D.; Baru, C.; Memon, A.; Lin, K.; Youn, C.
2005-12-01
Taking advantage of state-of-the-art information technology resources, GEON researchers are building a cyberinfrastructure designed to enable data sharing, semantic data integration, high-end computations and 4D visualization in easy-to-use web-based environments. The GEON Network currently allows users to search and register Earth science resources such as data sets (GIS layers, GMT files, GeoTIFF images, ASCII files, relational databases, etc.), software applications or ontologies. Portal-based access mechanisms enable developers to build dynamic user interfaces to conduct advanced processing and modeling efforts across distributed computers and supercomputers. Researchers and educators can access the networked resources through the GEON portal and its portlets, which were developed to support better and more comprehensive scientific and educational studies. For example, the SYNSEIS portlet in GEON enables users to access near-real-time seismic waveforms from the IRIS Data Management Center, easily build a 3D geologic model within the area of the seismic station(s) and the epicenter, and perform a 3D synthetic seismogram analysis to understand the lithospheric structure and earthquake source parameters for any given earthquake in the US. Similarly, GEON's workbench area enables users to create their own work environment; copy, visualize and analyze any data sets within the network; and create subsets of the data sets for their own purposes. Since all these resources are built as part of a Service-Oriented Architecture (SOA), they are also used in other development platforms. One such platform is the Kepler Workflow system, which can access web-service-based resources and provides users with graphical programming interfaces to build a model to conduct computations and/or visualization efforts using the networked resources. Developments in the area of semantic integration of the networked datasets continue to advance, and prototype studies can be accessed via the GEON portal at www.geongrid.org.
DOT National Transportation Integrated Search
1997-06-06
Public-private partnerships: shared resource projects are public-private arrangements that involve sharing public property such as rights-of-way and private resources such as telecommunications capacity and expertise. Typically, private telecommuni...
The HydroShare Collaborative Repository for the Hydrology Community
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Couch, A.; Hooper, R. P.; Dash, P. K.; Stealey, M.; Yi, H.; Bandaragoda, C.; Castronova, A. M.
2017-12-01
HydroShare is an online collaboration system for sharing hydrologic data, analytical tools, and models. It supports the sharing of, and collaboration around, "resources" which are defined by standardized content types for data formats and models commonly used in hydrology. With HydroShare you can: Share your data and models with colleagues; Manage who has access to the content that you share; Share, access, visualize and manipulate a broad set of hydrologic data types and models; Use the web services application programming interface (API) to program automated and client access; Publish data and models and obtain a citable digital object identifier (DOI); Aggregate your resources into collections; Discover and access data and models published by others; Use web apps to visualize, analyze and run models on data in HydroShare. This presentation will describe the functionality and architecture of HydroShare, highlighting our approach to making this system easy to use and serving the needs of the hydrology community represented by the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI). Metadata for uploaded files is harvested automatically or captured using easy-to-use web user interfaces. Users are encouraged to add or create resources in HydroShare early in the data life cycle. To encourage this, we allow users to share and collaborate on HydroShare resources privately among individual users or groups, entering metadata while doing the work. HydroShare also provides enhanced functionality for users through web apps that provide tools and computational capability for actions on resources. HydroShare's architecture broadly comprises: (1) resource storage, (2) a resource exploration website, and (3) web apps for actions on resources. System components are loosely coupled and interact through APIs, which enhances robustness, as components can be upgraded and advanced relatively independently. The full power of this paradigm is the extensibility it supports. Web apps are hosted on separate servers, which may be third-party servers. They are registered in HydroShare using a web app resource that configures the connectivity for them to be discovered and launched directly from the resource types they are associated with.
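As a sketch of the programmatic access the web services API mentioned above enables, the Python example below queries a resource-listing endpoint with plain HTTP. The endpoint path, query parameter, and response fields are shown as we understand them and should be checked against the current HydroShare API documentation before use.

```python
# Sketch of discovering resources through HydroShare's REST API. The query
# parameter and response field names below are assumptions for illustration;
# consult the API documentation for the exact schema and authentication.
import requests

resp = requests.get(
    "https://www.hydroshare.org/hsapi/resource/",
    params={"subject": "streamflow"},     # hypothetical discovery filter
    timeout=30,
)
resp.raise_for_status()

for res in resp.json().get("results", []):
    print(res.get("resource_id"), "-", res.get("resource_title"))
```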
6 DOF Nonlinear AUV Simulation Toolbox
1997-01-01
is to supply a flexible 3D-simulation platform for motion visualization, in-lab debugging, and testing of mission-specific strategies as well as those... Explorer are modularly designed [Smith] in order to cut time and cost for vehicle reconfiguration. A flexible 3D-simulation platform is desired to... 3D models. Currently implemented modules include a nonlinear dynamic model for the OEX, shared memory and semaphore manager tools, and a shared memory monitor
Study on the standard architecture for geoinformation common services
NASA Astrophysics Data System (ADS)
Zha, Z.; Zhang, L.; Wang, C.; Jiang, J.; Huang, W.
2014-04-01
The construction of platforms for geoinformation common services has been completed or is ongoing in most provinces and cities in China in recent years, and these platforms play an important role in economic and social activities. Geoinformation and geoinformation-based services are the key issues in such platforms. Standards for geoinformation common services serve as bridges among the users, systems, and designers of a platform. The standard architecture for geoinformation common services is the guideline for designing and using the standard system, in which the standards are integrated with each other to promote the development, sharing, and servicing of geoinformation resources. Establishing the standard architecture for geoinformation common services is one of the tasks of the "Study on important standards for geoinformation common services and management of public facilities in cities". The scope of the standard architecture is defined, covering the data or information model, interoperability interfaces or services, and information management. Research has been done on the status of international standards for geoinformation common services in organizations such as ISO/TC 211 and OGC, and in countries or unions such as the USA, the EU, and Japan. Principles are set up to evaluate the standards, such as availability, suitability, and extensibility. Then the development requirements and practical situation are analyzed, and a framework of the standard architecture for geoinformation common services is proposed. Finally, a summary and prospects for geoinformation standards are given.
NASA Astrophysics Data System (ADS)
Qi, Yuan; Zhao, Hongtao
2017-04-01
China is among the countries most prone to natural disasters, with a complex geological and geographical environment and an unstable climate. On August 8, 2010, a large debris flow disaster occurred in Zhouqu County, Gansu province, resulting in more than 1700 casualties and more than 200 buildings damaged. In order to detect landslides and debris flows, an early warning system was established in the county. Spatial information technologies, such as remote sensing, GIS, and GPS, play a core role in the early warning system because of their functions in observing, analyzing, and locating geological disasters. However, all of these spatial information technologies can play an important role only when guided by an emergency response mechanism. This article takes the establishment of Zhouqu County's Disaster Emergency Response Interaction Mechanism (DERIM) as an example to discuss the risk management of county-level administrative units. County-level risk management aims at information sharing, resource integration, integrated prevention, and unified command. Nine subsystems support DERIM, including a disaster prevention and emergency data collection and sharing system, a joint duty system, a disaster verification and evaluation system, a disaster consultation system, an emergency warning and information release system, an emergency response system, a disaster reporting system, a plan management system, and a mass prediction and prevention management system. Finally, an emergency command platform was built in Zhouqu County to realize DERIM. The core missions of the platform consist of daily disaster management, monitoring and warning, comprehensive analysis, information release, consultation and decision-making, emergency response, etc. Five functional modules, including a disaster information management module, a comprehensive monitoring module (geological, meteorological, water conservancy, and hydrological monitoring), an alarm management module, and an emergency command and disaster dispatching management module, are developed on the basis of this platform. Based on internet technology, a web-based office platform is provided for the nodes scattered across departments and towns, which includes daily business, monitoring and warning, alarm notification, alarm recording, personnel management and updates in the disaster region, query and analysis of real-time observation data, etc. The platform has been tested through three years of flood-season duty since 2013, and two typical disaster cases during this period fully illustrate the effectiveness of the DERIM and the emergency command platform.
Refinement and dissemination of a digital platform for sharing transportation education materials.
DOT National Transportation Integrated Search
2015-07-01
National agencies have called for more widespread adoption of best practices in engineering education. To facilitate this sharing of practices, a web-based system framework used by transportation engineering educators to share curricular materials a...
Tethys: A Platform for Water Resources Modeling and Decision Support Apps
NASA Astrophysics Data System (ADS)
Nelson, J.; Swain, N. R.
2015-12-01
The interactive nature of web applications or "web apps" makes it an excellent medium for conveying complex scientific concepts to lay audiences and creating decision support tools that harness cutting edge modeling techniques. However, the technical expertise required to develop web apps represents a barrier for would-be developers. This barrier can be characterized by the following hurdles that developers must overcome: (1) identify, select, and install software that meet the spatial and computational capabilities commonly required for water resources modeling; (2) orchestrate the use of multiple free and open source (FOSS) projects and navigate their differing application programming interfaces; (3) learn the multi-language programming skills required for modern web development; and (4) develop a web-secure and fully featured web portal to host the app. Tethys Platform has been developed to lower the technical barrier and minimize the initial development investment that prohibits many scientists and engineers from making use of the web app medium. It includes (1) a suite of FOSS that address the unique data and computational needs common to water resources web app development, (2) a Python software development kit that streamlines development, and (3) a customizable web portal that is used to deploy the completed web apps. Tethys synthesizes several software projects including PostGIS, 52°North WPS, GeoServer, Google Maps™, OpenLayers, and Highcharts. It has been used to develop a broad array of web apps for water resources modeling and decision support for several projects including CI-WATER, HydroShare, and the National Flood Interoperability Experiment. The presentation will include live demos of some of the apps that have been developed using Tethys to demonstrate its capabilities.
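The abstract above names the FOSS components such a platform orchestrates; as a hedged illustration (not Tethys SDK code) of the kind of spatial query a water-resources web app delegates to PostGIS and returns to the map client as GeoJSON, consider the Python sketch below. The connection settings, table, and column names are hypothetical.

```python
# Sketch of a spatial query against PostGIS (one of the components Tethys
# bundles): return watershed polygons intersecting a point as GeoJSON for a
# web map client. Connection, table, and column names are hypothetical.
import json
import psycopg2

conn = psycopg2.connect(dbname="hydro", user="tethys", host="localhost")
cur = conn.cursor()

cur.execute(
    """
    SELECT name, ST_AsGeoJSON(geom)
    FROM watersheds
    WHERE ST_Intersects(geom, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
    """,
    (-111.65, 40.25),
)

features = [
    {"type": "Feature", "properties": {"name": name}, "geometry": json.loads(gj)}
    for name, gj in cur.fetchall()
]
print(json.dumps({"type": "FeatureCollection", "features": features})[:200])
```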
2013-07-18
[Table fragment: DoD components (DHRA, DFAS, AAFES, DLA, DoDEA) and shared-service functions (Human Resources - HR Shared Services, Indianapolis, IN; Personnel Security - HR Shared Services; Security, Camp Lejeune; Force Protection) with Yes/No entries whose column headings are not recoverable.]
Online to offline teaching model in optics education: resource sharing course and flipped class
NASA Astrophysics Data System (ADS)
Li, Xiaotong; Cen, Zhaofeng; Liu, Xiangdong; Zheng, Zhenrong
2016-09-01
Since the platform "Coursera" was created by Stanford University professors Andrew Ng and Daphne Koller, more and more universities have joined it. From the very beginning, online education has been not only about education itself but also connected with social equality. This is especially significant for the economic transformation in China. In this paper, the research and practice on the informatization of optical education are described. Online-to-offline (O2O) education activities, such as online learning and offline meetings, online homework and online-to-offline discussion, and online tests and online-to-offline evaluation, are combined into our teaching model in the course of Applied Optics. These various O2O strategies were implemented respectively in the autumn-winter small class and the spring-summer middle-sized class, following constructivism and the idea of open education. We have developed optical education resources such as videos of lectures, light transmission or ray-trace animations, online tests, etc. We also divide the learning procedure into four steps: first, instead of being given a course offline, students learn the course online; second, every week or two, students have a discussion in their study groups; third, students submit their homework and study reports; fourth, they take online and offline tests. The online optical education resources have been shared with some universities in China, bringing new challenges to teachers and students as they face the coming revolution in e-learning.
OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Greiner, Annette; Cholia, Shreyas
Mass spectrometry imaging (MSI) enables researchers to probe endogenous molecules directly within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements and is a critical enabler of rapid data I/O. The OpenMSI file format has been shown to provide >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely to facilitate high-performance data analysis and enable implementation of Web-based data sharing, visualization, and analysis.
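The storage-layout idea described above, chunked and compressed HDF5 so that both ion images and single spectra can be read selectively, can be illustrated with a short h5py sketch. This is not the actual OpenMSI schema; the array shape, group path, and chunk sizes are illustrative only.

```python
# Sketch of storing an MSI cube (x, y, m/z) in HDF5 with chunking and
# compression so that image slices and single spectra both read selectively.
# Shapes, paths, and chunk sizes are illustrative, not OpenMSI's schema.
import numpy as np
import h5py

nx, ny, nmz = 100, 80, 10000
cube = np.random.rand(nx, ny, nmz).astype("float32")   # fake MSI data

with h5py.File("msi_example.h5", "w") as f:
    dset = f.create_dataset(
        "msi/cube",
        data=cube,
        chunks=(4, 4, 2048),        # small spatial blocks, longer m/z runs
        compression="gzip",
        compression_opts=4,
    )
    dset.attrs["axes"] = "x,y,mz"

with h5py.File("msi_example.h5", "r") as f:
    image = f["msi/cube"][:, :, 5000]     # one ion image: touches few chunks
    spectrum = f["msi/cube"][10, 20, :]   # one full spectrum
print(image.shape, spectrum.shape)
```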
Pisani, Elizabeth; Botchway, Stella
2017-01-01
Background: Increasingly, biomedical researchers are encouraged or required by research funders and journals to share their data, but there's very little guidance on how to do that equitably and usefully, especially in resource-constrained settings. We performed an in-depth case study of one data sharing pioneer: the WorldWide Antimalarial Resistance Network (WWARN). Methods: The case study included a records review, a quantitative analysis of WWARN-related publications, in-depth interviews with 47 people familiar with WWARN, and a witness seminar involving a sub-set of 11 interviewees. Results: WWARN originally aimed to collate clinical, in vitro, pharmacological and molecular data into linked, open-access databases intended to serve as a public resource to guide antimalarial drug treatment policies. Our study describes how WWARN navigated challenging institutional and academic incentive structures, alongside funders' reluctance to invest in capacity building in malaria-endemic countries, which impeded data sharing. The network increased data contributions by focusing on providing free, online tools to improve the quality and efficiency of data collection, and by inviting collaborative authorship on papers addressing policy-relevant questions that could only be answered through pooled analyses. By July 1, 2016, the database included standardised data from 103 molecular studies and 186 clinical trials, representing 135,000 individual patients. Developing the database took longer and cost more than anticipated, and efforts to increase equity for data contributors are on-going. However, analyses of the pooled data have generated new methods and influenced malaria treatment recommendations globally. Despite not achieving the initial goal of real-time surveillance, WWARN has developed strong data governance and curation tools, which are now being adapted relatively quickly for other diseases. Conclusions: To be useful, data sharing requires investment in long-term infrastructure. To be feasible, it requires new incentive structures that favour the generation of reusable knowledge. PMID:29018840
SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services
Gessler, Damian DG; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T
2009-01-01
Background SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. Results There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at , developer tools at , and a portal to third-party ontologies at (a "swap meet"). Conclusion SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs. PMID:19775460
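As a generic illustration of the graph-based resource descriptions such an architecture builds on, the Python sketch below uses rdflib to describe a hypothetical service as RDF/OWL triples. The namespace and property names are invented for illustration; they are not the actual SSWAP vocabulary terms, and the service described is fictitious.

```python
# Generic sketch of describing a web service resource as an RDF/OWL graph,
# the style of description semantic web services rely on. The namespace and
# property names below are invented and are not SSWAP's canonical terms.
from rdflib import Graph, Literal, Namespace, RDF, URIRef

EX = Namespace("http://example.org/service#")   # placeholder ontology
g = Graph()
g.bind("ex", EX)

svc = URIRef("http://example.org/services/qtl-lookup")
g.add((svc, RDF.type, EX.Resource))
g.add((svc, EX.name, Literal("QTL lookup service")))
g.add((svc, EX.inputType, EX.MarkerName))
g.add((svc, EX.outputType, EX.QTLRecord))

print(g.serialize(format="turtle"))
```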
DOT National Transportation Integrated Search
1996-04-01
This report presents the results of research on the institutional and non-technical issues related to shared resource projects. Shared resource projects are a particular form of public-private partnering that may help public agencies underwrite their...
30 CFR 57.11027 - Scaffolds and working platforms.
Code of Federal Regulations, 2010 CFR
2010-07-01
…condition. Floorboards shall be laid properly and the scaffolds and working platform shall not be overloaded… (30 CFR § 57.11027, Mineral Resources; Mine Safety and Health Administration, Department of Labor; Metal and…)
Surgical Outreach for Children by International Humanitarian Organizations: A Review.
Kynes, J Matthew; Zeigler, Laura; McQueen, Kelly
2017-06-28
Low- and middle-income countries carry a disproportionate share of the global burden of pediatric surgical disease and have limited local healthcare infrastructure and human resources to address this burden. Humanitarian efforts that have improved or provided access to necessary basic or emergency surgery for children in these settings have included humanitarian assistance and disaster relief, short-term surgical missions, and long-term projects such as building pediatric specialty hospitals and provider networks. Each of these efforts may also include educational initiatives designed to increase local capacity. This article will provide an overview of pediatric humanitarian surgical outreach including reference to available evidence-based analyses of these platforms and make recommendations for surgical outreach initiatives for children.
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant’s platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses. PMID:26752627
Design distributed simulation platform for vehicle management system
NASA Astrophysics Data System (ADS)
Wen, Zhaodong; Wang, Zhanlin; Qiu, Lihua
2006-11-01
Next-generation military aircraft place high performance demands on the airborne management system. General-purpose modules, data integration, and a high-speed data bus are needed to share and manage information from the subsystems efficiently. These subsystems include the flight control, propulsion, hydraulic power, environmental control, fuel management, and electrical power systems. The architecture shifts from separate or loosely mixed subsystems to an integrated one, in which the whole airborne system is managed as a single system: physical devices remain distributed, but system information is integrated and shared. The processing functions of each subsystem are integrated (including general processing modules and dynamic reconfiguration), and sensors and signal-processing functions are shared, which also lays a foundation for shared power management. A distributed vehicle management system built on a 1553B bus and distributed processors provides a validation platform for research on integrated airborne system management. This paper establishes such a Vehicle Management System (VMS) simulation platform, discusses its software and hardware configuration, and analyzes its communication and fault-tolerance methods.
A genome-wide association study platform built on iPlant cyber-infrastructure
USDA-ARS?s Scientific Manuscript database
We demonstrated a flexible Genome-Wide Association (GWA) Study (GWAS) platform built upon the iPlant Collaborative Cyber-infrastructure. The platform supports big data management, sharing, and large scale study of both genotype and phenotype data on clusters. End users can add their own analysis too...
Knowledge Sharing via Social Networking Platforms in Organizations
ERIC Educational Resources Information Center
Kettles, Degan
2012-01-01
Knowledge Management Systems have been actively promoted for decades within organizations but have frequently failed to be used. Recently, deployments of enterprise social networking platforms used for knowledge management have become commonplace. These platforms help harness the knowledge of workers by serving as repositories of knowledge as well…
Reusable Social Networking Capabilities for an Earth Science Collaboratory
NASA Astrophysics Data System (ADS)
Lynnes, C.; Da Silva, D.; Leptoukh, G. G.; Ramachandran, R.
2011-12-01
A vast untapped resource of data, tools, information and knowledge lies within the Earth science community. This is because it is difficult to share the full spectrum of these entities, particularly their full context. As a result, most knowledge exchange is through person-to-person contact at meetings, email and journal articles, each of which can support only a limited level of detail. We propose the creation of an Earth Science Collaboratory (ESC): a framework that would enable sharing of data, tools, workflows, results and the contextual knowledge about these information entities. The Drupal platform is well positioned to provide the key social networking capabilities to the ESC. As a proof of concept of a rich collaboration mechanism, we have developed a Drupal-based mechanism for graphically annotating and commenting on results images from analysis workflows in the online Giovanni analysis system for remote sensing data. The annotations can be tagged and shared with others in the community. These capabilities are further supplemented by a Research Notebook capability reused from another online analysis system named Talkoot. The goal is a reusable set of modules that can integrate with a variety of other applications, either within Drupal web frameworks or at a machine level.
Castella, Jean-Christophe
2009-02-01
In northern Vietnam uplands the successive policy reforms that accompanied agricultural decollectivisation triggered very rapid changes in land use in the 1990s. From a centralized system of natural resource management, a multitude of individual strategies emerged which contributed to new production interactions among farming households, changes in landscape structures, and conflicting strategies among local stakeholders. Within this context of agrarian transition, learning devices can help local communities to collectively design their own course of action towards sustainable natural resource management. This paper presents a collaborative approach combining a number of participatory methods and geovisualisation tools (e.g., spatially explicit multi-agent models and role-playing games) with the shared goal to analyse and represent the interactions between: (i) decision-making processes by individual farmers based on the resource profiles of their farms; (ii) the institutions which regulate resource access and usage; and (iii) the biophysical and socioeconomic environment. This methodological pathway is illustrated by a case study in Bac Kan Province where it successfully led to a communication platform on natural resource management. In a context of rapid socioeconomic changes, learning devices and geovisualisation tools helped embed the participatory approach within a process of community development. The combination of different tools, each with its own advantages and constraints, proved highly relevant for supporting collective natural resource management.
Architecture for the Interdisciplinary Earth Data Alliance
NASA Astrophysics Data System (ADS)
Richard, S. M.
2016-12-01
The Interdisciplinary Earth Data Alliance (IEDA) is leading an EarthCube (EC) Integrative Activity to develop a governance structure and technology framework that enables partner data systems to share technology, infrastructure, and practice for documenting, curating, and accessing heterogeneous geoscience data. The IEDA data facility provides capabilities in an extensible framework that enables domain-specific requirements for each partner system in the Alliance to be integrated into standardized cross-domain workflows. The shared technology infrastructure includes a data submission hub, a domain-agnostic file-based repository, an integrated Alliance catalog and a Data Browser for data discovery across all partner holdings, as well as services for registering identifiers for datasets (DOI) and samples (IGSN). The submission hub will be a platform that facilitates acquisition of cross-domain resource documentation and channels users into domain and resource-specific workflows tailored for each partner community. We are exploring an event-based message bus architecture with a standardized plug-in interface for adding capabilities. This architecture builds on the EC CINERGI metadata pipeline as well as the message-based architecture of the SEAD project. Plug-in components for file introspection to match entities to a data type registry (extending EC Digital Crust and Research Data Alliance work), extract standardized keywords (using CINERGI components), location, cruise, personnel and other metadata linkage information (building on GeoLink and existing IEDA partner components). The submission hub will feed submissions to appropriate partner repositories and service endpoints targeted by domain and resource type for distribution. The Alliance governance will adopt patterns (vocabularies, operations, resource types) for self-describing data services using standard HTTP protocol for simplified data access (building on EC GeoWS and other `RESTful' approaches). Exposure of resource descriptions (datasets and service distributions) for harvesting by commercial search engines as well as geoscience-data focused crawlers (like EC B-Cube crawler) will increase discoverability of IEDA resources with minimal effort by curators.
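To make the event-based message bus with a plug-in interface more concrete, the following minimal Python sketch shows how independent capabilities could register for and react to a submission event; the event names and plug-in behaviors are hypothetical stand-ins for components such as file introspection or keyword extraction.

```python
# Minimal sketch of an event-based message bus with a plug-in interface.
# Handler names and event payloads are hypothetical; real pipeline components
# (file introspection, keyword extraction, etc.) would register similarly.
from collections import defaultdict
from typing import Callable, Dict, List

class MessageBus:
    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def register(self, event: str, handler: Callable[[dict], None]) -> None:
        """Plug a new capability into the pipeline for a given event type."""
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: dict) -> None:
        """Deliver a submission event to every registered plug-in."""
        for handler in self._handlers[event]:
            handler(payload)

# Two illustrative plug-ins reacting to a dataset-submission event.
def extract_keywords(payload: dict) -> None:
    print("keywords:", sorted(set(payload["title"].lower().split())))

def assign_identifier(payload: dict) -> None:
    print("would request DOI for:", payload["title"])

bus = MessageBus()
bus.register("dataset.submitted", extract_keywords)
bus.register("dataset.submitted", assign_identifier)
bus.publish("dataset.submitted", {"title": "Seafloor Bathymetry Grid"})
```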
WebGIS based on semantic grid model and web services
NASA Astrophysics Data System (ADS)
Zhang, WangFei; Yue, CaiRong; Gao, JianGuo
2009-10-01
As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Constrained by the Web and by the characteristics of GIS, traditional WebGIS has several prominent problems: it cannot achieve interoperability across heterogeneous spatial databases, and it cannot provide cross-platform data access. The emergence of Web Service and Grid technology has brought great change to the WebGIS field. Web Services provide interfaces that give different sites the ability to share data and communicate with one another. The goal of Grid technology is to turn the Internet into a single large supercomputer that can efficiently support the overall sharing of computing resources, storage resources, data resources, information resources, knowledge resources and expert resources. For WebGIS, however, merely connecting data and information physically is far from enough. Because experts in different fields understand the world differently and follow different professional conventions, policies and habits, they reach different conclusions when observing the same geographic phenomenon, and semantic heterogeneity arises: the same concept can differ greatly across fields. If WebGIS is used without accounting for this semantic heterogeneity, user queries will be answered incorrectly or not at all. To solve this problem, this paper proposes and tests an effective method that combines the semantic grid and Web Services technology to develop WebGIS. We study how to construct ontologies and how to combine Grid technology with Web Services, and, based on a detailed analysis of the computing characteristics and the application model of distributed data, we design an ontology-driven WebGIS query system based on Grid technology and Web Services.
Open NASA Earth Exchange (OpenNEX): A Public-Private Partnership for Climate Change Research
NASA Astrophysics Data System (ADS)
Nemani, R. R.; Lee, T. J.; Michaelis, A.; Ganguly, S.; Votava, P.
2014-12-01
NASA Earth Exchange (NEX) is a data, computing and knowledge collaborative that houses satellite, climate and ancillary data where a community of researchers can come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform with access to large supercomputing resources. As part of broadening the community beyond NASA-funded researchers, NASA, through an agreement with Amazon Inc., made a large collection of climate and Earth science satellite data available to the public. The data, available through the Open NASA Earth Exchange (OpenNEX) platform hosted on the Amazon Web Services (AWS) public cloud, consist of large amounts of global land surface imaging, vegetation conditions, climate observations and climate projections. In addition to the data, users of the OpenNEX platform can also watch lectures from leading experts and learn basic access and use of the available data sets. In order to advance White House initiatives such as Open Data, Big Data and Climate Data and the Climate Action Plan, NASA over the past six months conducted the OpenNEX Challenge. The two-part challenge was designed to engage the public in creating innovative ways to use NASA data and address climate change impacts on economic growth, health and livelihood. Our intention was that the challenges would allow citizen scientists to realize the value of NASA data assets and offer NASA new ideas on how to share and use that data. The first "ideation" challenge, which closed on July 31st, attracted over 450 participants, including climate scientists, hobbyists, citizen scientists, IT experts and app developers. Winning ideas from the first challenge will be incorporated into the second "builder" challenge, currently targeted to launch in mid-August and close by mid-November. The winner(s) will be formally announced at AGU in December of 2014. We will share our experiences and lessons learned over the past year from OpenNEX, a public-private partnership for engaging and enabling a large community of citizen scientists to better understand global climate change and to create climate resilience.
Synchronization of Finite State Shared Resources
1976-03-01
Synchronization of Finite State Shared Resources. Edward A. Schneider, Department of Computer… Abstract: The problem of synchronizing a set of operations defined on a shared resource…
Using Twitter and other social media platforms to provide situational awareness during an incident.
Tobias, Ed
2011-10-01
The recent use of social media by protesters in Iran, Egypt, Yemen and elsewhere has focused new attention on this communications medium. Government agencies and businesses, as well, are using social media to push information to their stakeholders. Those who are on the front lines of this information revolution, however, realise that social media is most effective when the communication is two-way. Unlike other media, social media allows information sharing. This, in turn, provides emergency managers with new situational-awareness resources when trying to mitigate an incident. As Federal Emergency Management Agency (FEMA) Administrator Craig Fugate told Information Week on January 19th, 2011: 'We can adjust much quicker if we can figure out how to have (a) two-way conversation and if we can look at the public as a resource. The public is putting out better situational awareness than many of our own agencies can.' This paper provides examples of how social media can be used as a situational-awareness resource and specific 'tools' that can be used to assist with this task.
30 CFR 250.509 - Well-completion structures on fixed platforms.
Code of Federal Regulations, 2011 CFR
2011-07-01
Well-completion structures on fixed platforms (30 CFR § 250.509, Mineral Resources; Bureau of Ocean Energy Management, Regulation, and…): …stresses to the platform. [53 FR 10690, Apr. 1, 1988, as amended at 54 FR 50616, Dec. 8, 1989. Redesignated…]
30 CFR 250.920 - What are the MMS requirements for assessment of fixed platforms?
Code of Federal Regulations, 2010 CFR
2010-07-01
What are the MMS requirements for assessment of fixed platforms? (30 CFR § 250.920, Mineral Resources; Minerals Management Service, Department of…; Structures; Inspection, Maintenance, and Assessment of Platforms.)
A suite of R packages for web-enabled modeling and analysis of surface waters
NASA Astrophysics Data System (ADS)
Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.
2014-12-01
Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.
U-Compare: share and compare text mining tools with UIMA
Kano, Yoshinobu; Baumgartner, William A.; McCrohon, Luke; Ananiadou, Sophia; Cohen, K. Bretonnel; Hunter, Lawrence; Tsujii, Jun'ichi
2009-01-01
Summary: Due to the increasing number of text mining resources (tools and corpora) available to biologists, interoperability issues between these resources are becoming significant obstacles to using them effectively. UIMA, the Unstructured Information Management Architecture, is an open framework designed to aid in the construction of more interoperable tools. U-Compare is built on top of the UIMA framework, and provides both a concrete framework for out-of-the-box text mining and a sophisticated evaluation platform allowing users to run specific tools on any target text, generating both detailed statistics and instance-based visualizations of outputs. U-Compare is a joint project, providing the world's largest, and still growing, collection of UIMA-compatible resources. These resources, originally developed by different groups for a variety of domains, include many famous tools and corpora. U-Compare can be launched straight from the web, without needing to be manually installed. All U-Compare components are provided ready-to-use and can be combined easily via a drag-and-drop interface without any programming. External UIMA components can also simply be mixed with U-Compare components, without distinguishing between locally and remotely deployed resources. Availability: http://u-compare.org/ Contact: kano@is.s.u-tokyo.ac.jp PMID:19414535
NASA Technical Reports Server (NTRS)
Staten, B.; Moyer, E.; Vizir, V.; Gompf, H.; Hoban-Higgins, T.; Lewis, L.; Ronca, A.; Fuller, C. A.
2016-01-01
Biospecimen Sharing Programs (BSPs) have been organized by NASA Ames Research Center since the 1960s with the goal of maximizing utilization and scientific return from rare, complex and costly spaceflight experiments. BSPs involve acquiring otherwise unused biological specimens from primary space research experiments for distribution to secondary experiments. Here we describe a collaboration leveraging Ames expertise in biospecimen sharing to magnify the scientific impact of research informing astronaut health funded by the NASA Human Research Program (HRP) Human Health Countermeasures (HHC) Element. The concept expands biospecimen sharing to one-off ground-based studies utilizing analogue space platforms (e.g., Hindlimb Unloading (HLU), Artificial Gravity) for rodent experiments, thereby significantly broadening the range of research opportunities with translational relevance for protecting human health in space and on Earth.
NASA Astrophysics Data System (ADS)
Xiong, Ting; He, Zhiwen
2017-06-01
Cloud computing was first proposed by Google in the United States as an Internet-centred approach that provides a standard, open way of sharing network services. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs; cloud computing, which uses Internet technology to provide shared resources, has therefore become an important means of sharing applications in digital education. Against this background, the paper analyses the existing problems in sharing digital educational resources among independent colleges in Jiangxi Province. Drawing on cloud computing's characteristics of mass storage, efficient operation and low cost, the author explores the design of a sharing model for the digital educational resources of higher education in independent colleges, and finally applies the model in practice.
Development of mobile platform integrated with existing electronic medical records.
Kim, YoungAh; Kim, Sung Soo; Kang, Simon; Kim, Kyungduk; Kim, Jun
2014-07-01
This paper describes a mobile Electronic Medical Record (EMR) platform designed to manage and utilize the existing EMR and mobile application with optimized resources. We structured the mEMR to reuse services of retrieval and storage in mobile app environments that have already proven to have no problem working with EMRs. A new mobile architecture-based mobile solution was developed in four steps: the construction of a server and its architecture; screen layout and storyboard making; screen user interface design and development; and a pilot test and step-by-step deployment. This mobile architecture consists of two parts, the server-side area and the client-side area. In the server-side area, it performs the roles of service management for EMR and documents and for information exchange. Furthermore, it performs menu allocation depending on user permission and automatic clinical document architecture document conversion. Currently, Severance Hospital operates an iOS-compatible mobile solution based on this mobile architecture and provides stable service without additional resources, dealing with dynamic changes of EMR templates. The proposed mobile solution should go hand in hand with the existing EMR system, and it can be a cost-effective solution if a quality EMR system is operated steadily with this solution. Thus, we expect this example to be shared with hospitals that currently plan to deploy mobile solutions.
Webb, Adam J; Thorisson, Gudmundur A; Brookes, Anthony J
2011-05-01
Explosive growth in the generation of genotype-to-phenotype (G2P) data necessitates a concerted effort to tackle the logistical and informatics challenges this presents. The GEN2PHEN Project represents one such effort, with a broad strategy of uniting disparate G2P resources into a hybrid centralized-federated network. This is achieved through a holistic strategy focussed on three overlapping areas: data input standards and pipelines through which to submit and collect data (data in); federated, independent, extendable, yet interoperable database platforms on which to store and curate widely diverse datasets (data storage); and data formats and mechanisms with which to exchange, combine, and extract data (data exchange and output). To fully leverage this data network, we have constructed the "G2P Knowledge Centre" (http://www.gen2phen.org). This central platform provides holistic searching of the G2P data domain allied with facilities for data annotation and user feedback, access to extensive G2P and informatics resources, and tools for constructing online working communities centered on the G2P domain. Through the efforts of GEN2PHEN, and through combining data with broader community-derived knowledge, the Knowledge Centre opens up exciting possibilities for organizing, integrating, sharing, and interpreting new waves of G2P data in a collaborative fashion. © 2011 Wiley-Liss, Inc.
ISS--an electronic syndromic surveillance system for infectious disease in rural China.
Yan, Weirong; Palm, Lars; Lu, Xin; Nie, Shaofa; Xu, Biao; Zhao, Qi; Tao, Tao; Cheng, Liwei; Tan, Li; Dong, Hengjin; Diwan, Vinod K
2013-01-01
Syndromic surveillance systems have great advantages in promoting the early detection of epidemics and reducing the need for disease confirmation, and they are especially effective for surveillance in resource-poor settings. However, most current syndromic surveillance systems are established in developed countries, and there are very few reports on the development of an electronic syndromic surveillance system in resource-constrained settings. This study describes the design and pilot implementation of an electronic surveillance system (ISS) for the early detection of infectious disease epidemics in rural China, complementing the conventional case report surveillance system. ISS was developed based on an existing platform, the 'Crisis Information Sharing Platform' (CRISP), combined with modern communication and GIS technology. ISS has four interconnected functions: 1) work group and communication group; 2) data source and collection; 3) data visualization; and 4) outbreak detection and alerting. As of January 31st, 2012, ISS had been installed and pilot tested for six months in four counties in rural China. In total, 95 health facilities, 14 pharmacies and 24 primary schools participated in the pilot study, entering respectively 74,256, 79,701, and 2330 daily records into the central database. More than 90% of surveillance units at the study sites are able to send daily information into the system. In the paper, we also present pilot data from health facilities in two counties, which showed the ISS system had the potential to identify changes in disease patterns at the community level. The ISS platform may facilitate the early detection of infectious disease epidemics as it provides near real-time syndromic data collection, interactive visualization, and automated aberration detection. However, several constraints and challenges were encountered during the pilot implementation of ISS in rural China.
The Data Platform for Climate Research and Action: Introducing Climate Watch
NASA Astrophysics Data System (ADS)
Hennig, R. J.; Ge, M.; Friedrich, J.; Lebling, K.; Carlock, G.; Arcipowska, A.; Mangan, E.; Biru, H.; Tankou, A.; Chaudhury, M.
2017-12-01
The Paris Agreement, adopted through Decision 1/CP.21, brings all nations together to take on ambitious efforts to combat climate change. Open access to climate data supporting climate research, advancing knowledge, and informing decision making is key to encouraging and strengthening efforts of stakeholders at all levels to address and respond to the effects of climate change. Climate Watch is a robust online data platform developed in response to the urgent need for knowledge and tools to empower climate research and action, including those of researchers, policy makers, the private sector, civil society, and all other non-state actors. Building on the rapidly growing technology of open data and information sharing, Climate Watch is equipped with an extensive amount of climate data, informative visualizations, a concise yet efficient user interface, and connections to the resources users need to gather insightful information on national and global progress towards delivering on the objective of the Convention and the Paris Agreement. Climate Watch brings together hundreds of quantitative and qualitative indicators that are easy to explore, visualize, compare, and download at global, national, and sectoral levels: greenhouse gas (GHG) emissions for more than 190 countries over the 1850-2014 time period, covering all seven Kyoto gases following IPCC source/sink categories; structured information on over 150 NDCs facilitating the clarity, understanding and transparency of countries' contributions to address climate change; over 6500 identified linkages between climate actions in NDCs across the 169 targets of the sustainable development goals (SDGs); over 200 indicators describing low carbon pathways from models and scenarios by integrated assessment models (IAMs) and national sources; and data on vulnerability and risk, policies, finance, and many more. The Climate Watch platform is developed as part of broader efforts within the World Resources Institute and the NDC Partnership, in collaboration with GIZ, UNFCCC, the World Bank, and Climate Analytics.
Web-based reactive transport modeling using PFLOTRAN
NASA Astrophysics Data System (ADS)
Zhou, H.; Karra, S.; Lichtner, P. C.; Versteeg, R.; Zhang, Y.
2017-12-01
Actionable understanding of system behavior in the subsurface is required for a wide spectrum of societal and engineering needs by commercial firms, government entities, and academia. These needs include, for example, water resource management, precision agriculture, contaminant remediation, unconventional energy production, CO2 sequestration monitoring, and climate studies. Such understanding requires the ability to numerically model various coupled processes that occur across different temporal and spatial scales as well as multiple physical domains (reservoirs - overburden, surface-subsurface, groundwater-surface water, saturated-unsaturated zone). Currently, this ability is typically met through an in-house approach where computational resources, model expertise, and data for model parameterization are brought together to meet modeling needs. However, such an approach has multiple drawbacks which limit the application of high-end reactive transport codes such as the Department of Energy funded PFLOTRAN code. In addition, while many end users have a need for the capabilities provided by high-end reactive transport codes, they do not have the expertise - nor the time required to obtain the expertise - to effectively use these codes. We have developed and are actively enhancing a cloud-based software platform through which diverse users are able to easily configure, execute, visualize, share, and interpret PFLOTRAN models. This platform consists of a web application and on-demand HPC computational infrastructure. The web application consists of (1) a browser-based graphical user interface which allows users to configure models and visualize results interactively, and (2) a central server with back-end relational databases which hold configuration, data, modeling results, and Python scripts for model configuration, together with (3) an HPC environment for on-demand model execution. We will discuss lessons learned in the development of this platform, the rationale for different interfaces, implementation choices, as well as the planned path forward.
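The platform's programmatic interface is not specified here; purely as an illustration of the configure-and-launch pattern such a service implies, a client interaction might look like the following sketch, in which the base URL, endpoint paths and JSON fields are all hypothetical.

```python
# Hypothetical sketch of a client talking to a web-based modeling service.
# The base URL, endpoint paths, and JSON fields are invented for illustration;
# they are not the actual interface of the platform described above.
import requests

BASE = "https://example.org/modeling-api"    # placeholder service URL

config = {
    "simulation": "tracer_transport",        # hypothetical model template name
    "duration_years": 10,
    "grid": {"nx": 100, "ny": 100, "nz": 20},
}

# Submit a model configuration, then poll the run's status once.
run = requests.post(f"{BASE}/runs", json=config, timeout=30).json()
status = requests.get(f"{BASE}/runs/{run['id']}", timeout=30).json()
print("run state:", status.get("state"))
```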
Robinson, Elva J.H.
2016-01-01
Resource sharing is an important cooperative behavior in many animals. Sharing resources is particularly important in social insect societies, as division of labor often results in most individuals including, importantly, the reproductives, relying on other members of the colony to provide resources. Sharing resources between individuals is therefore fundamental to the success of social insects. Resource sharing is complicated if a colony inhabits several spatially separated nests, a nesting strategy common in many ant species. Resources must be shared not only between individuals in a single nest but also between nests. We investigated the behaviors facilitating resource redistribution between nests in a dispersed-nesting population of wood ant Formica lugubris. We marked ants, in the field, as they transported resources along the trails between nests of a colony, to investigate how the behavior of individual workers relates to colony-level resource exchange. We found that workers from a particular nest “forage” to other nests in the colony, treating them as food sources. Workers treating other nests as food sources means that simple, pre-existing foraging behaviors are used to move resources through a distributed system. It may be that this simple behavioral mechanism facilitates the evolution of this complex life-history strategy. PMID:27004016
Parallel k-means++ for Multiple Shared-Memory Architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mackey, Patrick S.; Lewis, Robert R.
2016-09-22
In recent years k-means++ has become a popular initialization technique for improved k-means clustering. To date, most of the work done to improve its performance has involved parallelizing algorithms that are only approximations of k-means++. In this paper we present a parallelization of the exact k-means++ algorithm, with a proof of its correctness. We develop implementations for three distinct shared-memory architectures: multicore CPU, high performance GPU, and the massively multithreaded Cray XMT platform. We demonstrate the scalability of the algorithm on each platform. In addition we present a visual approach for showing which platform performed k-means++ the fastest for varying data sizes.
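For readers unfamiliar with the seeding step being parallelized, a sequential reference version of exact k-means++ initialization is sketched below in NumPy; it is only the single-threaded baseline, not the authors' multicore, GPU, or Cray XMT implementations.

```python
# Sequential reference implementation of exact k-means++ seeding.
# The paper parallelizes this exact procedure; this sketch is only the
# single-threaded baseline for clarity.
import numpy as np

def kmeans_pp_init(points: np.ndarray, k: int, seed=None) -> np.ndarray:
    rng = np.random.default_rng(seed)
    n = points.shape[0]
    centers = [points[rng.integers(n)]]      # first center: chosen uniformly at random
    for _ in range(1, k):
        c = np.asarray(centers)
        # Squared distance from each point to its nearest chosen center.
        d2 = ((points[:, None, :] - c[None, :, :]) ** 2).sum(axis=-1).min(axis=1)
        # Next center sampled with probability proportional to D^2.
        probs = d2 / d2.sum()
        centers.append(points[rng.choice(n, p=probs)])
    return np.asarray(centers)

data = np.random.default_rng(0).normal(size=(1000, 2))
print(kmeans_pp_init(data, k=5).shape)  # (5, 2)
```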
Purawat, Shweta; Cowart, Charles; Amaro, Rommie E; Altintas, Ilkay
2017-05-01
The BBDTC (https://biobigdata.ucsd.edu) is a community-oriented platform to encourage high-quality knowledge dissemination with the aim of growing a well-informed biomedical big data community through collaborative efforts on training and education. The BBDTC is an e-learning platform that empowers the biomedical community to develop, launch and share open training materials. It deploys hands-on software training toolboxes through virtualization technologies such as Amazon EC2 and Virtualbox. The BBDTC facilitates migration of courses across other course management platforms. The framework encourages knowledge sharing and content personalization through the playlist functionality that enables unique learning experiences and accelerates information dissemination to a wider community.
Raieli, Vincenzo; Correnti, E; Sandullo, A; Romano, M; Marchese, F; Loiacono, C; Brighina, Filippo
It is crucial that all headache specialists receive adequate training. Considering the unsatisfactory results obtained with standard updating courses and the growing need for continuing professional education, a digital platform was developed as a training tool. The platform has been active since 1 October 2014. It is readily accessible to doctors by free registration. Users have access to all the material available on the platform, which includes scientific articles, e-books, presentations and images. Users can share their own material and clinical cases directly. At the time of this study, the platform had 37 users. In the second year following its launch 316 files were downloaded and five discussions were started. These saw 22 contributions. Fifteen of the 37 members did not perform any action on the platform. In total, 74 files were uploaded in the second year of activity, but 90% of the contributions came from a very small group of users. There were no significant differences in use of the platform between members of the Italian Society for the Study of Headache and other specialists. Even though the platform appears to be an easily accessible, interactive and inexpensive instrument, the higher number of downloads than uploads suggests that it is used passively.
Architectural design and support for knowledge sharing across heterogeneous MAST systems
NASA Astrophysics Data System (ADS)
Arkin, Ronald C.; Garcia-Vergara, Sergio; Lee, Sung G.
2012-06-01
A novel approach for the sharing of knowledge between widely heterogeneous robotic agents is presented, drawing upon Gardenfors Conceptual Spaces approach [4]. The target microrobotic platforms considered are computationally, power, sensor, and communications impoverished compared to more traditional robotics platforms due to their small size. This produces novel challenges for the system to converge on an interpretation of events within the world, in this case specifically focusing on the task of recognizing the concept of a biohazard in an indoor setting.
Integrating Telco interoffice fiber transport with coaxial distribution
NASA Astrophysics Data System (ADS)
McCarthy, Steven M.
1993-02-01
Real success in the residential broadband market is contingent on a platform that most efficiently shares broadband port costs while at the same time affords us an elegant, and cost efficient, upgrade from today's analog to tomorrow's digital world. Spectrum transport, whether it be over new or existing fiber/coax systems or FTTC, is that platform. It is compatible with today's home entertainment market, can be evolved to future digital transport, and effectively shares the cost of interfacing with a broadband network.
BrainLiner: A Neuroinformatics Platform for Sharing Time-Aligned Brain-Behavior Data
Takemiya, Makoto; Majima, Kei; Tsukamoto, Mitsuaki; Kamitani, Yukiyasu
2016-01-01
Data-driven neuroscience aims to find statistical relationships between brain activity and task behavior from large-scale datasets. To facilitate high-throughput data processing and modeling, we created BrainLiner as a web platform for sharing time-aligned, brain-behavior data. Using an HDF5-based data format, BrainLiner treats brain activity and data related to behavior with the same salience, aligning both behavioral and brain activity data on a common time axis. This facilitates learning the relationship between behavior and brain activity. Using a common data file format also simplifies data processing and analyses. Properties describing data are unambiguously defined using a schema, allowing machine-readable definition of data. The BrainLiner platform allows users to upload and download data, as well as to explore and search for data from the web platform. A WebGL-based data explorer can visualize highly detailed neurophysiological data from within the web browser, and a data-driven search feature allows users to search for similar time windows of data. This increases transparency, and allows for visual inspection of neural coding. BrainLiner thus provides an essential set of tools for data sharing and data-driven modeling. PMID:26858636
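The precise BrainLiner HDF5 schema is not reproduced in the abstract; the sketch below simply illustrates, with h5py, the underlying idea of storing brain activity and behavioral signals against a common time axis. Group and dataset names, sampling rates, and attributes are assumptions for the example.

```python
# Sketch of time-aligned brain/behavior storage in HDF5 (h5py).
# Group/dataset names and sampling details are illustrative assumptions,
# not the actual BrainLiner schema.
import h5py
import numpy as np

fs = 100.0                                   # shared sampling rate (Hz)
t = np.arange(0, 10, 1.0 / fs)               # common time axis, 10 s
brain = np.random.randn(t.size, 64)          # e.g., 64 recording channels
behavior = (np.sin(2 * np.pi * 0.5 * t) > 0).astype(np.int8)  # e.g., task state

with h5py.File("session01.h5", "w") as f:
    f.create_dataset("time", data=t)
    f.create_group("brain").create_dataset("activity", data=brain)
    f.create_group("behavior").create_dataset("task_state", data=behavior)
    # Metadata attributes make the file self-describing.
    f["brain/activity"].attrs["sampling_rate_hz"] = fs
    f["behavior/task_state"].attrs["sampling_rate_hz"] = fs
```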
Climate@Home: Crowdsourcing Climate Change Research
NASA Astrophysics Data System (ADS)
Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.
2011-12-01
Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trends. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication media for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large computing processing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, investments in new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data is atomized and distributed to computing engines that are running on citizen participants' computers. Scientists will receive notifications on the completion of computing tasks, and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays geographic locations of the participants and the status of tasks on each client node. A group of users have been invited to test functions such as forums, blogs, and computing resource monitoring.
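As a toy illustration of the atomize-and-distribute idea described above (a coordinator splits a run into small independent tasks that separate worker processes execute), consider the following Python sketch; the task structure and the stand-in "model step" are invented for the example and do not represent the project's actual computing engine.

```python
# Toy illustration of atomizing a computation into small tasks and farming
# them out to independent workers, as a volunteer-computing platform would.
# The "model step" below is a stand-in, not an actual climate model.
from multiprocessing import Pool

def model_step(task):
    """Pretend unit of work: one (region, year) cell of a larger run."""
    region, year = task
    return region, year, (hash((region, year)) % 1000) / 1000.0  # fake output

if __name__ == "__main__":
    tasks = [(r, y) for r in ("NA", "EU", "AS") for y in range(2000, 2005)]
    with Pool(processes=4) as pool:            # workers stand in for volunteer PCs
        results = pool.map(model_step, tasks)  # distribute atomized tasks
    print(len(results), "task results collected")
```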
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
What is the Platform Verification Program? (30 CFR § 250.909, Mineral Resources; Platforms and Structures, Platform Verification Program.) The Platform Verification Program is the MMS approval process for ensuring that floating platforms…
Positive train control shared network.
DOT National Transportation Integrated Search
2015-05-01
The Interoperable Train Control (ITC) Positive : Train Control (PTC) Shared Network (IPSN) : project investigated anticipated industry benefits : and the level of support for the development of : a hosted technological platform for PTC : messaging ac...
NASA Astrophysics Data System (ADS)
Rösel, Anja; Pavlov, Alexey K.; Granskog, Mats A.; Gerland, Sebastian; Meyer, Amelie; Hudson, Stephen R.; King, Jennifer; Itkin, Polona; Cohen, Lana; Dodd, Paul; de Steur, Laura
2016-04-01
The findings of climate science need to be communicated to the general public. Researchers are encouraged to do so by journalists, policy-makers and funding agencies, and many of us want to become better science communicators. But how can we do this at the lab or small research group level without specifically allocated resources in terms of funds and communication officers? And how do we sustain communication on a regular basis and not just during the limited lifetime of a specific project? One of the solutions is to use the emerging platform of social media, which has become a powerful and inexpensive tool for communicating science to different target audiences. Many research institutions and individual researchers are already advanced users of social media, but small research groups and labs remain underrepresented. The group of oceanographers, sea ice and atmospheric scientists at the Norwegian Polar Institute (@OceanSeaIceNPI) will share our experiences developing and maintaining researcher-driven outreach for over a year through Instagram, Twitter and Facebook. We will present our solutions to some of the practical considerations such as identifying key target groups, defining the framework for sharing responsibilities and interactions within the research group, and choosing an up-to-date and appropriate social medium. By sharing this information, we aim to inspire and assist other research groups and labs in conducting their own effective science communication.
Ross, Stephen E; Johnson, Kevin B; Siek, Katie A; Gordon, Jeffry S; Khan, Danish U; Haverhals, Leah M
2011-07-12
Adverse drug events are a major safety issue in ambulatory care. Improving medication self-management could reduce these adverse events. Researchers have developed medication applications for tethered personal health records (PHRs), but little has been reported about medication applications for interoperable PHRs. Our objective was to develop two complementary personal health applications on a common PHR platform: one to assist children with complex health needs (MyMediHealth), and one to assist older adults in care transitions (Colorado Care Tablet). The applications were developed using a user-centered design approach. The two applications shared a common PHR platform based on a service-oriented architecture. MyMediHealth employed Web and mobile phone user interfaces. Colorado Care Tablet employed a Web interface customized for a tablet PC. We created complementary medication management applications tailored to the needs of distinctly different user groups using common components. Challenges were addressed in multiple areas, including how to encode medication identities, how to incorporate knowledge bases for medication images and consumer health information, how to include supplementary dosing information, how to simplify user interfaces for older adults, and how to support mobile devices for children. These prototypes demonstrate the utility of abstracting PHR data and services (the PHR platform) from applications that can be tailored to meet the needs of diverse patients. Based on the challenges we faced, we provide recommendations on the structure of publicly available knowledge resources and the use of mobile messaging systems for PHR applications.
A data platform to improve rabies prevention, Sri Lanka.
De Silva, A Pubudu; Harischandra, Pa Lionel; Beane, Abi; Rathnayaka, Shriyananda; Pimburage, Ruwini; Wijesiriwardana, Wageesha; Gamage, Dilanthi; Jayasinghe, Desika; Sigera, Chathurani; Gunasekara, Amila; Cadre, Mizaya; Amunugama, Sarath; Athapattu, Priyantha L; Jayasinghe, K Saroj A; Dondorp, Arjen M; Haniffa, Rashan
2017-09-01
In Sri Lanka, rabies prevention initiatives are hindered by fragmented and delayed information-sharing that limits clinicians' ability to follow patients and impedes public health surveillance. In a project led by the health ministry, we adapted existing technologies to create an electronic platform for rabies surveillance. Information is entered by trained clinical staff, and both aggregate and individual patient data are visualized in real time. An automated short message system (SMS) alerts patients for vaccination follow-up appointments and informs public health inspectors about incidents of animal bites. The platform was rolled out in June 2016 in four districts of Sri Lanka, linking six rabies clinics, three laboratories and the public health inspectorate. Over a 9-month period, 12 121 animal bites were reported to clinics and entered in the registry. Via secure portals, clinicians and public health teams accessed live information on treatment and outcomes of patients started on post-exposure prophylaxis (9507) or receiving deferred treatment (2614). Laboratories rapidly communicated the results of rabies virus tests on dead mammals (328/907 positive). In two pilot districts SMS reminders were sent to 1376 (71.2%) of 1933 patients whose contact details were available. Daily SMS reports alerted 17 public health inspectors to bite incidents in their area for investigation. Existing technologies in low-resource countries can be harnessed to improve public health surveillance. Investment is needed in platform development and training and support for front-line staff. Greater public engagement is needed to improve completeness of surveillance and treatment.
26 CFR 1.482-0T - Outline of regulations under section 482 (temporary).
Code of Federal Regulations, 2011 CFR
2011-04-01
... accounting requirements. (i) In general. (ii) Reliance on financial accounting. (4) CSA reporting...). (a) In general. (1) RAB share method for cost sharing transactions (CSTs). (2) Methods for platform... accepted accounting principles. (4) Time and manner of making the election. (C) Consistency. (4) IDC share...
Conceptualising Online Knowledge Sharing: What Teachers' Perceptions Can Tell Us
ERIC Educational Resources Information Center
Hood, Nina
2017-01-01
This study questions the current dependence on theories of social learning and communities of practice in research on teachers' online learning and online knowledge-sharing behaviour. It employs the interpretative approach to examine how teachers conceptualise their engagement with two USA-based online knowledge-sharing platforms within the…
The Role of Social Media Tools: Accessible Tourism for Disabled Citizens
ERIC Educational Resources Information Center
Altinay, Zehra; Saner, Tulen; Bahçelerli, Nesrin M.; Altinay, Fahriye
2016-01-01
Knowledge sharing is important for accomplishing digital citizenship, and social media tools have become popular for sharing and diffusing knowledge in an increasingly digital world. Social media learning and knowledge-sharing platforms provide access to services within societies, especially for disabled citizens. This research study aims to evaluate…
Sharing Music and Culture through Singing in Australia
ERIC Educational Resources Information Center
Joseph, Dawn
2009-01-01
This article discusses the notion of sharing music and culture as an effective platform to celebrate diversity in Melbourne, Australia. My research project "Celebrating Music Making and Finding Meaning" investigates and illustrates a context of diversity, one that promotes respect in a multicultural society sharing music and culture of a…
The content of social media's shared images about Ebola: a retrospective study.
Seltzer, E K; Jean, N S; Kramer-Golinkoff, E; Asch, D A; Merchant, R M
2015-09-01
Social media have strongly influenced awareness and perceptions of public health emergencies, but a considerable amount of social media content is now carried through images, rather than just text. This study's objective is to explore how image-sharing platforms are used for information dissemination in public health emergencies. Retrospective review of images posted on two popular image-sharing platforms to characterize public discourse about Ebola. Using the keyword '#ebola' we identified a 1% sample of images posted on Instagram and Flickr across two sequential weeks in November 2014. Images from both platforms were independently coded by two reviewers and characterized by themes. We reviewed 1217 images posted on Instagram and Flickr and identified themes. Nine distinct themes were identified. These included: images of health care workers and professionals [308 (25%)], West Africa [75 (6%)], the Ebola virus [59 (5%)], and artistic renderings of Ebola [64 (5%)]. Also identified were images with accompanying embedded text related to Ebola and associated: facts [68 (6%)], fears [40 (3%)], politics [46 (4%)], and jokes [284 (23%)]. Several [273 (22%)] images were unrelated to Ebola or its sequelae. Instagram images were primarily coded as jokes [255 (42%)] or unrelated [219 (36%)], while Flickr images primarily depicted health care workers and other professionals [281 (46%)] providing care or other services for prevention or treatment. Image sharing platforms are being used for information exchange about public health crises, like Ebola. Use differs by platform and discerning these differences can help inform future uses for health care professionals and researchers seeking to assess public fears and misinformation or provide targeted education/awareness interventions. Copyright © 2015 The Royal Institute of Public Health. All rights reserved.
Being Sticker Rich: Numerical Context Influences Children’s Sharing Behavior
Posid, Tasha; Fazio, Allyse; Cordes, Sara
2015-01-01
Young children spontaneously share resources with anonymous recipients, but little is known about the specific circumstances that promote or hinder these prosocial tendencies. Children (ages 3–11) received a small (12) or large (30) number of stickers, and were then given the opportunity to share their windfall with either one or multiple anonymous recipients (Dictator Game). Whether a child chose to share or not varied as a function of age, but was uninfluenced by numerical context. Moreover, children’s giving was consistent with a proportion-based account, such that children typically donated a similar proportion (but different absolute number) of the resources given to them, regardless of whether they originally received a small or large windfall. The proportion of resources donated, however, did vary based on the number of recipients with whom they were allowed to share, such that on average, children shared more when there were more recipients available, particularly when they had more resources, suggesting they take others into consideration when making prosocial decisions. Finally, results indicated that a child’s gender also predicted sharing behavior, with males generally sharing more resources than females. Together, findings suggest that the numerical contexts under which children are asked to share, as well as the quantity of resources that they have to share, may interact to promote (or hinder) altruistic behaviors throughout childhood. PMID:26535900
A Trusted Platform for Transportation Data Sharing & Stakeholder Engagement
DOT National Transportation Integrated Search
2018-03-01
Information sharing to support critical transportation systems presents numerous challenges given the diversity of information sources and visual representations typically used to portray system performance and characteristics. This research projec...
38 CFR 17.240 - Sharing specialized medical resources.
Code of Federal Regulations, 2012 CFR
2012-07-01
... medical resources. 17.240 Section 17.240 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS MEDICAL Sharing of Medical Facilities, Equipment, and Information § 17.240 Sharing specialized medical resources. Subject to such terms and conditions as the Under Secretary for Health shall prescribe...
38 CFR 17.240 - Sharing specialized medical resources.
Code of Federal Regulations, 2010 CFR
2010-07-01
... medical resources. 17.240 Section 17.240 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS MEDICAL Sharing of Medical Facilities, Equipment, and Information § 17.240 Sharing specialized medical resources. Subject to such terms and conditions as the Under Secretary for Health shall prescribe...
cMapper: gene-centric connectivity mapper for EBI-RDF platform.
Shoaib, Muhammad; Ansari, Adnan Ahmad; Ahn, Sung-Min
2017-01-15
In this era of biological big data, data integration has become a common task and a challenge for biologists. The Resource Description Framework (RDF) was developed to enable interoperability of heterogeneous datasets. The EBI-RDF platform enables efficient data integration of six independent biological databases using RDF technologies and shared ontologies. However, to take advantage of this platform, biologists need to be familiar with RDF technologies and the SPARQL query language. To overcome this practical limitation of the EBI-RDF platform, we developed cMapper, a web-based tool that enables biologists to search the EBI-RDF databases in a gene-centric manner without a thorough knowledge of RDF and SPARQL. cMapper allows biologists to search data entities in the EBI-RDF platform that are connected to genes or small molecules of interest in multiple biological contexts. The input to cMapper consists of a set of genes or small molecules, and the output consists of data entities in six independent EBI-RDF databases connected with the given genes or small molecules in the user's query. cMapper provides output to users in the form of a graph in which nodes represent data entities and the edges represent connections between data entities and the input set of genes or small molecules. Furthermore, users can apply filters based on database, taxonomy, organ and pathways in order to focus on a core connectivity graph of interest. Data entities from multiple databases are differentiated based on background colors. cMapper also enables users to investigate shared connections between genes or small molecules of interest. Users can view the output graph in a web browser or download it in either GraphML or JSON format. cMapper is available as a web application with an integrated MySQL database. The web application was developed using Java and deployed on a Tomcat server. We developed the user interface using HTML5, jQuery and the Cytoscape Graph API. cMapper can be accessed at http://cmapper.ewostech.net. Readers can download the development manual from http://cmapper.ewostech.net/docs/cMapperDocumentation.pdf. Source code is available at https://github.com/muhammadshoaib/cmapper. Contact: smahn@gachon.ac.kr. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
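As an illustration of the kind of gene-centric SPARQL lookup that a tool like cMapper hides behind its web interface, the following Python sketch queries an RDF endpoint with SPARQLWrapper. The endpoint URL, the query shape, and the example gene label are assumptions for demonstration only and are not taken from the cMapper paper.

# Illustrative sketch (not cMapper's own code): a gene-centric label search
# against an RDF SPARQL endpoint. Endpoint URL and query are assumptions.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://www.ebi.ac.uk/rdf/services/sparql"  # assumed endpoint location

def entities_matching_gene(gene_label, limit=20):
    """Return (entity, label) pairs whose rdfs:label mentions the gene label."""
    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setQuery(f"""
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?entity ?label WHERE {{
            ?entity rdfs:label ?label .
            FILTER(CONTAINS(LCASE(STR(?label)), LCASE("{gene_label}")))
        }} LIMIT {limit}
    """)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()
    return [(b["entity"]["value"], b["label"]["value"])
            for b in results["results"]["bindings"]]

if __name__ == "__main__":
    for entity, label in entities_matching_gene("BRCA1"):
        print(entity, label)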
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.
Simonyan, Vahan; Mazumder, Raja
2014-09-30
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.
Public health practice course using Google Plus.
Wu, Ting-Ting; Sung, Tien-Wen
2014-03-01
In recent years, mobile device-assisted clinical education has become popular among nursing school students. The introduction of mobile devices saves manpower and reduces errors while enhancing nursing students' professional knowledge and skills. To respond to the demands of various learning strategies and to maintain existing systems of education, the concept of Cloud Learning is gradually being introduced to instructional environments. Cloud computing facilitates learning that is personalized, diverse, and virtual. This study involved assessing the advantages of mobile devices and Cloud Learning in a public health practice course, in which Google+ was used as the learning platform, integrating various application tools. Users could save and access data by using any wireless Internet device. The platform was student centered and based on resource sharing and collaborative learning. With the assistance of highly flexible and convenient technology, certain obstacles in traditional practice training can be resolved. Our findings showed that the students who adopted Google+ learned more effectively compared with those who were limited to traditional learning systems. Most students and the nurse educator expressed a positive attitude toward, and were satisfied with, the innovative learning method.
Integration of robotic resources into FORCEnet
NASA Astrophysics Data System (ADS)
Nguyen, Chinh; Carroll, Daniel; Nguyen, Hoa
2006-05-01
The Networked Intelligence, Surveillance, and Reconnaissance (NISR) project integrates robotic resources into Composeable FORCEnet to control and exploit unmanned systems over extremely long distances. The foundations are built upon FORCEnet-the U.S. Navy's process to define C4ISR for net-centric operations-and the Navy Unmanned Systems Common Control Roadmap to develop technologies and standards for interoperability, data sharing, publish-and-subscribe methodology, and software reuse. The paper defines the goals and boundaries for NISR with focus on the system architecture, including the design tradeoffs necessary for unmanned systems in a net-centric model. Special attention is given to two specific scenarios demonstrating the integration of unmanned ground and water surface vehicles into the open-architecture web-based command-and-control information-management system of Composeable FORCEnet. Planned spiral development for NISR will improve collaborative control, expand robotic sensor capabilities, address multiple domains including underwater and aerial platforms, and extend distributive communications infrastructure for battlespace optimization for unmanned systems in net-centric operations.
Models and Simulations as a Service: Exploring the Use of Galaxy for Delivering Computational Models
Walker, Mark A.; Madduri, Ravi; Rodriguez, Alex; Greenstein, Joseph L.; Winslow, Raimond L.
2016-01-01
We describe the ways in which Galaxy, a web-based reproducible research platform, can be used for web-based sharing of complex computational models. Galaxy allows users to seamlessly customize and run simulations on cloud computing resources, a concept we refer to as Models and Simulations as a Service (MaSS). To illustrate this application of Galaxy, we have developed a tool suite for simulating a high spatial-resolution model of the cardiac Ca2+ spark that requires supercomputing resources for execution. We also present tools for simulating models encoded in the SBML and CellML model description languages, thus demonstrating how Galaxy’s reproducible research features can be leveraged by existing technologies. Finally, we demonstrate how the Galaxy workflow editor can be used to compose integrative models from constituent submodules. This work represents an important novel approach, to our knowledge, to making computational simulations more accessible to the broader scientific community. PMID:26958881
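To make the Models-and-Simulations-as-a-Service pattern described above concrete, the following Python sketch drives a Galaxy server programmatically through the bioblend client library. The server URL, API key, history name, tool id, and tool parameters are placeholders, not identifiers from the paper.

# Sketch of launching a simulation tool on a Galaxy server via bioblend.
# All identifiers below are placeholders; only the bioblend calls are real.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

# Create a history to hold simulation inputs and outputs.
history = gi.histories.create_history(name="ca-spark-simulation")

# Run a (hypothetical) simulation tool, passing parameter values as a dict.
result = gi.tools.run_tool(
    history_id=history["id"],
    tool_id="cardiac_spark_sim",                     # placeholder tool id
    tool_inputs={"duration_ms": 50, "ryr_open_prob": 0.02},
)
# The response typically lists the datasets created by the run.
print(result["outputs"])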
The essential nature of sharing in science.
Fischer, Beth A; Zigmond, Michael J
2010-12-01
Advances in science are the combined result of the efforts of a great many scientists, and in many cases, their willingness to share the products of their research. These products include data sets, both small and large, and unique research resources not commercially available, such as cell lines and software programs. The sharing of these resources enhances both the scope and the depth of research, while making more efficient use of time and money. However, sharing is not without costs, many of which are borne by the individual who develops the research resource. Sharing, for example, reduces the uniqueness of the resources available to a scientist, potentially influencing the originator's perceived productivity and ultimately his or her competitiveness for jobs, promotions, and grants. Nevertheless, for most researchers-particularly those using public funds-sharing is no longer optional but must be considered an obligation to science, the funding agency, and ultimately society at large. Most funding agencies, journals, and professional societies now require a researcher who has published work involving a unique resource to make that resource available to other investigators. Changes could be implemented to mitigate some of the costs. The creator of the resource could explore the possibility of collaborating with those who request it. In addition, institutions that employ and fund researchers could change their policies and practices to make sharing a more attractive and viable option. For example, when evaluating an individual's productivity, institutions could provide credit for the impact a researcher has had on their field through the provision of their unique resources to other investigators, regardless of whether that impact is reflected in the researcher's list of publications. In addition, increased funding for the development and maintenance of user-friendly public repositories for data and research resources would also help to reduce barriers to sharing by minimizing the time, effort, and funding needed by individual investigators to comply with requests for their unique resource. Indeed, sharing is an imperative, but it is also essential to find ways to protect both the original owner of the resource and those wishing to share in it.
Advanced e-Infrastructures for Civil Protection applications: the CYCLOPS Project
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Nativi, S.; Verlato, M.; Ayral, P. A.; Fiorucci, P.; Pina, A.; Oliveira, J.; Sorani, R.
2009-04-01
During the full cycle of emergency management, Civil Protection operative procedures involve many actors belonging to several institutions (civil protection agencies, public administrations, research centers, etc.) playing different roles (decision-makers, data and service providers, emergency squads, etc.). In this context the sharing of information is a vital requirement to make correct and effective decisions. Therefore a European-wide technological infrastructure providing distributed and coordinated access to different kinds of resources (data, information, services, expertise, etc.) could enhance existing Civil Protection applications and even enable new ones. Such a European Civil Protection e-Infrastructure should be designed taking into account the specific requirements of Civil Protection applications and the state of the art in the scientific and technological disciplines which could make emergency management more effective. In recent years, Grid technologies have reached a mature state, providing a platform for secure and coordinated resource sharing among participants gathered in so-called Virtual Organizations. Moreover, Earth and Space Science Informatics provides the conceptual tools for modeling the geospatial information shared in Civil Protection applications during its entire lifecycle. Therefore a European Civil Protection e-Infrastructure might be based on a Grid platform enhanced with Earth Science services. In the context of the 6th Framework Programme, the EU co-funded project CYCLOPS (CYber-infrastructure for CiviL protection Operative ProcedureS), which ended in December 2008, addressed the problem of defining the requirements and identifying the research strategies and innovation guidelines towards an advanced e-Infrastructure for Civil Protection. Starting from the requirement analysis, CYCLOPS proposed an architectural framework for a European Civil Protection e-Infrastructure. This architectural framework has been evaluated through the development of prototypes of two operative applications used by the Italian Civil Protection for Wild Fire Risk Assessment (RISICO) and by the French Civil Protection for Flash Flood Risk Management (SPC-GD). The results of these studies and proofs of concept have been used as the basis for the definition of research and innovation strategies aimed at the detailed design and implementation of the infrastructure. In particular, the main research themes and topics to be addressed have been identified and detailed. Finally, the obstacles to the innovation required for the adoption of this infrastructure and possible strategies to overcome them have been discussed.
Optimizing health information technology's role in enabling comparative effectiveness research.
Navathe, Amol S; Conway, Patrick H
2010-12-01
Health information technology (IT) is a key enabler of comparative effectiveness research (CER). Health IT standards for data sharing are essential to advancing the research data infrastructure, and health IT is critical to the next step of incorporating clinical data into data sources. Four key principles for advancement of CER are (1) utilization of data as a strategic asset, (2) leveraging public-private partnerships, (3) building robust, scalable technology platforms, and (4) coordination of activities across government agencies. To maximize the value of the resources, payers and providers must contribute data to initiatives, engage with government agencies on lessons learned, continue to develop new technologies that address key challenges, and utilize the data to improve patient outcomes and conduct research.
IAU Public Astronomical Organisations Network
NASA Astrophysics Data System (ADS)
Canas, Lina; Cheung, Sze Leung
2015-08-01
The Office for Astronomy Outreach has devoted substantial effort to creating and supporting a global network of public astronomical organisations around the world. The network is focused on bringing established and newly formed amateur astronomy organisations together, providing communication channels and platforms for disseminating news to the global community, and sharing best practices and resources among these associations around the world. Given the importance these organisations have for disseminating activities globally and for acting as key participants in the IAU's various campaigns, social media has played a key role in keeping this network engaged and connected. Here we discuss the implementation process of maintaining this extensive network, the gathering and processing of information, and the interactions between active local members at the national and international levels.
Open Marketplace for Simulation Software on the Basis of a Web Platform
NASA Astrophysics Data System (ADS)
Kryukov, A. P.; Demichev, A. P.
2016-02-01
The focus in the development of a new generation of middleware is shifting from global grid systems to building convenient and efficient web platforms for remote access to individual computing resources. A further line of their development, suggested in this work, is related not only to a quantitative increase in their number and to the expansion of the scientific, engineering, and manufacturing areas in which they are used, but also to improved technology for remote deployment of application software on the resources interacting with the web platforms. Currently, services for providers of application software in the context of science-oriented web platforms are not sufficiently developed. The new application-software marketplace web platforms proposed in this work should have all the features of existing web platforms for submitting jobs to remote resources, plus specific web services for market-based interaction between providers and consumers of application packages. The suggested approach will be validated on the example of simulation applications in the field of nonlinear optics.
Tilman, Andrew R; Levin, Simon; Watson, James R
2018-06-05
Harvesting behaviors of natural resource users, such as farmers, fishermen and aquaculturists, are shaped by season-to-season and day-to-day variability, or in other words risk. Here, we explore how risk-mitigation strategies can lead to sustainable use and improved management of common-pool natural resources. Over-exploitation of unmanaged natural resources, which lowers their long-term productivity, is a central challenge facing societies. While effective top-down management is a possible solution, it is not available if the resource is outside the jurisdictional bounds of any management entity, or if existing institutions cannot effectively impose sustainable-use rules. Under these conditions, alternative approaches to natural resource governance are required. Here, we study revenue-sharing clubs as a mechanism by which resource users can mitigate their income volatility and importantly, as a co-benefit, are also incentivized to reduce their effort, leading to reduced over-exploitation and improved resource governance. We use game theoretic analyses and agent-based modeling to determine the conditions in which revenue-sharing can be beneficial for resource management as well as resource users. We find that revenue-sharing agreements can emerge and lead to improvements in resource management when there is large variability in production/revenue and when this variability is uncorrelated across members of the revenue-sharing club. Further, we show that if members of the revenue-sharing collective can sell their product at a price premium, then the range of ecological and economic conditions under which revenue-sharing can be a tool for management greatly expands. These results have implications for the design of bottom-up management, where resource users themselves are incentivized to operate in ecologically sustainable and economically advantageous ways. Copyright © 2018 Elsevier Ltd. All rights reserved.
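The core mechanism discussed above, members of a club pooling a fraction of volatile harvest revenue, can be illustrated with a minimal agent-based sketch in Python. The number of members, revenue distribution, and pooled share below are illustrative assumptions, not the paper's calibration; the point is only that pooling uncorrelated shocks lowers each member's income volatility.

# Minimal sketch of revenue pooling: each season, members contribute a fixed
# share of revenue to a common pot that is split equally. Parameters are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_members=10, seasons=5000, share=0.5, mean_rev=100.0, sd_rev=40.0):
    """Return mean per-member income volatility without and with pooling."""
    revenue = rng.normal(mean_rev, sd_rev, size=(seasons, n_members))
    pooled = share * revenue.sum(axis=1, keepdims=True) / n_members
    income_shared = (1 - share) * revenue + pooled
    return revenue.std(axis=0).mean(), income_shared.std(axis=0).mean()

solo_sd, club_sd = simulate()
print(f"income volatility alone: {solo_sd:.1f}, in revenue-sharing club: {club_sd:.1f}")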
Gauchotte, Guillaume; Ameisen, David; Boutonnat, Jean; Battistella, Maxime; Copie, Christiane; Garcia, Stéphane; Rigau, Valérie; Galateau-Sallé, Françoise; Terris, Benoit; Vergier, Béatrice; Wendum, Dominique; Bertheau, Philippe
2013-06-01
Building online teaching materials is a highly time- and energy-consuming task for the teachers of a single university. With the help of the Collège des pathologistes, we initiated a French national university network for building shared online teaching pathology cases, tests and other pedagogic resources. Nineteen French universities are associated with this project, initially funded by UNF3S (http://www.unf3s.org/). A single national e-learning Moodle platform (http://virtual-slides.univ-paris7.fr/moodle/) contains texts, media and URLs pointing to decentralized virtual slides. The Moodle interface has been explained to the teachers since September 2011 using web-based conferences with screen sharing. The following contents have been created: 20 clinical cases, several tests with multiple-choice and short-answer questions, and gross examination videos. A survey of 16 teachers and students showed a 94% satisfaction rate, with most of the 16 participants being favorable to the development of e-learning in parallel with classroom courses. These tools will be further developed for the different study levels of pathology. In conclusion, these tools offer very interesting perspectives for pathology teaching. The organization of a national inter-university network is a useful way to create and share numerous good-quality pedagogic resources. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Era-Planet the European Network for Observing Our Changing Planet
NASA Astrophysics Data System (ADS)
Pirrone, N.; Cinnirella, S.; Nativi, S.; Sprovieri, F.; Hedgecock, I. M.
2016-06-01
In the last decade a significant number of projects and programmes in different domains of Earth Observation and environmental monitoring have generated a substantial amount of data and knowledge on different aspects related to environmental quality and sustainability. Big data generated by in-situ or satellite platforms are being collected and archived with a plethora of systems and instruments, making it difficult to share data and transfer knowledge to stakeholders and policy makers to support key economic and societal sectors. The overarching goal of ERA-PLANET is to strengthen the European Research Area in the domain of Earth Observation in coherence with the European participation in the Group on Earth Observations (GEO) and Copernicus. The expected impact is to strengthen European leadership within the forthcoming GEO 2015-2025 Work Plan. ERA-PLANET is designed to reinforce the interface with user communities, whose needs the Global Earth Observation System of Systems (GEOSS) intends to address. It will provide more accurate, comprehensive and authoritative information to policy and decision-makers in key societal benefit areas, such as Smart Cities and Resilient Societies; Resource Efficiency and Environmental Management; Global Changes and Environmental Treaties; Polar Areas and Natural Resources. ERA-PLANET will provide advanced decision-support tools and technologies aimed at better monitoring our global environment and sharing the information and knowledge available in the different domains of Earth Observation.
Seeland, Ute; Nauman, Ahmad T; Cornelis, Alissa; Ludwig, Sabine; Dunkel, Mathias; Kararigas, Georgios; Regitz-Zagrosek, Vera
2016-01-01
Sex and Gender Medicine is a novel discipline that provides equitable medical care for society and improves outcomes for both male and female patients. The integration of sex- and gender-specific knowledge into medical curricula is limited by the lack of adequate learning material, systematic teacher training and an innovative communication strategy. We aimed at initiating an e-learning and knowledge-sharing platform for Sex and Gender Medicine, the eGender platform (http://egender.charite.de), to ensure that future doctors and health professionals will have adequate knowledge and communication skills on sex and gender differences in order to make informed decisions for their patients. The web-based eGender knowledge-sharing platform was designed to support the blended learning pedagogical teaching concept and follows the didactic concept of constructivism. Learning materials developed by Sex and Gender Medicine experts from seven universities have been used as the basis for the new learning tools. The content of these tools is patient-centered and provides add-on information on gender-sensitive aspects of diseases. The structural part of eGender was designed and developed using the open source e-learning platform Moodle. The eGender platform comprises an English and a German version of e-learning modules: one focusing on basic knowledge and seven on specific medical disciplines. Each module consists of several courses corresponding to a disease or symptom complex. Self-organized learning has to be managed by using different learning tools, e.g., texts and audiovisual material, and tools for online communication and collaborative work. More than 90 users from Europe registered for the eGender Medicine learning modules. The most frequently accessed module was "Gender Medicine-Basics" and the users favored discussion forums. These e-learning modules fulfill the quality criteria for higher education and are used within the elective Master Module "Gender Medicine-Basics" implemented in the accredited Master of Public Health at Charité-Berlin. The eGender platform is a flexible and user-friendly electronic knowledge-sharing platform providing evidence-based high-quality learning material used by a growing number of registered users. The eGender Medicine learning modules could be key in the reform of medical curricula to integrate Sex and Gender Medicine into the education of health professionals.
A Public Platform for Geospatial Data Sharing for Disaster Risk Management
NASA Astrophysics Data System (ADS)
Balbo, S.; Boccardo, P.; Dalmasso, S.; Pasquali, P.
2013-01-01
Several studies have been conducted in Africa to assist local governments in addressing the risk situation related to natural hazards. Geospatial data containing information on vulnerability, impacts, climate change and disaster risk reduction is usually part of the output of such studies and is valuable to national and international organizations to reduce the risks and mitigate the impacts of disasters. Nevertheless, these data are not widely and efficiently distributed and often reside in remote storage solutions that are hard to reach. Spatial Data Infrastructures are technical solutions capable of solving this issue, by storing geospatial data and making them widely available through the internet. Among these solutions, GeoNode, an open source online platform for geospatial data sharing, has been developed in recent years. GeoNode is a platform for the management and publication of geospatial data. It brings together mature and stable open-source software projects under a consistent and easy-to-use interface allowing users, with little training, to quickly and easily share data and create interactive maps. GeoNode data management tools allow for integrated creation of data, metadata, and map visualizations. Each dataset in the system can be shared publicly or restricted to allow access to only specific users. Social features like user profiles and commenting and rating systems allow for the development of communities around each platform to facilitate the use, management, and quality control of the data the GeoNode instance contains (http://geonode.org/). This paper presents a case study scenario of setting up a Web platform based on GeoNode. It is a public platform called MASDAP, promoted by the Government of Malawi to support the development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as other disaster-related information. Moreover, this platform will help to ensure that the data created by a number of past or ongoing projects is maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform and other data from future disaster risk management projects will be added as well.
Huang, Mingbo; Hu, Ding; Yu, Donglan; Zheng, Zhensheng; Wang, Kuijian
2011-12-01
Enhanced extracorporeal counterpulsation (EECP) information consists of both text and hemodynamic waveform data. At present, EECP text information has been successfully managed through the Web browser, while the management and sharing of hemodynamic waveform data over the Internet has not yet been solved. In order to manage EECP information completely, and based on an in-depth analysis of the EECP hemodynamic waveform file in Digital Imaging and Communications in Medicine (DICOM) format and its disadvantages for Internet sharing, we proposed the Extensible Markup Language (XML), currently a popular Internet data exchange standard, as the storage specification for sharing EECP waveform data. We then designed a web-based sharing system for EECP hemodynamic waveform data on the ASP.NET 2.0 platform. We also describe the four main system function modules and their implementation methods: the DICOM-to-XML conversion module, the EECP waveform data management module, the EECP waveform retrieval and display module, and the security mechanism of the system.
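The conversion step described above, serializing waveform samples into XML that a web client can fetch and plot, can be sketched in a few lines of Python. The element and attribute names below are assumptions for illustration; they are not the schema used by the system in the abstract.

# Illustrative sketch of a waveform-to-XML conversion: samples become a simple
# XML document suitable for web sharing. Element names are assumptions.
import xml.etree.ElementTree as ET

def waveform_to_xml(patient_id, sampling_hz, samples):
    root = ET.Element("EECPWaveform", patientID=patient_id,
                      samplingRate=str(sampling_hz), unit="mmHg")
    data = ET.SubElement(root, "Samples", count=str(len(samples)))
    data.text = " ".join(f"{v:.2f}" for v in samples)
    return ET.tostring(root, encoding="unicode", xml_declaration=True)

if __name__ == "__main__":
    print(waveform_to_xml("P0001", 250, [88.2, 91.5, 95.0, 93.4]))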
Jenkins, Chris; Pierson, Lyndon G.
2016-10-25
Techniques and mechanisms to selectively provide resource access to a functional domain of a platform. In an embodiment, the platform includes both a report domain to monitor the functional domain and a policy domain to identify, based on such monitoring, a transition of the functional domain from a first integrity level to a second integrity level. In response to a change in integrity level, the policy domain may configure an enforcement domain to enforce against the functional domain one or more resource accessibility rules corresponding to the second integrity level. In another embodiment, the policy domain automatically initiates operations in aid of transitioning the platform from the second integrity level to a higher integrity level.
Building biomedical web communities using a semantically aware content management system.
Das, Sudeshna; Girard, Lisa; Green, Tom; Weitzman, Louis; Lewis-Bowen, Alister; Clark, Tim
2009-03-01
Web-based biomedical communities are becoming an increasingly popular vehicle for sharing information amongst researchers and are fast gaining an online presence. However, information organization and exchange in such communities is usually unstructured, rendering interoperability between communities difficult. Furthermore, specialized software to create such communities at low cost-targeted at the specific common information requirements of biomedical researchers-has been largely lacking. At the same time, a growing number of biological knowledge bases and biomedical resources are being structured for the Semantic Web. Several groups are creating reference ontologies for the biomedical domain, actively publishing controlled vocabularies and making data available in Resource Description Framework (RDF) language. We have developed the Science Collaboration Framework (SCF) as a reusable platform for advanced structured online collaboration in biomedical research that leverages these ontologies and RDF resources. SCF supports structured 'Web 2.0' style community discourse amongst researchers, makes heterogeneous data resources available to the collaborating scientist, captures the semantics of the relationship among the resources and structures discourse around the resources. The first instance of the SCF framework is being used to create an open-access online community for stem cell research-StemBook (http://www.stembook.org). We believe that such a framework is required to achieve optimal productivity and leveraging of resources in interdisciplinary scientific research. We expect it to be particularly beneficial in highly interdisciplinary areas, such as neurodegenerative disease and neurorepair research, as well as having broad utility across the natural sciences.
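The abstract above describes publishing community resources and the discourse around them as structured, Semantic Web-ready data. The following Python sketch with rdflib shows that idea in miniature; the namespace and predicate names are placeholders, not SCF's actual ontologies or vocabularies.

# Sketch of describing a shared research resource and a discussion post about
# it as RDF triples. Namespace and predicates are illustrative placeholders.
from rdflib import Graph, Literal, Namespace, RDF, URIRef

EX = Namespace("http://example.org/community/")

g = Graph()
cell_line = URIRef(EX["resource/cell-line-42"])
post = URIRef(EX["discussion/post-17"])

g.add((cell_line, RDF.type, EX.ResearchResource))
g.add((cell_line, EX.label, Literal("hESC line 42")))
g.add((post, RDF.type, EX.DiscussionPost))
g.add((post, EX.discusses, cell_line))   # discourse structured around the resource
g.add((post, EX.author, Literal("J. Doe")))

print(g.serialize(format="turtle"))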
Riley, Jennifer; McGowan, Melissa; Rozmovits, Linda
2014-06-30
The emergency department (ED) is an environment fraught with increasing patient volumes, competing priorities, fluctuating information, and ad hoc interprofessional clinical teams. Limited time is available to reflect on and discuss clinical experiences, policies, or research with others on the involved team. Online resources, such as webcasts and blogs, offer an accessible platform for emergency shift workers to engage in interprofessional discussion and education. Our objective was to explore the current opportunities for shared learning and discussion and to discover the potential of online resources to foster and facilitate interprofessional education (IPE) within an academic tertiary emergency department community. A qualitative study using semistructured interviews was conducted to solicit participants' views of the current culture of IPE in the ED, the potential value of introducing new online resources and technology in support of IPE, and possible barriers to uptake. Participation was voluntary and participants provided verbal informed consent. Online resources discussed included webcasts, interactive discussion forums, websites, and dashboards with links to central repositories. Identified barriers to uptake of new online resources were an unwillingness to "work" off-shift, a dislike of static one-directional communication, concerns with confidentiality, and the suggestion that new resources would be used by only a select few. Owing to the sensitive dynamics of emergency medicine-and the preference among its professional staff to foster interprofessional discussion and education through personal engagement, in an unhurried, non-stressful environment-introducing and investing in online resources should be undertaken with caution.
A study on an information security system of a regional collaborative medical platform.
Zhao, Junping; Peng, Kun; Leng, Jinchang; Sun, Xiaowei; Zhang, Zhenjiang; Xue, Wanguo; Ren, Lianzhong
2010-01-01
The objective of this study was to share the experience of building an information security system for a regional collaborative medical platform (RCMP) and discuss the lessons learned from practical projects. Safety measures are analyzed from the perspective of system engineering. We present the essential requirements, critical architectures, and policies for system security of regional collaborative medical platforms.
Krause, Denise D
2015-01-01
There are a variety of challenges to developing strategies to improve access to health care, but access to data is critical for effective evidence-based decision-making. Many agencies and organizations throughout Mississippi have been collecting quality health data for many years. However, those data have historically resided in data silos and have not been readily shared. A strategy was developed to build and coordinate infrastructure, capacity, tools, and resources to facilitate health workforce and population health planning throughout the state. Realizing data as the foundation upon which to build, the primary objective was to develop the capacity to collect, store, maintain, visualize, and analyze data from a variety of disparate sources -- with the ultimate goal of improving access to health care. Specific aims were to: 1) build a centralized data repository and scalable informatics platform, 2) develop a data management solution for this platform and then, 3) derive value from this platform by facilitating data visualization and analysis. A managed data lake was designed and constructed for health data from disparate sources throughout the state of Mississippi. A data management application was developed to log and track all data sources, maps and geographies, and data marts. With this informatics platform as a foundation, a variety of tools are used to visualize and analyze data. To illustrate, a web mapping application was developed to examine the health workforce geographically and attractive data visualizations and dynamic dashboards were created to facilitate health planning and research. Samples of data visualizations that aim to inform health planners and policymakers are presented. Many agencies and organizations throughout the state benefit from this platform. The overarching goal is that by providing timely, reliable information to stakeholders, Mississippians in general will experience improved access to quality care.
An open source web interface for linking models to infrastructure system databases
NASA Astrophysics Data System (ADS)
Knox, S.; Mohamed, K.; Harou, J. J.; Rheinheimer, D. E.; Medellin-Azuara, J.; Meier, P.; Tilmant, A.; Rosenberg, D. E.
2016-12-01
Models of networked engineered resource systems such as water or energy systems are often built collaboratively with developers from different domains working at different locations. These models can be linked to large scale real world databases, and they are constantly being improved and extended. As the development and application of these models becomes more sophisticated, and the computing power required for simulations and/or optimisations increases, so has the need for online services and tools which enable the efficient development and deployment of these models. Hydra Platform is an open source, web-based data management system, which allows modellers of network-based models to remotely store network topology and associated data in a generalised manner, allowing it to serve multiple disciplines. Hydra Platform uses a web API using JSON to allow external programs (referred to as `Apps') to interact with its stored networks and perform actions such as importing data, running models, or exporting the networks to different formats. Hydra Platform supports multiple users accessing the same network and has a suite of functions for managing users and data. We present ongoing development in Hydra Platform, the Hydra Web User Interface, through which users can collaboratively manage network data and models in a web browser. The web interface allows multiple users to graphically access, edit and share their networks, run apps and view results. Through apps, which are located on the server, the web interface can give users access to external data sources and models without the need to install or configure any software. This also ensures model results can be reproduced by removing platform or version dependence. Managing data and deploying models via the web interface provides a way for multiple modellers to collaboratively manage data, deploy and monitor model runs and analyse results.
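As a hedged sketch of how an external "App" might talk to a JSON web API of the kind described above, the following Python snippet posts a small network topology to a server with the requests library. The base URL, endpoint paths, field names, and authentication flow are hypothetical placeholders invented for illustration; they are not Hydra Platform's documented API.

# Hypothetical client sketch: log in, then store a small network (nodes and
# links) via JSON over HTTP. Endpoints and payload fields are placeholders.
import requests

BASE = "https://hydra.example.org/api"   # placeholder server

session = requests.Session()
# Hypothetical login call returning a session token.
token = session.post(f"{BASE}/login",
                     json={"username": "modeller", "password": "secret"}).json()["token"]
session.headers["Authorization"] = f"Bearer {token}"

# Hypothetical call to store a network topology shared by several modellers.
network = {
    "name": "demo-water-network",
    "nodes": [{"name": "reservoir"}, {"name": "city"}],
    "links": [{"name": "main-canal", "from": "reservoir", "to": "city"}],
}
resp = session.post(f"{BASE}/networks", json=network)
print(resp.status_code, resp.json().get("id"))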
MetaNET--a web-accessible interactive platform for biological metabolic network analysis.
Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael
2014-01-01
Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base to implement models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open source, user-friendly, platform-independent and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET is a web-accessible platform that incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include (i) optimization of an objective function for the wild-type strain and gene/catalyst/reaction knock-out/knock-down analysis using flux balance analysis, (ii) flux variability analysis, (iii) chemical species participation, (iv) identification of cycles and extreme paths, and (v) choke-point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow system and the Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net, provides a user-friendly rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, as well as to run different tools simultaneously using pre-defined and user-created custom workflows.
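For readers unfamiliar with the analyses listed above, the following Python sketch shows flux balance analysis, a single-gene knock-out, and flux variability analysis using the cobrapy library. This is a parallel illustration only: MetaNET itself is built on Galaxy and the Systems Biology Research Tool, and the model file and gene id below are placeholders.

# Illustrative FBA / knock-out / FVA sketch with cobrapy (not MetaNET's stack).
import cobra
from cobra.flux_analysis import flux_variability_analysis

model = cobra.io.read_sbml_model("e_coli_core.xml")   # placeholder SBML model file

# (i) Optimize the objective function for the wild-type strain.
wild_type = model.optimize()
print("wild-type objective:", wild_type.objective_value)

# Gene knock-out analysis: remove one gene and re-optimize.
with model:                       # the context manager reverts the knock-out afterwards
    model.genes.get_by_id("b1779").knock_out()        # placeholder gene id
    print("knock-out objective:", model.optimize().objective_value)

# (ii) Flux variability analysis at 90% of the optimal objective.
fva = flux_variability_analysis(model, fraction_of_optimum=0.9)
print(fva.head())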
A Web Tool for Research in Nonlinear Optics
NASA Astrophysics Data System (ADS)
Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.
2016-02-01
This paper presents a project to develop a web platform called WebNLO for computer modeling of nonlinear optics phenomena. We discuss a general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (SaaS approach). Users can interact with the platform through a web browser or a command line interface. Such a resource has no analogue in the field of nonlinear optics and is being created for the first time; it will allow researchers to access high-performance computing resources and will significantly reduce the cost of the research and development process.
Economic models for management of resources in peer-to-peer and grid computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David
2001-07-01
The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of each of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In real-world markets, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. These include commodity markets, posted prices, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
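To give a feel for deadline- and budget-constrained scheduling in the spirit of the Nimrod/G strategies mentioned above, the following Python sketch greedily buys the cheapest resources that can still finish a batch of jobs before the deadline. The resource names, prices, rates, and the greedy policy itself are illustrative assumptions, not the published algorithm.

# Simplified deadline/budget scheduling sketch: cheapest resources first,
# adding faster (pricier) ones only as needed. All numbers are made up.
def schedule(jobs, resources, deadline_h, budget):
    """jobs: number of equal tasks; resources: (name, jobs_per_hour, cost_per_job)."""
    plan, remaining, spent = [], jobs, 0.0
    for name, rate, price in sorted(resources, key=lambda r: r[2]):  # cost optimization
        can_finish = int(rate * deadline_h)      # capacity of this resource by the deadline
        take = min(remaining, can_finish)
        if take > 0 and spent + take * price <= budget:
            plan.append((name, take))
            remaining -= take
            spent += take * price
        if remaining == 0:
            break
    return plan, remaining, spent

resources = [("commodity-cluster", 40, 0.02), ("posted-price-grid", 80, 0.05),
             ("auction-supercomputer", 300, 0.12)]
print(schedule(jobs=500, resources=resources, deadline_h=4, budget=40.0))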
Anderson, Beth M.; Stevens, Michael C.; Glahn, David C.; Assaf, Michal; Pearlson, Godfrey D.
2013-01-01
We present a modular, high performance, open-source database system that incorporates popular neuroimaging database features with novel peer-to-peer sharing, and a simple installation. An increasing number of imaging centers have created a massive amount of neuroimaging data since fMRI became popular more than 20 years ago, with much of that data unshared. The Neuroinformatics Database (NiDB) provides a stable platform to store and manipulate neuroimaging data and addresses several of the impediments to data sharing presented by the INCF Task Force on Neuroimaging Datasharing, including 1) motivation to share data, 2) technical issues, and 3) standards development. NiDB solves these problems by 1) minimizing PHI use, providing a cost effective simple locally stored platform, 2) storing and associating all data (including genome) with a subject and creating a peer-to-peer sharing model, and 3) defining a sample, normalized definition of a data storage structure that is used in NiDB. NiDB not only simplifies the local storage and analysis of neuroimaging data, but also enables simple sharing of raw data and analysis methods, which may encourage further sharing. PMID:23912507
Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data
NASA Astrophysics Data System (ADS)
Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.
2016-12-01
Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
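The co-location idea described above, keying records from different sources by a shared space-time index so they can be joined where they live, can be shown with a toy Python sketch. The flat latitude/longitude/day binning below merely stands in for the hierarchical triangular mesh indexing used with SciDB, and the resolutions and sample records are arbitrary choices.

# Toy co-location sketch: two datasets are joined on a shared space-time bin key.
from collections import defaultdict

def bin_key(lat, lon, day, cell_deg=1.0):
    return (int(lat // cell_deg), int(lon // cell_deg), day)

def colocate(obs_a, obs_b):
    """Pair up records from two sources that fall in the same space-time bin."""
    index = defaultdict(list)
    for rec in obs_a:
        index[bin_key(rec["lat"], rec["lon"], rec["day"])].append(rec)
    return [(a, b) for b in obs_b
            for a in index.get(bin_key(b["lat"], b["lon"], b["day"]), [])]

satellite = [{"lat": 41.2, "lon": -105.6, "day": 120, "aot": 0.31}]
model_out = [{"lat": 41.7, "lon": -105.1, "day": 120, "pm25": 8.4}]
print(colocate(satellite, model_out))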
NASA Astrophysics Data System (ADS)
Albeke, S. E.; Perkins, D. G.; Ewers, S. L.; Ewers, B. E.; Holbrook, W. S.; Miller, S. N.
2015-12-01
The sharing of data and results is paramount for advancing scientific research. The Wyoming Center for Environmental Hydrology and Geophysics (WyCEHG) is a multidisciplinary group that is driving scientific breakthroughs to help manage water resources in the Western United States. WyCEHG is mandated by the National Science Foundation (NSF) to share its data. However, the infrastructure from which to share such diverse, complex and massive amounts of data did not exist within the University of Wyoming. We developed an innovative framework to meet the data organization, sharing, and discovery requirements of WyCEHG by integrating both open and closed source software, embedded metadata tags, semantic web technologies, and a web-mapping application. The infrastructure uses a Relational Database Management System as the foundation, providing a versatile platform to store, organize, and query myriad datasets, taking advantage of both structured and unstructured formats. Detailed metadata are fundamental to the utility of datasets. We tag data with Uniform Resource Identifiers (URIs) to specify concepts with formal descriptions (i.e. semantic ontologies), thus allowing users to search metadata based on the intended context rather than conventional keyword searches. Additionally, WyCEHG data are geographically referenced. Using the ArcGIS API for JavaScript, we developed a web mapping application leveraging database-linked spatial data services, providing a means to visualize and spatially query available data in an intuitive map environment. Using server-side scripting (PHP), the mapping application, in conjunction with semantic search modules, dynamically communicates with the database and file system, providing access to available datasets. Our approach provides a flexible, comprehensive infrastructure from which to store and serve WyCEHG's highly diverse research-based data. This framework has not only allowed WyCEHG to meet its data stewardship requirements, but can provide a template for others to follow.
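The URI-tagging idea described above, attaching ontology concept identifiers to datasets in a relational store so that searches can match meaning rather than a literal keyword, can be sketched with a small relational example in Python. The table layout and the example concept URIs are illustrative assumptions, not WyCEHG's actual schema or ontologies.

# Sketch of tagging datasets with ontology concept URIs and querying by concept.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dataset (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE tag (dataset_id INTEGER, concept_uri TEXT);
""")
con.executemany("INSERT INTO dataset VALUES (?, ?)",
                [(1, "snowpack_survey_2014"), (2, "seismic_refraction_lines")])
con.executemany("INSERT INTO tag VALUES (?, ?)",
                [(1, "http://example.org/ontology/Snow"),          # placeholder URIs
                 (2, "http://example.org/ontology/SeismicWave")])

concept = "http://example.org/ontology/Snow"
rows = con.execute("""SELECT d.name FROM dataset d
                      JOIN tag t ON t.dataset_id = d.id
                      WHERE t.concept_uri = ?""", (concept,)).fetchall()
print(rows)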
Availability and Use of Shared Data From Cardiometabolic Clinical Trials.
Vaduganathan, Muthiah; Nagarur, Amulya; Qamar, Arman; Patel, Ravi B; Navar, Ann Marie; Peterson, Eric D; Bhatt, Deepak L; Fonarow, Gregg C; Yancy, Clyde W; Butler, Javed
2018-02-27
Sharing of patient-level clinical trial data has been widely endorsed. Little is known about how extensively these data have been used for cardiometabolic diseases. We sought to evaluate the availability and use of shared data from cardiometabolic clinical trials. We extracted data from ClinicalStudyDataRequest.com, a large, multisponsor data-sharing platform hosting individual patient-level data from completed studies sponsored by 13 pharmaceutical companies. From January 2013 to May 2017, the platform had data from 3374 clinical trials, of which 537 (16%) evaluated cardiometabolic therapeutics (phase 1, 36%; phase 2, 17%; phase 2/3, 1%; phase 3, 42%; phase 4, 4%). They covered 74 therapies and 398 925 patients. Diabetes mellitus (60%) and hypertension (15%) were the most common study topics. Median time from study completion to data availability was 79 months. As of May 2017, ClinicalStudyDataRequest.com had received 318 submitted proposals, of which 163 had signed data-sharing agreements. Thirty of these proposals were related to cardiometabolic therapies and requested data from 79 unique studies (15% of all trials, 29% of phase 3/4 trials). Most (96%) data requesters of cardiometabolic clinical trial data were from academic centers in North America and Western Europe, and half the proposals were unfunded. Most proposals were for secondary hypothesis-generating questions, with only 1 proposed reanalysis of the original study primary hypothesis. To date, 3 peer-reviewed articles have been published after a median of 19 months (9-32 months) from the data-sharing agreement. Despite availability of data from >500 cardiometabolic trials in a multisponsor data-sharing platform, only 15% of these trials and 29% of phase 3/4 trials have been accessed by investigators thus far, and a negligible minority of analyses have reached publication. © 2017 American Heart Association, Inc.
Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P
2017-01-01
Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.
Future Visions of the Brahmaputra - Establishing Hydrologic Baseline and Water Resources Context
NASA Astrophysics Data System (ADS)
Ray, P. A.; Yang, Y. E.; Wi, S.; Brown, C. M.
2013-12-01
The Brahmaputra River Basin (China-India-Bhutan-Bangladesh) is on the verge of a transition from a largely free flowing and highly variable river to a basin of rapid investment and infrastructure development. This work demonstrates a knowledge platform for the basin that compiles available data, and develops hydrologic and water resources system models of the basin. A Variable Infiltration Capacity (VIC) model of the Brahmaputra basin supplies hydrologic information of major tributaries to a water resources system model, which routes runoff generated via the VIC model through water infrastructure, and accounts for water withdrawals for agriculture, hydropower generation, municipal demand, return flows and others human activities. The system model also simulates agricultural production and the economic value of water in its various uses, including municipal, agricultural, and hydropower. Furthermore, the modeling framework incorporates plausible climate change scenarios based on the latest projections of changes to contributing glaciers (upstream), as well as changes to monsoon behavior (downstream). Water resources projects proposed in the Brahmaputra basin are evaluated based on their distribution of benefits and costs in the absence of well-defined water entitlements, and relative to a complex regional water-energy-food nexus. Results of this project will provide a basis for water sharing negotiation among the four countries and inform trans-national water-energy policy making.
The Federated Satellite Systems paradigm: Concept and business case evaluation
NASA Astrophysics Data System (ADS)
Golkar, Alessandro; Lluch i Cruz, Ignasi
2015-06-01
This paper defines the paradigm of Federated Satellite Systems (FSS) as a novel distributed space systems architecture. FSS are networks of spacecraft trading previously inefficiently allocated and unused resources such as downlink bandwidth, storage, processing power, and instrument time. FSS holds the promise to enhance cost-effectiveness, performance and reliability of existing and future space missions, by networking different missions and effectively creating a pool of resources to exchange between participants in the federation. This paper introduces and describes the FSS paradigm, and develops an approach integrating mission analysis and economic assessments to evaluate the feasibility of the business case of FSS. The approach is demonstrated on a case study on opportunities enabled by FSS to enhance space exploration programs, with particular reference to the International Space Station. The application of the proposed methodology shows that the FSS concept is potentially able to create large commercial markets of in-space resources, by providing the technical platform to offer the opportunity for spacecraft to share or make use of unused resources within their orbital neighborhood. It is shown how the concept is beneficial to satellite operators, space agencies, and other stakeholders of the space industry to more flexibly interoperate space systems as a portfolio of assets, allowing unprecedented collaboration among heterogeneous types of missions.
NASA Astrophysics Data System (ADS)
Lidya, L.
2017-03-01
National Health Insurance has been implemented since 1 January 2014. A number of new policies have been established, including a multilevel referral system. The multilevel referral system classifies health care centers into three levels and requires that patient treatment start at a first-level health care center. There are 144 diseases that must be treated at the first level, which is staffed mainly by general physicians. Unfortunately, the competence of first-level physicians may not yet meet the required standard. To improve physicians' knowledge, the government has organized many events to accelerate knowledge sharing; however, this still requires time and many resources to yield significant results. An expert system is software that provides consulting services to non-expert users within its area of expertise, and it can improve the effectiveness and efficiency of knowledge sharing and learning. This research developed a model of a TB diagnosis expert system that complies with the standard procedure and regulations for TB diagnosis. The proposed expert system has the following characteristics: facilities to manage multimedia clinical data, support for the complexity of TB diagnosis (combining rule-based and case-based reasoning), an interactive interface, good usability, multi-platform support, and evolutionary design.
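A minimal Python sketch of the combined rule-based/case-based reasoning idea is shown below; the rules, symptom names, and cases are hypothetical placeholders and do not reproduce the standard TB diagnosis procedure referenced in the abstract.

```python
# Minimal sketch of combining rule-based and case-based reasoning for triage support.
# Rules, symptom names, and cases are hypothetical; a real system must follow the
# national TB diagnosis procedure described in the paper.
RULES = [
    ({"cough_over_2_weeks", "night_sweats", "weight_loss"}, "suspect TB: order sputum smear"),
    ({"cough_over_2_weeks", "positive_smear"}, "TB confirmed by smear: start treatment per guideline"),
]
CASES = [
    ({"cough_over_2_weeks", "fever", "contact_with_tb_patient"}, "suspect TB: order chest X-ray"),
]

def rule_based(findings):
    # Fire every rule whose premise is fully contained in the findings.
    return [advice for premise, advice in RULES if premise <= findings]

def case_based(findings):
    # Fall back to the most similar past case (Jaccard similarity).
    sim = lambda case: len(case[0] & findings) / len(case[0] | findings)
    best = max(CASES, key=sim)
    return best[1] if sim(best) > 0.5 else None

findings = {"cough_over_2_weeks", "night_sweats", "weight_loss"}
print(rule_based(findings) or case_based(findings))
```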
Sharing Earth Observation Data for Health Management
NASA Astrophysics Data System (ADS)
Cox, E. L., Jr.
2015-12-01
While the global community is struck by pandemics and epidemics from time to time, the ability to fully utilize earth observations and integrate environmental information has been limited - until recently. Maturing scientific understanding is making new levels of situational awareness possible when, and if, the relevant data are available and shared in a timely and usable manner. Satellite and other remote sensing tools have been used to observe, monitor, assess and predict weather and water impacts for decades. In the last few years much of this has included a focus on the ability to monitor changes on climate scales that suggest changes in the quantity and quality of ecosystem resources, or the "one-health" approach, where trans-disciplinary links between environmental, animal and vegetative health may provide indications of the best ways to manage susceptibility to infectious disease or outbreaks. But the scale of impacts and the availability of information from earth-observing satellites, airborne platforms, health tracking systems and surveillance networks offer new integrated tools. This presentation will describe several recent events, such as Superstorm Sandy in the United States and the Ebola outbreak in Africa, where public health and health infrastructure have been exposed to environmental hazards and where lessons learned from disaster response about sharing data have been effective in risk reduction.
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
§ 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...
A Ubiquitous Sensor Network Platform for Integrating Smart Devices into the Semantic Sensor Web
de Vera, David Díaz Pardo; Izquierdo, Álvaro Sigüenza; Vercher, Jesús Bernat; Gómez, Luis Alfonso Hernández
2014-01-01
Ongoing Sensor Web developments make a growing amount of heterogeneous sensor data available to smart devices. This is generating an increasing demand for homogeneous mechanisms to access, publish and share real-world information. This paper discusses, first, an architectural solution based on Next Generation Networks: a pilot Telco Ubiquitous Sensor Network (USN) Platform that embeds several OGC® Sensor Web services. This platform has already been deployed in large scale projects. Second, the USN-Platform is extended to explore a first approach to Semantic Sensor Web principles and technologies, so that smart devices can access Sensor Web data, allowing them also to share richer (semantically interpreted) information. An experimental scenario is presented: a smart car that consumes and produces real-world information which is integrated into the Semantic Sensor Web through a Telco USN-Platform. Performance tests revealed that observation publishing times with our experimental system were well within limits compatible with the adequate operation of smart safety assistance systems in vehicles. On the other hand, response times for complex queries on large repositories may be inappropriate for rapid reaction needs. PMID:24945678
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise a HPS platform. This research is driven by the issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism to handle faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction and (3) achieve fault-tolerant simulation.
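The resource-modelling and automatic-deployment step can be illustrated with a simple first-fit placement sketch in Python; the class names, hosts, and task sizes are assumptions for illustration, not the VSIM implementation.

```python
# Illustrative first-fit placement of simulation tasks onto virtual machines,
# in the spirit of the resource modelling/deployment step described above.
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    cpus: int
    mem_gb: int

@dataclass
class SimTask:
    name: str
    cpus: int
    mem_gb: int

def deploy(tasks, hosts):
    """Place each task on the first host with enough spare CPU and memory."""
    placement = {}
    for t in tasks:
        for h in hosts:
            if h.cpus >= t.cpus and h.mem_gb >= t.mem_gb:
                h.cpus -= t.cpus
                h.mem_gb -= t.mem_gb
                placement[t.name] = h.name
                break
        else:
            placement[t.name] = None   # no capacity: would trigger provisioning of a new VM
    return placement

hosts = [Host("vm-1", 8, 32), Host("vm-2", 4, 16)]
tasks = [SimTask("aero-cfd", 6, 24), SimTask("structures-fem", 4, 16)]
print(deploy(tasks, hosts))
```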
PC/AT-based architecture for shared telerobotic control
NASA Astrophysics Data System (ADS)
Schinstock, Dale E.; Faddis, Terry N.; Barr, Bill G.
1993-03-01
A telerobotic control system must include teleoperational, shared, and autonomous modes of control in order to provide a robot platform for incorporating the rapid advances that are occurring in telerobotics and associated technologies. These modes along with the ability to modify the control algorithms are especially beneficial for telerobotic control systems used for research purposes. The paper describes an application of the PC/AT platform to the control system of a telerobotic test cell. The paper provides a discussion of the suitability of the PC/AT as a platform for a telerobotic control system. The discussion is based on the many factors affecting the choice of a computer platform for a real time control system. The factors include I/O capabilities, simplicity, popularity, computational performance, and communication with external systems. The paper also includes a description of the actuation, measurement, and sensor hardware of both the master manipulator and the slave robot. It also includes a description of the PC-Bus interface cards. These cards were developed by the researchers in the KAT Laboratory, specifically for interfacing to the master manipulator and slave robot. Finally, a few different versions of the low level telerobotic control software are presented. This software incorporates shared control by supervisory systems and the human operator and traded control between supervisory systems and the human operator.
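One common way to realize shared and traded control is to blend operator and supervisory commands with a weighting factor; the short Python sketch below illustrates that idea with invented joint-velocity values, and is not the KAT Laboratory control software.

```python
# Sketch of shared control: blend operator and supervisory joint-velocity commands
# with a weight, and trade control by switching the weight to 0 or 1.
def blend_commands(operator_cmd, supervisor_cmd, alpha):
    """alpha = 1.0 -> pure teleoperation, 0.0 -> pure autonomy, in between -> shared."""
    return [alpha * o + (1.0 - alpha) * s for o, s in zip(operator_cmd, supervisor_cmd)]

operator_cmd   = [0.10, -0.05, 0.00]   # rad/s from the master manipulator (made-up values)
supervisor_cmd = [0.06,  0.00, 0.02]   # rad/s from the autonomous planner (made-up values)
print(blend_commands(operator_cmd, supervisor_cmd, alpha=0.7))
```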
Near real time water resources data for river basin management
NASA Technical Reports Server (NTRS)
Paulson, R. W. (Principal Investigator)
1973-01-01
The author has identified the following significant results. Twenty Data Collection Platforms (DCP) are being field installed on USGS water resources stations in the Delaware River Basin. DCP's have been successfully installed and are operating well on five stream gaging stations, three observation wells, and one water quality monitor in the basin. DCP's have been installed at nine additional water quality monitors, and work is progressing on interfacing the platforms to the monitors. ERTS-related water resources data from the platforms are being provided in near real time by the Goddard Space Flight Center to the Pennsylvania district, Water Resources Division, U.S. Geological Survey. On a daily basis, the data are computer processed by the Survey and provided to the Delaware River Basin Commission. Each daily summary contains data that were relayed during 4 or 5 of the 15 orbits made by ERTS-1 during the previous day. Water resources parameters relayed by the platforms include dissolved oxygen concentrations, temperature, pH, specific conductance, well level, and stream gage height, which is used to compute stream flow for the daily summary.
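Converting gage height to streamflow is typically done with a stage-discharge rating curve of the form Q = a(h - h0)^b; the Python sketch below shows the idea with hypothetical coefficients rather than calibrated USGS values.

```python
# A stage-discharge rating curve is the usual way gage height is converted to
# streamflow; the coefficients below are hypothetical, not USGS rating values.
def discharge_cfs(gage_height_ft, a=35.0, h0=1.2, b=2.1):
    """Q = a * (h - h0)**b, a common power-law rating form."""
    return a * max(gage_height_ft - h0, 0.0) ** b

print(discharge_cfs(4.7))   # estimated flow for a 4.7 ft stage
```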
SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services.
Gessler, Damian D G; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T
2009-09-23
SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at http://sswap.info (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at http://sswap.info/protocol.jsp, developer tools at http://sswap.info/developer.jsp, and a portal to third-party ontologies at http://sswapmeet.sswap.info (a "swap meet"). SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs.
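A minimal sketch of describing a provider's resource as an RDF/OWL graph, using the rdflib library, is shown below; the namespace, class names, and properties are illustrative stand-ins, not the canonical SSWAP vocabulary.

```python
# Minimal sketch of describing a web resource as an OWL/RDF graph with rdflib.
# The namespace and terms below are illustrative; the real SSWAP protocol defines
# its own canonical OWL DL vocabulary.
from rdflib import Graph, Namespace, URIRef, Literal, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/sswap-like#")
g = Graph()
service = URIRef("http://example.org/services/qtl-lookup")

g.add((service, RDF.type, OWL.Class))
g.add((service, RDFS.subClassOf, EX.Resource))
g.add((service, EX.providedBy, URIRef("http://example.org/providers/legume-db")))
g.add((service, EX.mapsInput, EX.MarkerName))
g.add((service, EX.mapsOutput, EX.QTLRegion))
g.add((service, RDFS.comment, Literal("Looks up QTL regions for a genetic marker.")))

print(g.serialize(format="turtle"))
```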
ERIC Educational Resources Information Center
Dormody, Thomas J.
1992-01-01
A survey of 372 secondary agriculture teachers received 274 responses showing a majority of agriculture and science departments share resources, although at low levels. Many more predicted future sharing. Equipment and supplies were most often shared, instructional services least often. (SK)
Representing Hydrologic Models as HydroShare Resources to Facilitate Model Sharing and Collaboration
NASA Astrophysics Data System (ADS)
Castronova, A. M.; Goodall, J. L.; Mbewe, P.
2013-12-01
The CUAHSI HydroShare project is a collaborative effort that aims to provide software for sharing data and models within the hydrologic science community. One of the early focuses of this work has been establishing metadata standards for describing models and model-related data as HydroShare resources. By leveraging this metadata definition, a prototype extension has been developed to create model resources that can be shared within the community using the HydroShare system. The extension uses a general model metadata definition to create resource objects, and was designed so that model-specific parsing routines can extract and populate metadata fields from model input and output files. The long-term goal is to establish a library of supported models where, for each model, the system has the ability to extract key metadata fields automatically, thereby establishing standardized model metadata that will serve as the foundation for model sharing and collaboration within HydroShare. The Soil and Water Assessment Tool (SWAT) is used to demonstrate this concept through a case study application.
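A model-specific parsing routine of the kind described above might look like the Python sketch below, which pulls simple "key: value" header fields out of a model input file; the file layout and field names are simplified assumptions, not the actual SWAT formats.

```python
# Sketch of a model-specific parsing routine that pulls metadata fields out of a
# model input file for a HydroShare-style model resource. The file layout and
# field names are hypothetical simplifications, not the actual SWAT file format.
import re

def extract_model_metadata(path):
    """Scan a text input file for 'key: value' header lines used as resource metadata."""
    wanted = {"model_name", "version", "simulation_start", "simulation_end"}
    metadata = {}
    with open(path) as f:
        for line in f:
            m = re.match(r"\s*(\w+)\s*:\s*(.+)", line)
            if m and m.group(1).lower() in wanted:
                metadata[m.group(1).lower()] = m.group(2).strip()
    return metadata

# Example (file name is hypothetical):
# metadata = extract_model_metadata("example_model_input.txt")
```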
NASA Astrophysics Data System (ADS)
Miller, M. K.; MacKenzie, S.
2011-12-01
Many aquariums, zoos, museums, and other informal science education (ISE) centers across the country want to connect their visitors with the important issue of climate change. Communicating climate change and the science it embodies is no easy task though, and ISE institutions are seeking creative and collaborative ways to best interpret the issue with their audiences. Some of these institutions, particularly aquariums and zoos, have live specimens on exhibit that stand to be severely impacted by climate change. Others see it as an educational and moral imperative to address such an important issue affecting the world today, especially one so close to the core mission of their institution. Regardless, informal science educators have noticed that the public is increasingly coming to them with questions related to climate change, and they want to be able to respond as effectively as they can. The Monterey Bay Aquarium is one partner in a coalition of aquariums, zoos, museums and informal science education institutions that are working together to present climate change to their visitors. These institutions hold enormous public trust as sources of sound scientific information. Whether it is through exhibitions like the Aquarium's Hot Pink Flamingos: Stories of Hope in a Changing Sea, interpretive and communication techniques to navigate challenging climate change discussions, or with sustainability planning and operational greening efforts, there is a concerted movement to improve the capacity of these institutions to respond to the issue. Ultimately, their goal is to inspire visitors in a way that positively impacts the country's discourse surrounding climate change, and helps steer our dialog toward a focus on solutions. In addition to the Hot Pink Flamingos exhibit, the Aquarium is also working with the coalition to build a website, www.climateinterpreter.org, that can serve as an online platform for sharing what different partners have learned at their respective locations, and as a clearinghouse for resources related to effectively communicating climate change. While the website was built for informal science educators, its content and information will be a valuable resource for everyone in the science and education community. There is a broad need for a better way to present climate change to a variety of audiences, whether it is the public, students, or just a colleague or peer.
Klett, Timothy R.; Pitman, Janet K.; Moore, Thomas E.; Gautier, D.L.
2017-11-15
The U.S. Geological Survey (USGS) recently assessed the potential for undiscovered oil and gas resources of the North Kara Basins and Platforms Province as part of its Circum-Arctic Resource Appraisal. This geologic province is north of western Siberia, Russian Federation, in the North Kara Sea between Novaya Zemlya to the west and Severnaya Zemlya to the east. One assessment unit (AU) was defined, the North Kara Basins and Platforms AU, which coincides with the geologic province. This AU was assessed for undiscovered, technically recoverable resources. The total estimated mean volumes of undiscovered petroleum resources in the province are ~1.8 billion barrels of crude oil, ~15.0 trillion cubic feet of natural gas, and ~0.4 billion barrels of natural-gas liquids, all north of the Arctic Circle.
Digital dissemination platform of transportation engineering education materials.
DOT National Transportation Integrated Search
2014-09-01
National agencies have called for more widespread adoption of best practices in engineering education. To facilitate this sharing of practices we will develop a web-based system that will be used by transportation engineering educators to share curri...
Cloud-Based Data Sharing Connects Emergency Managers
NASA Technical Reports Server (NTRS)
2014-01-01
Under an SBIR contract with Stennis Space Center, Baltimore-based StormCenter Communications Inc. developed an improved interoperable platform for sharing geospatial data over the Internet in real time-information that is critical for decision makers in emergency situations.
Sciarra, Adilia Maria Pires; Croti, Ulisses Alexandre; Batigalia, Fernando
2014-01-01
Congenital heart diseases are the world's most common major birth defect, affecting one in every 120 children. Ninety percent of these children are born in areas where appropriate medical care is inadequate or unavailable. The objective was to share knowledge and experience between an international center of excellence in pediatric cardiac surgery and a related program in Brazil. The strategy used by the program was based on long-term technological and educational support models used in that center, contributing to the creation and implementation of new programs. The Telemedicine platform was used for real-time monthly broadcasts of themes. Chat software was used for interaction between participating members and the group from the center of excellence. Professionals specialized in care provided to the mentioned population had the opportunity to share in the knowledge conveyed. It was possible to observe that the technological resources that implement the globalization of human knowledge were effective in the dissemination and improvement of the team regarding the care provided to children with congenital heart diseases.
MinT: Middleware for Cooperative Interaction of Things
Jeon, Soobin; Jung, Inbum
2017-01-01
This paper proposes an Internet of Things (IoT) middleware called Middleware for Cooperative Interaction of Things (MinT). MinT supports a fully distributed IoT environment in which IoT devices directly connect to peripheral devices, easily construct a local or global network, and share their data in an energy-efficient manner. MinT provides a sensor abstraction layer, a system layer and an interaction layer. These enable integrated sensing device operations, efficient resource management, and active interconnection between peripheral IoT devices. In addition, MinT provides a high-level API so that IoT device developers can develop IoT devices easily. We aim to enhance the energy efficiency and performance of IoT devices through the performance improvements offered by MinT resource management and request processing. The experimental results show that the average request rate increased by 25% compared to Californium (a high-performance middleware for efficient interaction in IoT environments), that the average response time decreased by 90% when resource management was used, and that power consumption decreased by up to 68%. Finally, the proposed platform can reduce the latency and power consumption of IoT devices. PMID:28632182
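The three-layer structure (sensor abstraction, resource management, interaction between peers) can be illustrated with the toy Python sketch below; every class and method name here is invented for illustration and is not the MinT API.

```python
# Toy illustration of the three-layer idea (sensor abstraction, system/resource
# management, interaction between peer devices). This is NOT the MinT API; all
# class and method names here are invented.
class SensorAbstraction:
    def read(self, sensor_id):
        return {"sensor": sensor_id, "value": 23.5}      # stand-in for a driver call

class ResourceManager:
    def __init__(self, max_queue=8):
        self.queue, self.max_queue = [], max_queue
    def submit(self, request):
        if len(self.queue) < self.max_queue:              # drop work when saturated to save energy
            self.queue.append(request)
            return True
        return False

class InteractionLayer:
    def __init__(self, peers):
        self.peers = peers
    def share(self, reading):
        return [(peer, reading) for peer in self.peers]   # stand-in for a network send

sensors, manager, net = SensorAbstraction(), ResourceManager(), InteractionLayer(["node-2", "node-3"])
if manager.submit("read-temp"):
    print(net.share(sensors.read("temp-0")))
```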
Evaluating interactive computer-based scenarios designed for learning medical technology.
Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Wallergård, Mattias; Johansson, Gerd
2014-11-01
The use of medical equipment is growing in healthcare, resulting in an increased need for resources to educate users in how to manage the various devices. Learning the practical operation of a device is one thing, but learning how to work with the device in the actual clinical context is more challenging. This paper presents a computer-based simulation prototype for learning medical technology in the context of critical care. Properties from simulation and computer games have been adopted to create a visualization-based, interactive and contextually bound tool for learning. A participatory design process, including three researchers and three practitioners from a clinic for infectious diseases, was adopted to adjust the form and content of the prototype to the needs of clinical practice and to create a situated learning experience. An evaluation with 18 practitioners showed that practitioners were positive toward this type of tool for learning and that it served as a good platform for eliciting and sharing knowledge. Our conclusion is that this type of tool can complement traditional learning resources by situating the learning in a context, without requiring advanced technology or being resource-demanding. Copyright © 2014 Elsevier Ltd. All rights reserved.
The Development and Evaluation of a Network for Producing and Sharing Video Presentations
ERIC Educational Resources Information Center
Sadik, Alaa
2014-01-01
This paper describes the technology and methodology used in the development and evaluation of an online network to help the instructors to produce and share video presentations in a new and innovative way. The network offers an application and platform for recording and sharing video presentations. The application allows instructors to narrate and…
NASA Astrophysics Data System (ADS)
Cardille, J. A.; Gonzales, R.; Parrott, L.; Bai, J.
2009-12-01
How should researchers store and share data? For most of history, scientists with results and data to share have been mostly limited to books and journal articles. In recent decades, the advent of personal computers and shared data formats has made it feasible, though often cumbersome, to transfer data between individuals or among small groups. Meanwhile, the use of automatic samplers, simulation models, and other data-production techniques has increased greatly. The result is that there is more and more data to store, and a greater expectation that they will be available at the click of a button. In 10 or 20 years, will we still send emails to each other to learn about what data exist? The development of, and widespread familiarity with, virtual globes like Google Earth and NASA WorldWind has created the potential, in just the last few years, to revolutionize the way we share data, search for and search through data, and understand the relationship between individual projects in research networks, where sharing and dissemination of knowledge is encouraged. For the last two years, we have been building the GeoSearch application, a cutting-edge online resource for the storage, sharing, search, and retrieval of data produced by research networks. Linking NASA's WorldWind globe platform, the data browsing toolkit prefuse, and SQL databases, GeoSearch's version 1.0 enables flexible searches and novel geovisualizations of large amounts of related scientific data. These data may be submitted to the database by individual researchers and processed by GeoSearch's data parser. Ultimately, data from research groups gathered in a research network would be shared among users via the platform. Access is not limited to the scientists themselves; administrators can determine which data can be presented publicly and which require group membership. Under the auspices of Canada's Sustainable Forestry Management Network of Excellence, we have created a moderate-sized database of ecological measurements in forests; we expect to extend the approach to a Quebec lake research network encompassing decades of lake measurements. In this session, we will describe and present four related components of the new system: GeoSearch's globe-based searching and display of scientific data; prefuse-based visualization of social connections among members of a scientific research network; geolocation of research projects using Google Spreadsheets, KML, and Google Earth/Maps; and collaborative construction of a geolocated database of research articles. Each component is designed to have applications for scientists themselves as well as the general public. Although each implementation is in its infancy, we believe they could be useful to other research networks.
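Geolocating a research project for display in a virtual globe reduces to emitting a small KML document; the Python sketch below builds one Placemark with made-up project details.

```python
# Sketch of geolocating a research project as a KML Placemark (the kind of record
# that can be loaded into Google Earth/Maps or a WorldWind-based viewer).
# The project name and coordinates are made up for illustration.
from xml.sax.saxutils import escape

def project_placemark(name, lon, lat, description=""):
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{escape(name)}</name>
    <description>{escape(description)}</description>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
</kml>"""

print(project_placemark("Boreal plot A-17 soil survey", -74.2, 48.6, "Forest ecology measurements"))
```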
Attention and Visuospatial Working Memory Share the Same Processing Resources
Feng, Jing; Pratt, Jay; Spence, Ian
2012-01-01
Attention and visuospatial working memory (VWM) share very similar characteristics; both have the same upper bound of about four items in capacity and they recruit overlapping brain regions. We examined whether both attention and VWM share the same processing resources using a novel dual-task costs approach based on a load-varying dual-task technique. With sufficiently large loads on attention and VWM, considerable interference between the two processes was observed. A further load increase on either process produced reciprocal increases in interference on both processes, indicating that attention and VWM share common resources. More critically, comparison among four experiments on the reciprocal interference effects, as measured by the dual-task costs, demonstrates no significant contribution from additional processing other than the shared processes. These results support the notion that attention and VWM share the same processing resources. PMID:22529826
Learning by Doing: How to Develop a Cross-Platform Web App
ERIC Educational Resources Information Center
Huynh, Minh; Ghimire, Prashant
2015-01-01
As mobile devices become prevalent, there is always a need for apps. How hard is it to develop an app, especially a cross-platform app? The paper shares an experience in a project that involved the development of a student services web app that can be run on cross-platform mobile devices. The paper first describes the background of the project,…
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2011 CFR
2011-07-01
§ 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication...
A Bibliographic Bank for Resource Sharing in Library Systems: A Feasibility Study. Final Report.
ERIC Educational Resources Information Center
Schwartz, Eugene S.; Saxe, Henry I.
This study of resource sharing among public libraries was made possible by six library systems in northern Illinois. With the organization of the library systems and development of interlibrary loan services and other cooperative activities, the problem of extending resource sharing among member libraries and between library systems arose. Several…
David B. Butts
1987-01-01
Wildfires do not respect property boundaries. Whole geographic regions are typically impacted by major wildfire outbreaks. Various fire related resources can be shared to solve such crises; whether they are shared, and how they are shared depends to a great extent upon the rapport among the agencies involved. Major progress has been achieved over the past decade...
OpenQuake, a platform for collaborative seismic hazard and risk assessment
NASA Astrophysics Data System (ADS)
Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben
2013-04-01
Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide, from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical trainings were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue to happen over the coming years. Other tools being developed that are of direct interest to the hazard community are: • OpenQuake Modeller: fundamental instruments for the creation of seismogenic input models for seismic hazard assessment, a critical input to the OpenQuake Engine. OpenQuake Modeller will consist of a suite of tools (Hazard Modellers Toolkit) for characterizing the seismogenic sources of earthquakes and their models of earthquake recurrence. An earthquake catalogue homogenization tool, for integration, statistical comparison and user-defined harmonization of multiple earthquake catalogues, is also included in the OpenQuake modeling tools. • A data capture tool for active faults: a tool that allows geologists to draw (new) fault discoveries on a map in an intuitive GIS environment and add details on the fault through the tool. This data, once quality checked, can then be integrated with the global active faults database, which will increase in value with every new fault insertion. Building on many ongoing efforts and the knowledge of scientists worldwide, GEM will for the first time integrate state-of-the-art data, models, results and open-source tools into a single platform. The platform will continue to increase in value, in particular for use in local contexts, through contributions from and collaborations with scientists and organisations worldwide. This presentation will showcase the OpenQuake Platform, focusing on the IT solutions that have been adopted as well as the added value that the platform will bring to scientists worldwide.
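Two small building blocks behind recurrence modelling of the kind the Hazard Modellers Toolkit supports are a Gutenberg-Richter b-value estimate and a Poisson probability of exceedance; the Python sketch below shows both with an invented catalogue, and is not OpenQuake code.

```python
# Sketch of recurrence-related calculations: a maximum-likelihood Gutenberg-Richter
# b-value (Aki estimator) and the Poisson probability of exceedance over an exposure
# time. The catalogue magnitudes and the annual rate are made up for illustration.
import math

def aki_b_value(magnitudes, m_min):
    mags = [m for m in magnitudes if m >= m_min]
    return math.log10(math.e) / (sum(mags) / len(mags) - m_min)

def poisson_exceedance(annual_rate, years):
    """P(at least one event in 'years') for a Poisson process."""
    return 1.0 - math.exp(-annual_rate * years)

catalogue = [4.1, 4.3, 4.0, 5.2, 4.6, 4.9, 4.2, 5.8, 4.4]
print(f"b-value ~ {aki_b_value(catalogue, m_min=4.0):.2f}")
print(f"P(exceedance in 50 yr at rate 0.01/yr) = {poisson_exceedance(0.01, 50):.3f}")
```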
A Digital Repository and Execution Platform for Interactive Scholarly Publications in Neuroscience.
Hodge, Victoria; Jessop, Mark; Fletcher, Martyn; Weeks, Michael; Turner, Aaron; Jackson, Tom; Ingram, Colin; Smith, Leslie; Austin, Jim
2016-01-01
The CARMEN Virtual Laboratory (VL) is a cloud-based platform which allows neuroscientists to store, share, develop, execute, reproduce and publicise their work. This paper describes new functionality in the CARMEN VL: an interactive publications repository. This new facility allows users to link data and software to publications. This enables other users to examine data and software associated with the publication and execute the associated software within the VL using the same data as the authors used in the publication. The cloud-based architecture and SaaS (Software as a Service) framework allow vast data sets to be uploaded and analysed using software services. Thus, this new interactive publications facility allows others to build on research results through reuse. This aligns with recent developments by funding agencies, institutions, and publishers in the move toward open access research. Open access provides reproducibility and verification of research resources and results. Publications and their associated data and software will be assured of long-term preservation and curation in the repository. Further, analysing research data and the evaluations described in publications frequently requires a number of execution stages, many of which are iterative. The VL provides a scientific workflow environment to combine software services into a processing tree. These workflows can also be associated with publications and executed by users. The VL also provides a secure environment where users can decide the access rights for each resource to ensure copyright and privacy restrictions are met.
Cytobank: providing an analytics platform for community cytometry data analysis and collaboration.
Chen, Tiffany J; Kotecha, Nikesh
2014-01-01
Cytometry is used extensively in clinical and laboratory settings to diagnose and track cell subsets in blood and tissue. High-throughput, single-cell approaches leveraging cytometry are developed and applied in the computational and systems biology communities by researchers, who seek to improve the diagnosis of human diseases, map the structures of cell signaling networks, and identify new cell types. Data analysis and management present a bottleneck in the flow of knowledge from bench to clinic. Multi-parameter flow and mass cytometry enable identification of signaling profiles of patient cell samples. Currently, this process is manual, requiring hours of work to summarize multi-dimensional data and translate these data for input into other analysis programs. In addition, the increase in the number and size of collaborative cytometry studies as well as the computational complexity of analytical tools require the ability to assemble sufficient and appropriately configured computing capacity on demand. There is a critical need for platforms that can be used by both clinical and basic researchers who routinely rely on cytometry. Recent advances provide a unique opportunity to facilitate collaboration and analysis and management of cytometry data. Specifically, advances in cloud computing and virtualization are enabling efficient use of large computing resources for analysis and backup. An example is Cytobank, a platform that allows researchers to annotate, analyze, and share results along with the underlying single-cell data.
Song, Jinzhao; Pandian, Vikram; Mauk, Michael G; Bau, Haim H; Cherry, Sara; Tisi, Laurence C; Liu, Changchun
2018-04-03
Rapid and quantitative molecular diagnostics in the field, at home, and at remote clinics is essential for evidence-based disease management, control, and prevention. Conventional molecular diagnostics requires extensive sample preparation, relatively sophisticated instruments, and trained personnel, restricting its use to centralized laboratories. To overcome these limitations, we designed a simple, inexpensive, hand-held, smartphone-based mobile detection platform, dubbed "smart-connected cup" (SCC), for rapid, connected, and quantitative molecular diagnostics. Our platform combines bioluminescent assay in real-time and loop-mediated isothermal amplification (BART-LAMP) technology with smartphone-based detection, eliminating the need for an excitation source and optical filters that are essential in fluorescent-based detection. The incubation heating for the isothermal amplification is provided, electricity-free, with an exothermic chemical reaction, and incubation temperature is regulated with a phase change material. A custom Android App was developed for bioluminescent signal monitoring and analysis, target quantification, data sharing, and spatiotemporal mapping of disease. SCC's utility is demonstrated by quantitative detection of Zika virus (ZIKV) in urine and saliva and HIV in blood within 45 min. We demonstrate SCC's connectivity for disease spatiotemporal mapping with a custom-designed website. Such a smart- and connected-diagnostic system does not require any lab facilities and is suitable for use at home, in the field, in the clinic, and particularly in resource-limited settings in the context of Internet of Medical Things (IoMT).
DCO-VIVO: A Collaborative Data Platform for the Deep Carbon Science Communities
NASA Astrophysics Data System (ADS)
Wang, H.; Chen, Y.; West, P.; Erickson, J. S.; Ma, X.; Fox, P. A.
2014-12-01
Deep Carbon Observatory (DCO) is a decade-long scientific endeavor to understand carbon in the complex deep Earth system. Thousands of DCO scientists from institutions across the globe are organized into communities representing four domains of exploration: Extreme Physics and Chemistry, Reservoirs and Fluxes, Deep Energy, and Deep Life. Cross-community and cross-disciplinary collaboration is one of the most distinctive features in DCO's flexible research framework. VIVO is an open-source Semantic Web platform that facilitates cross-institutional researcher and research discovery. It includes a number of standard ontologies that interconnect people, organizations, publications, activities, locations, and other entities of research interest to enable browsing, searching, visualizing, and generating Linked Open (research) Data. The DCO-VIVO solution expedites research collaboration between DCO scientists and communities. Based on DCO's specific requirements, the DCO Data Science team developed a series of extensions to the VIVO platform, including an extended VIVO information model, extended query over the semantic information within VIVO, integration with other open-source collaborative environments and data management systems, single sign-on, assignment of unique Handles to DCO objects, and publication and dataset ingestion extensions using existing publication systems. We present here the iterative development of these capabilities, which are now in daily use by the DCO community of scientists for research reporting, information sharing, and resource discovery in support of research activities and program management.
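Because VIVO exposes its linked data through SPARQL, resource discovery can be scripted; the sketch below queries a hypothetical endpoint for faculty members using the SPARQLWrapper library. The endpoint URL is a placeholder, and a real DCO-VIVO deployment may require authentication and use different classes.

```python
# Sketch of querying a VIVO instance's SPARQL endpoint for people. The endpoint URL
# is a placeholder; an actual DCO-VIVO deployment would have its own address.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://vivo.example.org/api/sparqlQuery")  # placeholder URL
endpoint.setReturnFormat(JSON)
endpoint.setQuery("""
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX vivo: <http://vivoweb.org/ontology/core#>
SELECT ?person ?label WHERE {
  ?person a vivo:FacultyMember ;
          rdfs:label ?label .
} LIMIT 10
""")

for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["label"]["value"])
```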
NASA Astrophysics Data System (ADS)
Celicourt, P.; Sam, R.; Piasecki, M.
2016-12-01
Global phenomena such as climate change and large-scale environmental degradation require the collection of accurate environmental data at detailed spatial and temporal scales, from which knowledge and actionable insights can be derived using data science methods. Despite significant advances in sensor network technologies, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome and expensive task. These factors demonstrate why environmental data collection remains a challenge, especially in developing countries where technical infrastructure, expertise and pecuniary resources are scarce. They also demonstrate why dense and long-term environmental data collection has historically been quite difficult. Moreover, hydrometeorological data collection efforts usually overlook the (critically important) inclusion of a standards-based system for storing, managing, organizing, indexing, documenting and sharing sensor data. We are developing a cross-platform software framework using the Python programming language that will allow us to develop a low-cost end-to-end (from sensor to publication) system for hydrometeorological conditions monitoring. The software framework contains provisions for describing sensors, sensor platforms, calibration and network protocols, as well as for sensor programming, data storage, data publication and visualization and, importantly, data retrieval in a desired unit system. It is being tested on the Raspberry Pi microcomputer as the end node and a laptop PC as the base station in a wireless setting.
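A minimal sketch of the read-store-convert loop such a Python framework needs on a Raspberry Pi end node is shown below; the sensor driver stub, database layout, and unit conversion are illustrative assumptions, not the framework described here.

```python
# Minimal sketch of reading a sensor, storing the observation with its unit, and
# retrieving it in a desired unit. The driver stub and table layout are illustrative.
import sqlite3, time

def to_unit(value, src, dst):
    """Convert between a couple of example units; extend as needed."""
    if (src, dst) == ("degC", "degF"):
        return value * 9 / 5 + 32
    if (src, dst) == ("mm", "in"):
        return value / 25.4
    raise ValueError(f"no conversion {src} -> {dst}")

def read_rain_gauge():
    return 1.8  # mm since last poll; stand-in for a GPIO/serial driver call

conn = sqlite3.connect("observations.db")
conn.execute("CREATE TABLE IF NOT EXISTS obs (ts REAL, variable TEXT, value REAL, unit TEXT)")

value_mm = read_rain_gauge()
conn.execute("INSERT INTO obs VALUES (?, ?, ?, ?)", (time.time(), "precipitation", value_mm, "mm"))
conn.commit()

print("latest rainfall:", round(to_unit(value_mm, "mm", "in"), 3), "in")
```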
Shared communications. Volume I, a summary and literature review
DOT National Transportation Integrated Search
2004-09-01
This paper provides a review of examples from the literature of shared communication resources and of agencies and/or organizations that share communication resources. The primary emphasis is on rural, intelligent transportation system communications...
NASA Astrophysics Data System (ADS)
Zhang, L.; Zhang, W.; Zeng, S. J.; Na, W.; Yang, H.; Huang, J.; Tan, X. D.; Sun, Z. J.
2015-08-01
The Silk Road, a major traffic route across the Eurasian continent, has for more than two thousand years been a point of convergence for the exchange, communication and dissemination of cultures, peoples, materials, religions and arts. The cultural heritage along this long and complicated route has likewise attracted great interest. In recent years, the Silk Road routes network along the Chang'an-Tianshan corridor has been inscribed on the World Cultural Heritage list. The rare and rich cultural resources along the Silk Road, especially those within the territory of China, have attracted worldwide attention. This article describes the research ideas, methods, processes and results of the planning and design of an internet-based dissemination services platform for cultural heritage resources. First, the targets of the dissemination services and the research methods applied to the Silk Road heritage resources were defined, based on objective spatial measurement and research on history and geography, in order to elicit the values of the cultural resources for target users. Then, a comprehensive, multi-angle dissemination services platform for cultural resources was built, with an innovative IT-based front-end exhibition, time and space maps of cultural heritage resources, interactive graphics display, panoramic three-dimensional virtual tours, and Silk Road topics as the main features. The core of the platform is a demand-oriented system design that takes cultural resources and their characteristics as the foundation, the expression of their contemporary value as the basis, and cultural dissemination and service as the starting point. The platform achieves temporal context generalization, interest profile extension, online and offline adaptation, and other notable innovations. Taking the Silk Road as representative of routes-based heritage resources with complex relationships in time and space, the practice and research behind this platform provide an application reference and a theoretical basis for heritage resource protection and dissemination services in the internet context.
Mougin, Christian; Artige, Emmanuelle; Marchand, Frédéric; Mondy, Samuel; Ratié, Céline; Sellier, Nadine; Castagnone-Sereno, Philippe; D'Acier, Armelle Cœur; Esmenjaud, Daniel; Faivre-Primot, Céline; Granjon, Laurent; Hamelet, Valérie; Lange, Frederic; Pagès, Sylvie; Rimet, Frédéric; Ris, Nicolas; Sallé, Guillaume
2018-04-19
The Biological Resource Centre for the Environment BRC4Env is a network of Biological Resource Centres (BRCs) and collections whose leading objectives are to improve the visibility of genetic and biological resources maintained by its BRCs and collections and to facilitate their use by a large research community, from agricultural research to life sciences and environmental sciences. Its added value relies on sharing skills, harmonizing practices, triggering projects in comparative biology, and ultimately proposing a single-entry portal to facilitate access to documented samples, taking into account the partnership policies of research institutions as well as the legal framework, which varies with the biological nature of the resources. BRC4Env currently includes three BRCs: the Centre for Soil Genetic Resources of the platform GenoSol, in partnership with the European Conservatory of Soil Samples; the Egg Parasitoids Collection (EP-Coll); and the collection of ichthyological samples, Colisa. BRC4Env is also associated with several biological collections: microbial consortia (entomopathogenic bacteria, freshwater microalgae…), terrestrial arthropods, nematodes (plant parasitic, entomopathogenic, animal parasitic...), and small mammals. The BRCs and collections of BRC4Env are involved in partnerships with academic scientists, as well as private companies, in the fields of medicinal mining, biocontrol, sustainable agriculture, and additional sectors. Moreover, the staff of the BRCs is involved in many training courses, for students from the French licence degree to the Ph.D. level and for engineers, as well as in ongoing professional training.
All inequality is not equal: children correct inequalities using resource value.
Shaw, Alex; Olson, Kristina R
2013-01-01
Fairness concerns guide children's judgments about how to share resources with others. However, it is unclear from past research if children take extant inequalities or the value of resources involved in an inequality into account when sharing with others; these questions are the focus of the current studies. In all experiments, children saw an inequality between two recipients-one had two more resources than another. What varied between conditions was the value of the resources that the child could subsequently distribute. When the resources were equal in value to those involved in the original inequality, children corrected the previous inequality by giving two resources to the child with fewer resources (Experiment 1). However, as the value of the resources increased relative to those initially shared by the experimenter, children were more likely to distribute the two high value resources equally between the two recipients, presumably to minimize the overall inequality in value (Experiments 1 and 2). We found that children specifically use value, not just size, when trying to equalize outcomes (Experiment 3) and further found that children focus on the relative rather than absolute value of the resources they share-when the experimenter had unequally distributed the same high value resource that the child would later share, children corrected the previous inequality by giving two high value resources to the person who had received fewer high value resources. These results illustrate that children attempt to correct past inequalities and try to maintain equality not just in the count of resources but also by using the value of resources.
Ontology-Based Empirical Knowledge Verification for Professional Virtual Community
ERIC Educational Resources Information Center
Chen, Yuh-Jen
2011-01-01
A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…
Knowledge discovery through games and game theory
NASA Astrophysics Data System (ADS)
Smith, James F., III; Rhyne, Robert D.
2001-03-01
A fuzzy logic based expert system has been developed that automatically allocates electronic attack (EA) resources in real-time over many dissimilar platforms. The platforms can be very general, e.g., ships, planes, robots, land based facilities, etc. Potential foes the platforms deal with can also be general. The initial version of the algorithm was optimized using a genetic algorithm employing fitness functions constructed based on expertise. A new approach is being explored that involves embedding the resource manager in an electronic game environment. The game allows a human expert to play against the resource manager in a simulated battlespace, with each of the defending platforms being exclusively directed by the fuzzy resource manager and the attacking platforms being controlled by the human expert or operating autonomously under their own logic. This approach automates the data mining problem. The game automatically creates a database reflecting the domain expert's knowledge and calls a data mining function, a genetic algorithm, to mine the database as required. The game allows easy evaluation of the information mined in the second step. The measure of effectiveness (MOE) for re-optimization is discussed. The mined information is extremely valuable as shown through demanding scenarios.
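The two ingredients described above can be sketched as a fuzzy rule that scores how urgently an EA resource should be assigned, plus a fitness function over the rule's parameters that a genetic algorithm could maximize against logged game scenarios; all membership shapes, parameters, and scenario data below are invented.

```python
# Toy sketch of a fuzzy allocation rule and a GA-style fitness over its parameters.
# All shapes, thresholds, and scenario data are invented for illustration.
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def jam_priority(range_km, bearing_rate, params):
    close = tri(range_km, *params["close"])        # e.g. (0, 5, 25) km
    fast  = tri(bearing_rate, *params["fast"])     # e.g. (0.5, 2.0, 5.0) deg/s
    return min(close, fast)                        # fuzzy AND

def fitness(params, scenarios):
    """Fraction of logged engagements where the rule agrees with the expert's choice."""
    hits = sum((jam_priority(r, br, params) > 0.5) == expert for r, br, expert in scenarios)
    return hits / len(scenarios)

params = {"close": (0, 5, 25), "fast": (0.5, 2.0, 5.0)}
scenarios = [(3.0, 2.5, True), (40.0, 0.2, False), (10.0, 1.8, True)]
print(fitness(params, scenarios))   # a GA would search over 'params' to maximize this
```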
Learning about water resource sharing through game play
NASA Astrophysics Data System (ADS)
Ewen, Tracy; Seibert, Jan
2016-10-01
Games are an optimal way to teach about water resource sharing, as they allow real-world scenarios to be enacted. Both students and professionals learning about water resource management can benefit from playing games, through the process of understanding both the complexity of sharing of resources between different groups and decision outcomes. Here we address how games can be used to teach about water resource sharing, through both playing and developing water games. An evaluation of using the web-based game Irrigania in the classroom setting, supported by feedback from several educators who have used Irrigania to teach about the sustainable use of water resources, and decision making, at university and high school levels, finds Irrigania to be an effective and easy tool to incorporate into a curriculum. The development of two water games in a course for masters students in geography is also presented as a way to teach and communicate about water resource sharing. Through game development, students learned soft skills, including critical thinking, problem solving, team work, and time management, and overall the process was found to be an effective way to learn about water resource decision outcomes. This paper concludes with a discussion of learning outcomes from both playing and developing water games.
From Data to Knowledge: GEOSS experience and the GEOSS Knowledge Base contribution to the GCI
NASA Astrophysics Data System (ADS)
Santoro, M.; Nativi, S.; Mazzetti, P., Sr.; Plag, H. P.
2016-12-01
According to systems theory, data is raw: it simply exists and has no significance beyond its existence, while information is data that has been given meaning by way of relational connection. The appropriate collection of information, such that it contributes to understanding, is a process of knowledge creation. The Global Earth Observation System of Systems (GEOSS) developed by the Group on Earth Observations (GEO) is a set of coordinated, independent Earth observation, information and processing systems that interact and provide access to diverse information for a broad range of users in both public and private sectors. GEOSS links these systems to strengthen the monitoring of the state of the Earth. In the past ten years, the development of GEOSS has taught several lessons dealing with the need to move from (open) data to information and knowledge sharing. Advanced user-focused services require a move from a data-driven framework to a knowledge sharing platform. Such a platform needs to manage information and knowledge, in addition to the datasets linked to them. To this end, GEO has launched a specific task called "GEOSS Knowledge Base", which deals with resources like user requirements, Sustainable Development Goals (SDGs), observation and processing ontologies, publications, guidelines, best practices, business processes/algorithms, and definitions of advanced concepts like Essential Variables (EVs), indicators, strategic goals, etc. In turn, information and knowledge (e.g. guidelines, best practices, user requirements, business processes, algorithms, etc.) can be used to generate additional information and knowledge from shared datasets. To fully utilize and leverage the GEOSS Knowledge Base, the current GEOSS Common Infrastructure (GCI) model will be extended and advanced to consider important concepts and implementation artifacts, such as data processing services and environmental/economic models as well as EVs, Primary Indicators, and SDGs. The new GCI model will link these concepts to the present dataset, observation and sensor concepts, enabling a set of very important new capabilities to be offered to GEOSS users.
NASA Astrophysics Data System (ADS)
Holloway, T.; Hastings, M. G.; Barnes, R. T.; Fischer, E. V.; Wiedinmyer, C.; Rodriguez, C.; Adams, M. S.; Marin-Spiotta, E.
2014-12-01
The Earth Science Women's Network (ESWN) is an international peer-mentoring organization with over 2000 members, dedicated to career development and community for women across the geosciences. Since its formation in 2002, ESWN has supported the growth of a more diverse scientific community through a combination of online and in-person networking activities. Lessons learned related to online networking and community-building will be presented. ESWN serves upper-level undergraduates, graduate students, professionals in a range of environmental fields, scientists working in federal and state governments, post-doctoral researchers, and academic faculty and scientists. Membership includes women working in over 50 countries, although the majority of ESWN members work in the U.S. ESWN increases retention of women in the geosciences by enabling and supporting professional person-to-person connections. This approach has been shown to reduce feelings of isolation among our members and help build professional support systems critical to career success. In early 2013 ESWN transitioned online activities to an advanced social networking platform that supports discussion threads, group formation, and individual messaging. Prior to that, on-line activities operated through a traditional list-serve, hosted by the National Center for Atmospheric Research (NCAR). The new web center, http://eswnonline.org, serves as the primary forum for members to build connections, seek advice, and share resources. For example, members share job announcements, discuss issues of work-life balance, and organize events at professional conferences. ESWN provides a platform for problem-based mentoring, drawing from the wisdom of colleagues across a range of career stages.
Virtual Exploitation Environment Demonstration for Atmospheric Missions
NASA Astrophysics Data System (ADS)
Natali, Stefano; Mantovani, Simone; Hirtl, Marcus; Santillan, Daniel; Triebnig, Gerhard; Fehr, Thorsten; Lopes, Cristiano
2017-04-01
The scientific and industrial communities are being confronted with a strong increase in Earth Observation (EO) satellite missions and related data. This is in particular the case for the Atmospheric Sciences communities, with the upcoming Copernicus Sentinel-5 Precursor, Sentinel-4, -5 and -3, and ESA's Earth Explorer scientific satellites ADM-Aeolus and EarthCARE. The challenge is not only to manage the large volume of data generated by each mission / sensor, but to process and analyze the data streams. Creating synergies among the different datasets will be key to exploiting the full potential of the available information. As a preparation activity supporting scientific data exploitation for Earth Explorer and Sentinel atmospheric missions, ESA funded the "Technology and Atmospheric Mission Platform" (TAMP) [1] [2] project; a scientific and technological forum (STF) has been set up involving relevant European entities from different scientific and operational fields to define the platform's requirements. Data access, visualization, processing and download services have been developed to satisfy users' needs; use cases defined with the STF, such as the study of SO2 emissions from the Holuhraun eruption (2014) by means of two numerical models, two satellite platforms and ground measurements, global aerosol analyses from long time series of satellite data, and local aerosol analysis using satellite and LIDAR data, have been implemented to ensure acceptance of TAMP by the atmospheric sciences community. The platform pursues the "virtual workspace" concept: all resources (data, processing, visualization, collaboration tools) are provided as "remote services", accessible through a standard web browser, to avoid the download of big data volumes and to allow use of the provided infrastructure for computation, analysis and sharing of results. Data access and processing are achieved through standardized protocols (WCS, WPS). As an evolution toward a pre-operational environment, the "Virtual Exploitation Environment Demonstration for Atmospheric Missions" (VEEDAM) aims at maintaining, running and evolving the platform, demonstrating e.g. the possibility to perform massive processing over heterogeneous data sources. This work presents the VEEDAM concepts and provides pre-operational examples, emphasizing the interoperability achievable by exposing standardized data access and processing services (e.g. making data and processing resources accessible from different VREs). [1] TAMP platform landing page http://vtpip.zamg.ac.at/ [2] TAMP introductory video https://www.youtube.com/watch?v=xWiy8h1oXQY
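Because data access relies on standardized OGC protocols (WCS for coverages, WPS for processing), a client can interact with such a platform using plain key-value-pair HTTP requests. The sketch below is a minimal illustration with the requests library; the endpoint URL and the coverage identifier are hypothetical placeholders, and a real service would advertise its identifiers in a GetCapabilities response.

```python
# Minimal sketch of OGC-style key-value-pair requests for data access (WCS).
# The endpoint and coverage identifier are hypothetical; a real deployment
# advertises them in its GetCapabilities document.
import requests

ENDPOINT = "https://example.org/ows"  # assumed service endpoint

# Discover what the service offers
caps = requests.get(ENDPOINT, params={
    "service": "WCS", "version": "2.0.1", "request": "GetCapabilities"})
print(caps.status_code)

# Retrieve a coverage subset (e.g. an SO2 column product over Iceland)
coverage = requests.get(ENDPOINT, params={
    "service": "WCS", "version": "2.0.1", "request": "GetCoverage",
    "coverageId": "SO2_total_column",            # hypothetical identifier
    "subset": ["Lat(63,67)", "Long(-25,-13)"],
    "format": "application/netcdf"})
with open("so2_subset.nc", "wb") as f:
    f.write(coverage.content)
```

The same pattern applies to WPS Execute requests for server-side processing, which is what keeps large data volumes from having to be downloaded to the user's machine.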
30 CFR 56.11027 - Scaffolds and working platforms.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Scaffolds and working platforms. 56.11027 Section 56.11027 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Travelways...
NASA Astrophysics Data System (ADS)
Kibria, Mirza Golam; Villardi, Gabriel Porto; Ishizu, Kentaro; Kojima, Fumihide; Yano, Hiroyuki
2016-12-01
In this paper, we study inter-operator spectrum sharing and intra-operator resource allocation in shared spectrum access communication systems and propose efficient dynamic solutions to address both inter-operator and intra-operator resource allocation optimization problems. For inter-operator spectrum sharing, we present two effective approaches, namely subcarrier gain-based sharing and fragmentation-based sharing, which carry out fair and flexible allocation of the available shareable spectrum among the operators subject to certain well-defined sharing rules, traffic demands, and channel propagation characteristics. The subcarrier gain-based spectrum sharing scheme has been found to be more efficient in terms of achieved throughput, whereas the fragmentation-based sharing is more attractive in terms of computational complexity. For intra-operator resource allocation, we consider a resource allocation problem with users' dissimilar service requirements, where the operator simultaneously supports users with delay-constrained and non-delay-constrained service requirements. This optimization problem is a non-convex mixed-integer non-linear programming problem, which is computationally very expensive and whose complexity grows exponentially with the number of integer variables. We propose a less complex and efficient suboptimal solution based on exact linearization, linear approximation, and convexification techniques for the non-linear and/or non-convex objective functions and constraints. Extensive simulation performance analysis has been carried out that validates the efficiency of the proposed solution.
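To make the gain-based sharing idea concrete, the following toy sketch assigns each shareable subcarrier to the operator with the best channel gain among those that have not yet reached a simple per-operator cap. This is an illustrative heuristic under invented parameters, not the paper's exact allocation rule or optimization formulation.

```python
# Toy sketch of gain-based inter-operator subcarrier sharing: each shareable
# subcarrier goes to the operator with the largest channel gain among operators
# still below a per-operator cap. Illustrative only; not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers, n_operators = 64, 3
gains = rng.rayleigh(size=(n_operators, n_subcarriers))   # per-operator channel gains
demand = np.array([30, 20, 14])                           # subcarriers requested per operator
cap = demand.copy()                                        # sharing rule: no operator exceeds its demand

allocation = -np.ones(n_subcarriers, dtype=int)
assigned = np.zeros(n_operators, dtype=int)

# Allocate the strongest subcarriers first
for k in np.argsort(gains.max(axis=0))[::-1]:
    metric = gains[:, k] * (assigned < cap)               # zero out operators already at their cap
    op = int(np.argmax(metric))
    if metric[op] > 0:
        allocation[k] = op
        assigned[op] += 1

print("subcarriers per operator:", assigned)
```

A fragmentation-based scheme would instead hand each operator a contiguous block of spectrum, trading some throughput for a much cheaper allocation step.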
A High-Throughput Processor for Flight Control Research Using Small UAVs
NASA Technical Reports Server (NTRS)
Klenke, Robert H.; Sleeman, W. C., IV; Motter, Mark A.
2006-01-01
There are numerous autopilot systems that are commercially available for small (<100 lbs) UAVs. However, they all share several key disadvantages for conducting aerodynamic research, chief amongst which is the fact that most utilize older, slower, 8- or 16-bit microcontroller technologies. This paper describes the development and testing of a flight control system (FCS) for small UAVs based on a modern, high-throughput, embedded processor. In addition, this FCS platform contains user-configurable hardware resources in the form of a Field Programmable Gate Array (FPGA) that can be used to implement custom, application-specific hardware. This hardware can be used to off-load routine tasks, such as sensor data collection, from the FCS processor, thereby further increasing the computational throughput of the system.
The Scalable Checkpoint/Restart Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moody, A.
The Scalable Checkpoint/Restart (SCR) library provides an interface that codes may use to write out and read in application-level checkpoints in a scalable fashion. In the current implementation, checkpoint files are cached in local storage (hard disk or RAM disk) on the compute nodes. This technique provides scalable aggregate bandwidth and uses storage resources that are fully dedicated to the job. This approach addresses the two common drawbacks of checkpointing a large-scale application to a shared parallel file system, namely, limited bandwidth and file system contention. In fact, on current platforms, SCR scales linearly with the number of compute nodes. It has been benchmarked as high as 720 GB/s on 1094 nodes of Atlas, which is nearly two orders of magnitude faster than the parallel file system.
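The caching pattern SCR relies on can be illustrated with a short conceptual sketch: checkpoints are written to fast node-local storage every time, and only occasionally copied ("flushed") to the shared parallel file system. This is not the SCR C API itself; the paths, serialization, and flush policy below are invented for illustration.

```python
# Conceptual sketch of multi-level checkpointing as cached by SCR: write every
# checkpoint to fast node-local storage, flush only occasionally to the shared
# parallel file system. Paths and flush interval are illustrative assumptions;
# the real SCR library exposes a C API and handles redundancy and restart.
import os, pickle, shutil

LOCAL_DIR = "/tmp/ckpt"            # node-local storage (e.g. RAM disk), assumed path
SHARED_DIR = "/pfs/job123/ckpt"    # shared parallel file system, assumed path
FLUSH_EVERY = 10                   # flush every 10th checkpoint

def checkpoint(step, state):
    os.makedirs(LOCAL_DIR, exist_ok=True)
    local_file = os.path.join(LOCAL_DIR, f"ckpt_{step}.pkl")
    with open(local_file, "wb") as f:
        pickle.dump(state, f)                  # scalable: only local bandwidth is used
    if step % FLUSH_EVERY == 0:
        os.makedirs(SHARED_DIR, exist_ok=True)
        shutil.copy(local_file, SHARED_DIR)    # infrequent, contended write to the PFS

for step in range(1, 31):
    checkpoint(step, {"step": step, "field": [0.0] * 1000})
```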
Medical faculties educational network: multidimensional quality assessment.
Komenda, Martin; Schwarz, Daniel; Feberová, Jitka; Stípek, Stanislav; Mihál, Vladimír; Dušek, Ladislav
2012-12-01
Today, World Wide Web technology provides many opportunities for the dissemination of electronic learning and teaching content. The MEFANET project (MEdical FAculties NETwork) has initiated international, effective and open cooperation among all Czech and Slovak medical faculties in the field of medical education. This paper introduces the original MEFANET educational web portal platform. Its main aim is to present the unique collaborative environment, which combines the sharing of electronic educational resources with the use of tools for their quality evaluation. It is in fact a complex e-publishing system, which consists of ten standalone portal instances and one central gateway. The fundamental principles of the developed system and the technologies used are reported here, as well as the procedures of a new multidimensional quality assessment. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Design and implementation of a secure wireless mote-based medical sensor network.
Malasri, Kriangsiri; Wang, Lan
2009-01-01
A medical sensor network can wirelessly monitor vital signs of humans, making it useful for long-term health care without sacrificing patient comfort and mobility. For such a network to be viable, its design must protect data privacy and authenticity given that medical data are highly sensitive. We identify the unique security challenges of such a sensor network and propose a set of resource-efficient mechanisms to address these challenges. Our solution includes (1) a novel two-tier scheme for verifying the authenticity of patient data, (2) a secure key agreement protocol to set up shared keys between sensor nodes and base stations, and (3) symmetric encryption/decryption for protecting data confidentiality and integrity. We have implemented the proposed mechanisms on a wireless mote platform, and our results confirm their feasibility.
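The third mechanism, symmetric encryption/decryption between sensor node and base station, can be sketched with authenticated symmetric encryption once a shared key exists. The example below uses the third-party cryptography package's Fernet construction and invented message fields; key establishment (the paper's key agreement protocol) is assumed to have already happened and is not shown.

```python
# Sketch of protecting a vital-sign reading with authenticated symmetric encryption,
# assuming the sensor node and base station already share a key (e.g. produced by a
# key agreement protocol). Uses the third-party `cryptography` package; the message
# fields are illustrative, not the paper's actual packet format.
import json
from cryptography.fernet import Fernet, InvalidToken

shared_key = Fernet.generate_key()        # in practice, the output of key agreement
node, base_station = Fernet(shared_key), Fernet(shared_key)

reading = {"patient_id": "P-042", "heart_rate": 78, "spo2": 97, "t": 1700000000}
token = node.encrypt(json.dumps(reading).encode())   # confidentiality + integrity tag

try:
    received = json.loads(base_station.decrypt(token).decode())
    print("accepted:", received)
except InvalidToken:
    print("rejected: tampered or wrongly keyed message")
```

On a real mote, a lightweight block cipher would typically replace Fernet, but the encrypt-and-authenticate structure is the same.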
Spreading Ebola Panic: Newspaper and Social Media Coverage of the 2014 Ebola Health Crisis.
Kilgo, Danielle K; Yoo, Joseph; Johnson, Thomas J
2018-02-23
During times of hot crises, traditional news organizations have historically contributed to public fear and panic by emphasizing risks and uncertainties. The degree to which digital and social media platforms contribute to this panic is essential to consider in the new media landscape. This research examines news coverage of the 2014 Ebola crisis, exploring differences in presentation between newspaper coverage and news shared on the social news platform Reddit. Results suggest that news shared on Reddit amplified panic and uncertainty surrounding Ebola, while traditional newspaper coverage was significantly less likely to produce panic-inducing coverage.
The application of network teaching in applied optics teaching
NASA Astrophysics Data System (ADS)
Zhao, Huifu; Piao, Mingxu; Li, Lin; Liu, Dongmei
2017-08-01
Network technology has become a creative tool for changing human productivity, and its rapid development has brought profound changes to our learning, working and life. Network technology has many advantages, such as rich content, various forms, convenient retrieval, timely communication and efficient combination of resources. Network information resources have become new education resources, are applied more and more in education, and have now become teaching and learning tools. Network teaching enriches the teaching content and changes the teaching process from traditional knowledge explanation into a new teaching process built on situation creation, independence and cooperation on a network technology platform. The teacher's role has shifted from lecturing in the classroom to guiding students to learn better. The network environment only provides a good platform for teaching; a better teaching effect can only be obtained by constantly improving the teaching content. Changchun University of Science and Technology introduced a BB teaching platform, on which the classroom teaching of applied optics can be extended and improved. Teachers make assignments online and students learn independently offline or cooperatively in groups, which expands the time and space of teaching. Teachers use hypertext to present related applied optics knowledge, rich cases and learning resources, and set up a network interaction platform, a homework submission system, a message board, etc. The teaching platform stimulates students' interest in learning and strengthens interaction in teaching.
Development of a consent resource for genomic data sharing in the clinical setting.
Riggs, Erin Rooney; Azzariti, Danielle R; Niehaus, Annie; Goehringer, Scott R; Ramos, Erin M; Rodriguez, Laura Lyman; Knoppers, Bartha; Rehm, Heidi L; Martin, Christa Lese
2018-06-13
Data sharing between clinicians, laboratories, and patients is essential for improvements in genomic medicine, but obtaining consent for individual-level data sharing is often hindered by a lack of time and resources. To address this issue, the Clinical Genome Resource (ClinGen) developed tools to facilitate consent, including a one-page consent form and an online supplemental video with information on key topics, such as the risks and benefits of data sharing. To determine whether the consent form and video accurately conveyed key data sharing concepts, we surveyed 5,162 members of the general public. We measured comprehension at baseline, after reading the form, and after watching the video. Additionally, we assessed participants' attitudes toward genomic data sharing. Participants' performance on comprehension questions significantly improved over baseline after reading the form and continued to improve after watching the video. Results suggest reading the form alone provided participants with important knowledge regarding broad data sharing, and watching the video allowed for broader comprehension. These materials are now available at http://www.clinicalgenome.org/share . These resources will provide patients a straightforward way to share their genetic and health information, and improve the scientific community's access to data generated through routine healthcare.
Open-source mobile digital platform for clinical trial data collection in low-resource settings.
van Dam, Joris; Omondi Onyango, Kevin; Midamba, Brian; Groosman, Nele; Hooper, Norman; Spector, Jonathan; Pillai, Goonaseelan Colin; Ogutu, Bernhards
2017-02-01
Governments, universities and pan-African research networks are building durable infrastructure and capabilities for biomedical research in Africa. This offers the opportunity to adopt from the outset innovative approaches and technologies that would be challenging to retrofit into fully established research infrastructures such as those regularly found in high-income countries. In this context we piloted the use of a novel mobile digital health platform, designed specifically for low-resource environments, to support high-quality data collection in a clinical research study. Our primary aim was to assess the feasibility of using a mobile digital platform for clinical trial data collection in a low-resource setting. Secondarily, we sought to explore the potential benefits of such an approach. The investigative site was a research institute in Nairobi, Kenya. We integrated an open-source platform for mobile data collection commonly used in the developing world with an open-source, standard platform for electronic data capture in clinical trials. The integration was developed using common data standards (Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model), maximising the potential to extend the approach to other platforms. The system was deployed in a pharmacokinetic study involving healthy human volunteers. The electronic data collection platform successfully supported conduct of the study. Multidisciplinary users reported high levels of satisfaction with the mobile application and highlighted substantial advantages when compared with traditional paper record systems. The new system also demonstrated a potential for expediting data quality review. This pilot study demonstrated the feasibility of using a mobile digital platform for clinical research data collection in low-resource settings. Sustainable scientific capabilities and infrastructure are essential to attract and support clinical research studies. Since many research structures in Africa are being developed anew, stakeholders should consider implementing innovative technologies and approaches.
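Because the integration hinges on the CDISC Operational Data Model, it helps to see what a single collected data point looks like when expressed that way. The sketch below builds a schematic ODM-style ClinicalData fragment with the Python standard library; the element names follow the public ODM 1.3 layout, but the OIDs are invented and the fragment is simplified and not schema-validated.

```python
# Schematic sketch of expressing one mobile-collected data point as a CDISC
# ODM-style ClinicalData fragment, as it might be handed to an EDC system.
# OIDs and values are invented; the fragment is illustrative, not validated ODM.
import xml.etree.ElementTree as ET

odm = ET.Element("ODM", FileOID="F.0001", FileType="Transactional")
clinical = ET.SubElement(odm, "ClinicalData", StudyOID="PK-2016-01",
                         MetaDataVersionOID="v1")
subject = ET.SubElement(clinical, "SubjectData", SubjectKey="SUBJ-007")
event = ET.SubElement(subject, "StudyEventData", StudyEventOID="SE.VISIT1")
form = ET.SubElement(event, "FormData", FormOID="FORM.VITALS")
group = ET.SubElement(form, "ItemGroupData", ItemGroupOID="IG.VITALS")
ET.SubElement(group, "ItemData", ItemOID="IT.WEIGHT_KG", Value="71.4")

print(ET.tostring(odm, encoding="unicode"))
```

Standardizing on a neutral exchange format like this is what lets the mobile collection layer and the EDC backend be swapped independently.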
Low-cost bioanalysis on paper-based and its hybrid microfluidic platforms.
Dou, Maowei; Sanjay, Sharma Timilsina; Benhabib, Merwan; Xu, Feng; Li, XiuJun
2015-12-01
Low-cost assays have broad applications ranging from human health diagnostics and food safety inspection to environmental analysis. Hence, low-cost assays are especially attractive for rural areas and developing countries, where financial resources are limited. Recently, paper-based microfluidic devices have emerged as a low-cost platform that greatly accelerates point-of-care (POC) analysis in low-resource settings. This paper reviews recent advances of low-cost bioanalysis on paper-based microfluidic platforms, including fully paper-based and paper hybrid microfluidic platforms. In this review paper, we first summarized the fabrication techniques of fully paper-based microfluidic platforms, followed by their applications in human health diagnostics and food safety analysis. Then we highlighted paper hybrid microfluidic platforms and their applications, because hybrid platforms could draw benefits from multiple device substrates. Finally, we discussed the current limitations of and perspectives on paper-based microfluidic platforms for low-cost assays. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Lu, H.; Yi, D.
2010-12-01
Deep exploration is one of the important approaches to geoscience research. We started deep exploration in the 1980s and have since acquired a large amount of data. Researchers usually integrate data from both space exploration and deep exploration to study geological structures and represent the Earth's subsurface, and they analyze and interpret results on the basis of the integrated data. The different exploration approaches result in heterogeneous data, so data acquisition and integration have long been a confusing issue for researchers. The problem of data sharing and interoperation has to be solved during the development of the SinoProbe research project. Through a study of well-known domestic and international exploration projects and geoscience data platforms, this work explores a solution for data sharing and interoperation. Based on a service-oriented architecture (SOA), we present a deep exploration data sharing framework comprising three levels: the data level handles data storage and the integration of heterogeneous data; the middle level provides geophysics, geochemistry and other data services by means of Web services and supports application composition using GIS middleware and the Eclipse RCP; and the interaction level gives professional and non-professional users access to data of different accuracy. The framework adopts the GeoSciML data interchange approach. GeoSciML is a geoscience information markup language, developed as an application of the OpenGIS Consortium's (OGC) Geography Markup Language (GML). It maps heterogeneous data into a common Earth framework and enables interoperation. In this article we discuss how heterogeneous data are integrated and shared within the SinoProbe project.
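Since the middle level exposes data services as Web services and exchanges features encoded in GeoSciML (a GML application schema), a client interaction can be sketched as an ordinary WFS GetFeature request. The endpoint URL and feature type below are hypothetical placeholders, not SinoProbe's actual services.

```python
# Sketch of the middle-level data service idea: fetch geological features encoded
# in GeoSciML (a GML application schema) from a WFS endpoint. The endpoint and
# feature type name are placeholders, not SinoProbe's actual service addresses.
import requests
import xml.etree.ElementTree as ET

WFS_ENDPOINT = "https://example.org/sinoprobe/wfs"   # assumed service address

resp = requests.get(WFS_ENDPOINT, params={
    "service": "WFS", "version": "2.0.0", "request": "GetFeature",
    "typeNames": "gsml:MappedFeature",               # GeoSciML feature type
    "bbox": "30,100,35,110,urn:ogc:def:crs:EPSG::4326",
    "count": 10})

root = ET.fromstring(resp.content)
print("returned elements:", len(list(root)))
```

Because every provider returns features in the same GML-based schema, the interaction level can merge results from geophysics and geochemistry services without per-source conversion code.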
ERIC Educational Resources Information Center
Bevan, Paul; Tyler, Alyson
2009-01-01
Purpose: This paper aims to outline the developments and strategies employed to supply online library services in Wales through a national platform: library.wales.org These services include: the "Cat Cymru" cross-catalogue search, centrally procured subscription resources and local library microsites. Design/methodology/approach: The…
Creating an X Window Terminal-Based Information Technology Center.
ERIC Educational Resources Information Center
Klassen, Tim W.
1997-01-01
The creation of an information technology center at the University of Oregon Science Library is described. Goals included providing access to Internet-based resources and multimedia software, platforms for running science-oriented software, and resources so students can create multimedia materials. A mixed-lab platform was created with Unix-based…
DOT National Transportation Integrated Search
1997-06-06
Shared resource projects offer an opportunity for public transportation agencies to leverage property assets in exchange for support for transportation programs. Intelligent transportation systems (ITS) require wireline infrastructure in roadway ROW ...
BingEO: Enable Distributed Earth Observation Data for Environmental Research
NASA Astrophysics Data System (ADS)
Wu, H.; Yang, C.; Xu, Y.
2010-12-01
Our planet is facing great environmental challenges, including global climate change, environmental vulnerability, extreme poverty, and a shortage of clean, cheap energy. To address these problems, scientists are developing various models to analyze, forecast, and simulate geospatial phenomena to support critical decision making. These models not only challenge our computing technology, but also challenge us to meet their huge demand for Earth observation data. Through various policies and programs, open and free sharing of Earth observation data is advocated in Earth science. Currently, thousands of data sources are freely available online through open standards such as Web Map Service (WMS), Web Feature Service (WFS) and Web Coverage Service (WCS). Seamless sharing of and access to these resources call for a spatial cyberinfrastructure (CI) to enable the use of spatial data for the advancement of related applied sciences, including environmental research. Based on the Microsoft Bing search engine and Bing Maps, a seamlessly integrated and visual tool is under development to bridge the gap between researchers/educators and Earth observation data providers. With this tool, Earth science researchers/educators can easily and visually find the best data sets for their research and education. The tool includes a registry and its related supporting module on the server side and an integrated portal as its client. The proposed portal, Bing Earth Observation (BingEO), is based on Bing Search and Bing Maps to: 1) use Bing Search to discover Web Map Service (WMS) resources available over the internet; 2) develop and maintain a registry to manage all the available WMS resources and constantly monitor their service quality; 3) allow users to manually register data services; and 4) provide a Bing Maps-based Web application to visualize the data on a high-quality and easy-to-manipulate map platform and enable users to select the best data layers online. Given the amount of observation data already accumulated and still growing, BingEO will allow these resources to be utilized more widely, intensively, efficiently and economically in Earth science applications.
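The registry's service-quality monitoring can be illustrated with a small sketch: poll each registered WMS endpoint with a GetCapabilities request and record availability and latency. The endpoint URLs are placeholders, and a production registry would persist and aggregate these measurements rather than print them.

```python
# Sketch of the registry's service-monitoring idea: poll each registered WMS
# endpoint with a GetCapabilities request and record availability and latency.
# Endpoint URLs are placeholders; results would normally be stored, not printed.
import time
import requests

registered_wms = [
    "https://example.org/wms-a",
    "https://example.org/wms-b",
]

def check_wms(url, timeout=10):
    params = {"service": "WMS", "version": "1.3.0", "request": "GetCapabilities"}
    start = time.time()
    try:
        r = requests.get(url, params=params, timeout=timeout)
        ok = r.ok and b"WMS_Capabilities" in r.content
    except requests.RequestException:
        ok = False
    return {"url": url, "available": ok, "latency_s": round(time.time() - start, 2)}

for record in (check_wms(u) for u in registered_wms):
    print(record)
```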
Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?
Wahn, Basil; König, Peter
2017-01-01
Human information processing is limited by attentional resources. That is, via attentional mechanisms, humans select a limited amount of sensory input to process while other sensory input is neglected. In multisensory research, a matter of ongoing debate is whether there are distinct pools of attentional resources for each sensory modality or whether attentional resources are shared across sensory modalities. Recent studies have suggested that attentional resource allocation across sensory modalities is in part task-dependent. That is, the recruitment of attentional resources across the sensory modalities depends on whether processing involves object-based attention (e.g., the discrimination of stimulus attributes) or spatial attention (e.g., the localization of stimuli). In the present paper, we review findings in multisensory research related to this view. For the visual and auditory sensory modalities, findings suggest that distinct resources are recruited when humans perform object-based attention tasks, whereas for the visual and tactile sensory modalities, partially shared resources are recruited. If object-based attention tasks are time-critical, shared resources are recruited across the sensory modalities. When humans perform an object-based attention task in combination with a spatial attention task, partly shared resources are recruited across the sensory modalities as well. Conversely, for spatial attention tasks, attentional processing does consistently involve shared attentional resources for the sensory modalities. Generally, findings suggest that the attentional system flexibly allocates attentional resources depending on task demands. We propose that such flexibility reflects a large-scale optimization strategy that minimizes the brain's costly resource expenditures and simultaneously maximizes capability to process currently relevant information.
Genomic Data Commons | Office of Cancer Genomics
The NCI’s Center for Cancer Genomics launches the Genomic Data Commons (GDC), a unified data sharing platform for the cancer research community. The mission of the GDC is to enable data sharing across the entire cancer research community, to ultimately support precision medicine in oncology.
Contributions of international cooperation projects to the HIV/AIDS response in China.
Sun, Jiangping; Liu, Hui; Li, Hui; Wang, Liqiu; Guo, Haoyan; Shan, Duo; Bulterys, Marc; Korhonen, Christine; Hao, Yang; Ren, Minghui
2010-12-01
For 20 years, China has participated in 267 international cooperation projects against the HIV/AIDS epidemic and received ∼526 million USD from over 40 international organizations. These projects have played an important role by complementing national efforts in the fight against HIV/AIDS in China. The diverse characteristics of these projects followed three phases over 20 years. Initially, stand-alone projects provided technical support in surveillance, training or advocacy for public awareness. As the epidemic spread across China, projects became a part of the comprehensive and integrated national response. Currently, international best practices encourage the inclusion of civil society and non-governmental organizations in an expanded response to the epidemic. Funding from international projects has accounted for one-third of the resources provided for the HIV/AIDS response in China. Beyond this strong financial support, these programmes have introduced best practices, accelerated the introduction of AIDS policies, strengthened capacity, improved the development of grassroots social organizations and established a platform for communication and experience sharing with the international community. However, there are still challenges ahead, including integrating existing resources and exploring new programme models. The National Centre for AIDS/STD Control and Prevention (NCAIDS) in China is consolidating all international projects into national HIV prevention, treatment and care activities. International cooperation projects have been an invaluable component of China's response to HIV/AIDS, and China has now been able to take this information and share its experiences with other countries with the help of these same international programmes.
Improved Functionality and Curation Support in the ADS
NASA Astrophysics Data System (ADS)
Accomazzi, Alberto; Kurtz, Michael J.; Henneken, Edwin A.; Grant, Carolyn S.; Thompson, Donna; Chyla, Roman; Holachek, Alexandra; Sudilovsky, Vladimir; Murray, Stephen S.
2015-01-01
In this poster we describe the developments of the new ADS platform over the past year, focusing on the functionality which improves its discovery and curation capabilities. The ADS Application Programming Interface (API) is being updated to support authenticated access to the entire suite of ADS services, in addition to the search functionality itself. This allows programmatic access to resources which are specific to a user or class of users. A new interface, built directly on top of the API, now provides a more intuitive search experience and takes into account the best practices in web usability and responsive design. The interface now incorporates in-line views of graphics from the AAS Astroexplorer and the ADS All-Sky Survey image collections. The ADS Private Libraries, first introduced over 10 years ago, are now being enhanced to allow the bookmarking, tagging and annotation of records of interest. In addition, libraries can be shared with one or more ADS users, providing an easy way to collaborate in the curation of lists of papers. A library can also be explicitly made public and shared at large via the publishing of its URL. In collaboration with the AAS, the ADS plans to support the adoption of ORCID identifiers by implementing a plugin which will simplify the import of papers in ORCID via a query to the ADS API. Deeper integration between the two systems will depend on available resources and feedback from the community.
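Authenticated programmatic access of this kind boils down to a token-authenticated HTTP query. The sketch below follows the publicly documented ADS search API pattern, but the token is a placeholder and the query and fields are illustrative only.

```python
# Sketch of authenticated, programmatic access to an ADS-style search API.
# The endpoint follows the publicly documented ADS API pattern; the token is a
# placeholder and the query parameters are illustrative.
import requests

API_URL = "https://api.adsabs.harvard.edu/v1/search/query"
TOKEN = "YOUR_ADS_API_TOKEN"   # personal token issued to an authenticated user

resp = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"q": 'author:"Accomazzi, A." year:2015', "fl": "bibcode,title", "rows": 5},
)
for doc in resp.json().get("response", {}).get("docs", []):
    print(doc["bibcode"], doc.get("title"))
```

Because the token identifies a user, the same mechanism can expose user-specific resources such as private libraries through the API.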
Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data
NASA Astrophysics Data System (ADS)
Hong, J. H.; Su, Y. T.
2016-06-01
With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be respectively used for presenting analyzed results, namely, virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: standardized metadata serve as the known facts about the selected geospatial resources, algorithms analyze differences in the temporality and quality of those resources, and the analyzed results are automatically transformed into visual aids. It successfully presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it fails to provide a friendly and correct working platform.
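The automatic-awareness step can be pictured as a simple rule-based check: compare a layer's metadata facts against the task's requirements and decide which visual aid to trigger. The field names, thresholds, and the mapping to specific aids below are invented for illustration and are not the paper's actual rules.

```python
# Toy sketch of the automatic-awareness step: read standardized metadata facts for a
# candidate layer, compare them with the task's requirements, and decide which visual
# aid to trigger. Field names, thresholds, and the aid mapping are illustrative.
layer_metadata = {"title": "Land use 2008", "temporal_extent": (2008, 2008),
                  "positional_accuracy_m": 25.0}
task_requirements = {"reference_year": 2016, "max_accuracy_m": 10.0}

def assess_fitness(meta, req):
    aids = []
    start, end = meta["temporal_extent"]
    if not (start <= req["reference_year"] <= end):
        aids.append(("informative_window",
                     f"Layer represents {start}-{end}, task refers to {req['reference_year']}"))
    if meta["positional_accuracy_m"] > req["max_accuracy_m"]:
        aids.append(("symbol_transformation", "Positional accuracy below task requirement"))
    return aids or [("augmented_TOC", "Layer fits the task")]

for aid, message in assess_fitness(layer_metadata, task_requirements):
    print(aid, "->", message)
```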
Edgerton, Elizabeth; Reiney, Erin; Mueller, Siobhan; Reicherter, Barry; Curtis, Katherine; Waties, Stephanie; Limber, Susan P
2016-05-01
Every day in classrooms, playgrounds and school hallways, and through text messages and mobile technology apps, children are bullied by other children. Conversations about this bullying, including what it is, who is involved, and how to stop it, are taking place online. To fill a need for relevant, research-based materials on bullying, the U.S. Department of Health and Human Services' Health Resources and Services Administration worked with Widmeyer Communications to investigate the scope of media conversations about bullying and discover new strategies for promoting appropriate public health messages about bullying to intended audiences. Key components of the methodology included: analyzing common search terms and aligning social media content with terms used in searches rather than technical language; identifying influencers in social media spheres, cultivating relationships with them, and sharing their positive, relevant content; examining which digital formats are most popular for sharing and creating content across platforms; tracking and reporting on a wide variety of metrics (such as click-through and engagement rates and reach, resonance, relevance, and Klout scores) to understand conversations around bullying; and looking at online conversations and engaging participants using applicable resources and calls to action. A key finding was a significant gap between search terms and online content, which has led to recommendations and comprehensive ideas for improving the reach and resonance of StopBullying.gov content and communications. © 2016 Society for Public Health Education.
Development of a remote proton radiation therapy solution over internet2.
Belard, Arnaud; Tinnel, Brent; Wilson, Steve; Ferro, Ralph; O'Connell, John
2009-12-01
Through our existing partnership, our research program has leveraged the benefits of proton radiation therapy through the development of a robust telemedicine solution for remote proton therapy planning. Our proof-of-concept system provides a cost-effective and functional videoconferencing desktop platform for both ad-hoc and scheduled communication, as well as a robust interface for data collaboration (application-sharing of a commercial radiation treatment planning package). Over a 2-year period, our evaluation of this model has highlighted the inherent benefits of this affordable remote treatment planning solution, i.e., (1) giving physicians the ability to remotely participate in refining and generating proton therapy plans via a secure and robust Internet2 VPN tunnel to the University of Pennsylvania's commercial proton treatment planning package; (2) allowing cancer-care providers sending patients to a proton treatment facility to participate in treatment planning decisions by enabling referring or accepting providers to initiate ad-hoc, point-to-point communication with their counterparts to clarify and resolve issues arising before or during patient treatment; and thus (3) allowing stewards of an otherwise highly centralized resource the ability to encourage wider participation with and referrals to sparsely located proton treatment centers by adapting telemedicine techniques that allow sharing of proton therapy planning services. We believe that our elegant and very affordable approach to remote proton treatment planning opens the door to greater worldwide referrals to the scarce resource of proton treatment units and wide-ranging scientific collaboration, both nationally and internationally.
1984-04-01
civilian facility. In FY 79, of the $20 million that the Veterans Administration (VA) spent on shared services, only $17,000 was for services shared...(2) present incentives to encourage shared services are inadequate; and (3) such sharing of resources can be effected without a detrimental impact on...Regionalization in Perspective", which provided an excellent review of hospital regionalization and the potential benefits associated with shared services.
PR-PR: Cross-Platform Laboratory Automation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linshiz, G; Stawski, N; Goyal, G
To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.
PR-PR: cross-platform laboratory automation system.
Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J
2014-08-15
To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.
Humphries, Debbie L; Hyde, Justeen; Hahn, Ethan; Atherly, Adam; O'Keefe, Elaine; Wilkinson, Geoffrey; Eckhouse, Seth; Huleatt, Steve; Wong, Samuel; Kertanis, Jennifer
2018-01-01
Forty one percent of local health departments in the U.S. serve jurisdictions with populations of 25,000 or less. Researchers, policymakers, and advocates have long questioned how to strengthen public health systems in smaller municipalities. Cross-jurisdictional sharing may increase quality of service, access to resources, and efficiency of resource use. To characterize perceived strengths and challenges of independent and comprehensive sharing approaches, and to assess cost, quality, and breadth of services provided by independent and sharing health departments in Connecticut (CT) and Massachusetts (MA). We interviewed local health directors or their designees from 15 comprehensive resource-sharing jurisdictions and 54 single-municipality jurisdictions in CT and MA using a semi-structured interview. Quantitative data were drawn from closed-ended questions in the semi-structured interviews; municipal demographic data were drawn from the American Community Survey and other public sources. Qualitative data were drawn from open-ended questions in the semi-structured interviews. The findings from this multistate study highlight advantages and disadvantages of two common public health service delivery models - independent and shared. Shared service jurisdictions provided more community health programs and services, and invested significantly more ($120 per thousand (1K) population vs. $69.5/1K population) on healthy food access activities. Sharing departments had more indicators of higher quality food safety inspections (FSIs), and there was a non-linear relationship between cost per FSI and number of FSI. Minimum cost per FSI was reached above the total number of FSI conducted by all but four of the jurisdictions sampled. Independent jurisdictions perceived their governing bodies to have greater understanding of the roles and responsibilities of local public health, while shared service jurisdictions had fewer staff per 1,000 population. There are trade-offs with sharing and remaining independent. Independent health departments serving small jurisdictions have limited resources but strong local knowledge. Multi-municipality departments have more resources but require more time and investment in governance and decision-making. When making decisions about the right service delivery model for a given municipality, careful consideration should be given to local culture and values. Some economies of scale may be achieved through resource sharing for municipalities <25,000 population.
ERIC Educational Resources Information Center
Helal, Ahmed H., Ed.; Weiss, Joachim W.
This proceedings includes the following papers presented at the 16th International Essen Symposium: "Electronic Resource Sharing: It May Seem Obvious, But It's Not as Simple as it Looks" (Herbert S. White); "Resource Sharing through OCLC: A Comprehensive Approach" (Janet Mitchell); "The Business Information Network:…
Willems, Roel M.; Hagoort, Peter
2016-01-01
Many studies have revealed shared music–language processing resources by finding an influence of music harmony manipulations on concurrent language processing. However, the nature of the shared resources has remained ambiguous. They have been argued to be syntax specific and thus due to shared syntactic integration resources. An alternative view regards them as related to general attention and, thus, not specific to syntax. The present experiments evaluated these accounts by investigating the influence of language on music. Participants were asked to provide closure judgements on harmonic sequences in order to assess the appropriateness of sequence endings. At the same time participants read syntactic garden-path sentences. Closure judgements revealed a change in harmonic processing as the result of reading a syntactically challenging word. We found no influence of an arithmetic control manipulation (experiment 1) or semantic garden-path sentences (experiment 2). Our results provide behavioural evidence for a specific influence of linguistic syntax processing on musical harmony judgements. A closer look reveals that the shared resources appear to be needed to hold a harmonic key online in some form of syntactic working memory or unification workspace related to the integration of chords and words. Overall, our results support the syntax specificity of shared music–language processing resources. PMID:26998339
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When must I resubmit Platform Verification...
Perronne, Christian; Adjagba, Alex; Duclos, Philippe; Floret, Daniel; Houweling, Hans; Le Goaster, Corinne; Lévy-Brühl, Daniel; Meyer, François; Senouci, Kamel; Wichmann, Ole
2016-03-08
Many experts on vaccination are convinced that efforts should be made to encourage increased collaboration between National Immunization Technical Advisory Groups on immunization (NITAGs) worldwide. International meetings were held in Berlin, Germany, in 2010 and 2011, to discuss improvement of the methodologies for the development of evidence-based vaccination recommendations, recognizing the need for collaboration and/or sharing of resources in this effort. A third meeting was held in Paris, France, in December 2014, to consider the design of specific practical activities and an organizational structure to enable effective and sustained collaboration. The following conclusions were reached: (i) The proposed collaboration needs a core functional structure and the establishment or strengthening of an international network of NITAGs. (ii) Priority subjects for collaborative work are background information for recommendations, systematic reviews, mathematical models, health economic evaluations and establishment of common frameworks and methodologies for reviewing and grading the evidence. (iii) The programme of collaborative work should begin with participation of a limited number of NITAGs which already have a high level of expertise. The amount of joint work could be increased progressively through practical activities and pragmatic examples. Due to similar priorities and already existing structures, this should be organized at regional or subregional level. For example, in the European Union a project is funded by the European Centre for Disease Prevention and Control (ECDC) with the aim to set up a network for improving data, methodology and resource sharing and thereby supporting NITAGs. Such regional networking activities should be carried out in collaboration with the World Health Organization (WHO). (iv) A global steering committee should be set up to promote international exchange between regional networks and to increase the involvement of less experienced NITAGs. NITAGs already collaborate at the global level via the NITAG Resource Centre, a web-based platform developed by the Health Policy and Institutional Development Unit (WHO Collaborating Centre) of the Agence de Médecine Préventive (AMP-HPID). It would be appropriate to continue facilitating the coordination of this global network through the AMP-HPID NITAG Resource Centre. (v) While sharing work products and experiences, each NITAG would retain responsibility for its own decision-making and country-specific recommendations. Copyright © 2016. Published by Elsevier Ltd.. All rights reserved.
NASA Astrophysics Data System (ADS)
Johns, E. M.; Mayernik, M. S.; Boler, F. M.; Corson-Rikert, J.; Daniels, M. D.; Gross, M. B.; Khan, H.; Maull, K. E.; Rowan, L. R.; Stott, D.; Williams, S.; Krafft, D. B.
2015-12-01
Researchers seek information and data through a variety of avenues: published literature, search engines, repositories, colleagues, etc. In order to build a web application that leverages linked open data to enable multiple paths for information discovery, the EarthCollab project has surveyed two geoscience user communities to consider how researchers find and share scholarly output. EarthCollab, a cross-institutional, EarthCube funded project partnering UCAR, Cornell University, and UNAVCO, is employing the open-source semantic web software, VIVO, as the underlying technology to connect the people and resources of virtual research communities. This study will present an analysis of survey responses from members of the two case study communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. The survey results illustrate the types of research products that respondents indicate should be discoverable within a digital platform and the current methods used to find publications, data, personnel, tools, and instrumentation. The responses showed that scientists rely heavily on general purpose search engines, such as Google, to find information, but that data center websites and the published literature were also critical sources for finding collaborators, data, and research tools. The survey participants also identify additional features of interest for an information platform, such as search engine indexing, connection to institutional web pages, generation of bibliographies and CVs, and outward linking to social media. Through the survey, the user communities prioritized the types of information most important for displaying and describing their work within a research profile. The analysis of this survey will inform our further development of a platform that will facilitate different types of information discovery strategies, and help researchers to find and use the associated resources of a research project.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 6 2013-01-01 2013-01-01 false Cost-sharing. 624.7 Section 624.7 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE WATER RESOURCES EMERGENCY WATERSHED PROTECTION § 624.7 Cost-sharing. (a) Except as provided in...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 6 2014-01-01 2014-01-01 false Cost-sharing. 624.7 Section 624.7 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE WATER RESOURCES EMERGENCY WATERSHED PROTECTION § 624.7 Cost-sharing. (a) Except as provided in...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 6 2011-01-01 2011-01-01 false Cost-sharing. 624.7 Section 624.7 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE WATER RESOURCES EMERGENCY WATERSHED PROTECTION § 624.7 Cost-sharing. (a) Except as provided in...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 6 2012-01-01 2012-01-01 false Cost-sharing. 624.7 Section 624.7 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE WATER RESOURCES EMERGENCY WATERSHED PROTECTION § 624.7 Cost-sharing. (a) Except as provided in...
State Support for Open Educational Resources: Key Findings from Achieve's OER Institute
ERIC Educational Resources Information Center
Achieve, Inc., 2013
2013-01-01
Open Educational Resources (OER) offer unique new opportunities for educators to share quality learning resources, especially in an increasingly digital world. Forty-six states and the District of Columbia have adopted the Common Core State Standards (CCSS), providing them with the unprecedented advantage of being able to share resources that are…
ERIC Educational Resources Information Center
Lange, Karen
The Wyoming Academic Libraries Resource Project was initiated to improve cooperation and resource sharing by developing an interconnected information access and delivery system among Wyoming's academic libraries and the State Library. The goal was to formalize communication, cooperation, and resource sharing by developing an Ariel document…
The Socio-Technical Design of a Library and Information Science Collaboratory
ERIC Educational Resources Information Center
Lassi, Monica; Sonnenwald, Diane H.
2013-01-01
Introduction: We present a prototype collaboratory, a socio-technical platform to support sharing research data collection instruments in library and information science. No previous collaboratory has attempted to facilitate sharing digital research data collection instruments among library and information science researchers. Method: We have…
The demands and resources arising from shared office spaces.
Morrison, Rachel L; Macky, Keith A
2017-04-01
The prevalence of flexible and shared office spaces is increasing significantly, yet the socioemotional outcomes associated with these environments are under researched. Utilising the job demands-resources (JD-R) model we investigate both the demands and the resources that can accrue to workers as a result of shared work environments and hot-desking. Data were collected from work experienced respondents (n = 1000) assessing the extent to which they shared their office space with others, along with demands comprising distractions, uncooperative behaviours, distrust, and negative relationships, and resources from co-worker friendships and supervisor support. We found that, as work environments became more shared (with hot-desking being at the extreme end of the continuum), not only were there increases in demands, but co-worker friendships were not improved and perceptions of supervisory support decreased. Findings are discussed in relation to employee well-being and recommendations are made regarding how best to ameliorate negative consequences of shared work environments. Copyright © 2016 Elsevier Ltd. All rights reserved.
Coordinating Resource Usage through Adaptive Service Provisioning in Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Fok, Chien-Liang; Roman, Gruia-Catalin; Lu, Chenyang
Wireless sensor networks (WSNs) exhibit high levels of network dynamics and consist of devices with limited energy. This results in the need to coordinate applications not only at the functional level, as is traditionally done, but also in terms of resource utilization. In this paper, we present a middleware that does this using adaptive service provisioning. Novel service binding strategies automatically adapt application behavior when opportunities for energy savings surface, and switch providers when the network topology changes. The former is accomplished by providing limited information about the energy consumption associated with using various services, systematically exploiting opportunities for sharing service invocations, and exploiting the broadcast nature of wireless communication in WSNs. The middleware has been implemented and evaluated on two disparate WSN platforms, the TelosB and Imote2. Empirical results show that adaptive service provisioning can enable energy-aware service binding decisions that result in increased energy efficiency and significantly increase service availability, while imposing minimal additional burden on the application, service, and device developers. Two applications, medical patient monitoring and structural health monitoring, demonstrate the middleware's efficacy.
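The energy-aware binding idea can be made concrete with a small sketch: among candidate providers of a service, bind to the one with the lowest advertised per-invocation energy cost, and let co-located requesters share a single invocation instead of issuing duplicates. The provider names, energy figures, and request list are invented for illustration and do not reflect the middleware's actual interfaces.

```python
# Toy sketch of energy-aware service binding: pick the provider with the lowest
# advertised per-invocation energy cost, and let co-located requesters share one
# invocation. Provider names, costs, and requesters are invented.
providers = [
    {"node": "mote-3", "service": "temperature", "energy_mj": 2.4},
    {"node": "mote-7", "service": "temperature", "energy_mj": 1.1},
    {"node": "mote-9", "service": "temperature", "energy_mj": 3.0},
]

def bind(service, catalog):
    candidates = [p for p in catalog if p["service"] == service]
    return min(candidates, key=lambda p: p["energy_mj"]) if candidates else None

pending_requests = ["app-A", "app-B", "app-C"]      # co-located requesters
binding = bind("temperature", providers)
print(f"invoke {binding['node']} once, share the result with {pending_requests}")
print(f"energy spent: {binding['energy_mj']} mJ instead of "
      f"{binding['energy_mj'] * len(pending_requests)} mJ")
```

When the topology changes and a provider disappears, rebinding is just a matter of re-running the selection over the updated catalog.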
[Application of an improved model of a job-matching platform for nurses].
Huang, Way-Ren; Lin, Chiou-Fen
2015-04-01
The three-month attrition rate for new nurses in Taiwan remains high. Many hospitals rely on traditional recruitment methods to find new nurses, yet it appears that their efficacy is less than ideal. To effectively solve this manpower shortage, a nursing resource platform is a project worth developing in the future. This study aimed to utilize a quality-improvement model to establish communication between hospitals and nursing students and create a customized employee-employer information-matching platform to help nursing students enter the workforce. This study was structured around a quality-improvement model and used current situation analysis, literature review, focus-group discussions, and process re-engineering to formulate necessary content for a job-matching platform for nursing. The concept of an academia-industry strategic alliance helped connect supply and demand within the same supply chain. The nurse job-matching platform created in this study provided job flexibility as well as job suitability assessments and continued follow-up and services for nurses after entering the workforce to provide more accurate matching of employers and employees. The academia-industry strategic alliance, job suitability, and long-term follow-up designed in this study are all new features in Taiwan's human resource service systems. The proposed human resource process re-engineering provides nursing students facing graduation with a professionally managed human resources platform. Allowing students to find an appropriate job prior to graduation will improve willingness to work and employee retention.
A Community Assessment Tool for Education Resources
NASA Astrophysics Data System (ADS)
Hou, C. Y.; Soyka, H.; Hutchison, V.; Budden, A. E.
2016-12-01
In order to facilitate and enhance better understanding of how to conserve life on earth and the environment that sustains it, Data Observation Network for Earth (DataONE) develops, implements, and shares educational activities and materials as part of its commitment to the education of its community, including scientific researchers, educators, and the public. Creating and maintaining educational materials that remain responsive to community needs is reliant on careful evaluations in order to enhance current and future resources. DataONE's extensive collaboration with individuals and organizations has informed the development of its educational resources and through these interactions, the need for a comprehensive, customizable education evaluation instrument became apparent. In this presentation, the authors will briefly describe the design requirements and research behind a prototype instrument that is intended to be used by the community for evaluation of its educational activities and resources. We will then demonstrate the functionality of a web based platform that enables users to identify the type of educational activity across multiple axes. This results in a set of structured evaluation questions that can be included in a survey instrument. Users can also access supporting documentation describing the types of question included in the output or simply download a full editable instrument. Our aim is that by providing the community with access to a structured evaluation instrument, Earth/Geoscience educators will be able to gather feedback easily and efficiently in order to help maintain the quality, currency/relevancy, and value of their resources, and ultimately, support a more data literate community.
Resource Sharing: A Necessity for the '80s.
ERIC Educational Resources Information Center
Lavo, Barbara, Comp.
Papers presented at a 1981 seminar on library resource sharing covered topics related to Australasian databases, Australian and New Zealand document delivery systems, and shared acquisition and cataloging for special libraries. The papers included: (1) "AUSINET: Australasia's Information Network?" by Ian McCallum; (2) "Australia/New…
30 CFR 220.022 - Calculation of net profit share payment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Calculation of net profit share payment. 220.022 Section 220.022 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR MINERALS REVENUE MANAGEMENT ACCOUNTING PROCEDURES FOR DETERMINING NET PROFIT SHARE PAYMENT FOR OUTER CONTINENTAL...
Secure public cloud platform for medical images sharing.
Pan, Wei; Coatrieux, Gouenou; Bouslimi, Dalel; Prigent, Nicolas
2015-01-01
Cloud computing promises medical imaging services offering large storage and computing capabilities for limited costs. In this data outsourcing framework, one of the greatest issues to deal with is data security. To do so, we propose to secure a public cloud platform devoted to medical image sharing by defining and deploying a security policy so as to control various security mechanisms. This policy stands on a risk assessment we conducted so as to identify security objectives with a special interest for digital content protection. These objectives are addressed by means of different security mechanisms like access and usage control policy, partial-encryption and watermarking.
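To illustrate the general idea of watermarking as a content-protection mechanism, the hedged Python sketch below embeds and recovers a payload in the least-significant bits of an image array. It is a generic LSB scheme given purely for illustration; the authors' actual watermarking and partial-encryption methods, and all names and values used here, are assumptions.

```python
import numpy as np

def embed_lsb_watermark(image: np.ndarray, payload_bits: np.ndarray) -> np.ndarray:
    """Embed a flat array of 0/1 bits into the least-significant bits of an 8-bit image."""
    flat = image.astype(np.uint8).flatten()
    if payload_bits.size > flat.size:
        raise ValueError("payload does not fit into the cover image")
    flat[:payload_bits.size] = (flat[:payload_bits.size] & 0xFE) | payload_bits
    return flat.reshape(image.shape)

def extract_lsb_watermark(image: np.ndarray, n_bits: int) -> np.ndarray:
    """Recover the first n_bits embedded by embed_lsb_watermark."""
    return image.astype(np.uint8).flatten()[:n_bits] & 0x01

# Example: hide a short, hypothetical case tag in a synthetic 64x64 grayscale image.
cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
tag = np.frombuffer(b"case-0042", dtype=np.uint8)
bits = np.unpackbits(tag)
marked = embed_lsb_watermark(cover, bits)
recovered = np.packbits(extract_lsb_watermark(marked, bits.size)).tobytes()
assert recovered == b"case-0042"
```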
Data integration: Combined imaging and electrophysiology data in the cloud.
Kini, Lohith G; Davis, Kathryn A; Wagenaar, Joost B
2016-01-01
There has been an increasing effort to correlate electrophysiology data with imaging in patients with refractory epilepsy over recent years. IEEG.org provides a free-access, rapidly growing archive of imaging data combined with electrophysiology data and patient metadata. It currently contains over 1200 human and animal datasets, with multiple data modalities associated with each dataset (neuroimaging, EEG, EKG, de-identified clinical and experimental data, etc.). The platform is developed around the concept that scientific data sharing requires a flexible platform that allows sharing of data from multiple file formats. IEEG.org provides high- and low-level access to the data in addition to providing an environment in which domain experts can find, visualize, and analyze data in an intuitive manner. Here, we present a summary of the current infrastructure of the platform, available datasets and goals for the near future. Copyright © 2015 Elsevier Inc. All rights reserved.
Research on sudden environmental pollution public service platform construction based on WebGIS
NASA Astrophysics Data System (ADS)
Bi, T. P.; Gao, D. Y.; Zhong, X. Y.
2016-08-01
In order to realize the social sharing and service of emergency-response information for sudden pollution accidents, so that the public can access risk source information, dangerous goods control technology services, and related resources, SQL Server and ArcSDE are used to build a spatial database that stores all kinds of information, including risk sources, hazardous chemicals, and handling methods in case of accidents. Combined with Chinese atmospheric environmental assessment standards, the SCREEN3 atmospheric dispersion model and a one-dimensional liquid diffusion model are implemented to support queries of related information and display of the diffusion effect under a B/S (browser/server) architecture. Based on WebGIS technology, the sudden environmental pollution public service platform is developed in C#.Net. As a result, the public service platform can make risk assessments and provide the best emergency processing services.
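SCREEN3 is built around a Gaussian plume formulation; as a rough illustration of that kind of screening estimate (not the actual SCREEN3 algorithm, which uses stability-class-dependent dispersion curves and further corrections), the following Python sketch computes a ground-level concentration with assumed dispersion coefficients and parameter values.

```python
import math

def plume_concentration(q_gs, u_ms, x_m, y_m, h_m, a=0.08, b=0.06):
    """Ground-level Gaussian plume concentration (g/m^3).

    q_gs: emission rate (g/s), u_ms: wind speed (m/s),
    x_m/y_m: downwind/crosswind distance (m), h_m: effective release height (m).
    Dispersion coefficients use a crude power law sigma = a*x, b*x (assumed, not SCREEN3's curves).
    """
    sigma_y = a * x_m
    sigma_z = b * x_m
    lateral = math.exp(-(y_m ** 2) / (2 * sigma_y ** 2))
    vertical = 2 * math.exp(-(h_m ** 2) / (2 * sigma_z ** 2))  # ground reflection at z = 0
    return q_gs / (2 * math.pi * u_ms * sigma_y * sigma_z) * lateral * vertical

# Example: 50 g/s release, 3 m/s wind, receptor 500 m downwind on the plume centerline.
print(plume_concentration(q_gs=50.0, u_ms=3.0, x_m=500.0, y_m=0.0, h_m=20.0))
```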
The EGS Data Collaboration Platform: Enabling Scientific Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weers, Jonathan D; Johnston, Henry; Huggins, Jay V
Collaboration in the digital age has been stifled in recent years. Reasonable responses to legitimate security concerns have created a virtual landscape of silos and fortified castles incapable of sharing information efficiently. This trend is unfortunately opposed to the geothermal scientific community's migration toward larger, more collaborative projects. To facilitate efficient sharing of information between team members from multiple national labs, universities, and private organizations, the 'EGS Collab' team has developed a universally accessible, secure data collaboration platform and has fully integrated it with the U.S. Department of Energy's (DOE) Geothermal Data Repository (GDR) and the National Geothermal Data System (NGDS). This paper will explore some of the challenges of collaboration in the modern digital age, highlight strategies for active data management, and discuss the integration of the EGS Collab data management platform with the GDR to enable scientific discovery through the timely dissemination of information.
SPARQLGraph: a web-based platform for graphically querying biological Semantic Web databases.
Schweiger, Dominik; Trajanoski, Zlatko; Pabinger, Stephan
2014-08-15
Semantic Web has established itself as a framework for using and sharing data across applications and database boundaries. Here, we present a web-based platform for querying biological Semantic Web databases in a graphical way. SPARQLGraph offers an intuitive drag & drop query builder, which converts the visual graph into a query and executes it on a public endpoint. The tool integrates several publicly available Semantic Web databases, including the databases of the just recently released EBI RDF platform. Furthermore, it provides several predefined template queries for answering biological questions. Users can easily create and save new query graphs, which can also be shared with other researchers. This new graphical way of creating queries for biological Semantic Web databases considerably facilitates usability as it removes the requirement of knowing specific query languages and database structures. The system is freely available at http://sparqlgraph.i-med.ac.at.
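Queries of the kind SPARQLGraph generates can also be submitted to a public SPARQL endpoint programmatically. The sketch below is a hedged illustration using the Python requests library; the endpoint URL, prefixes and query shape are placeholders rather than SPARQLGraph's own templates.

```python
import requests

# Placeholder endpoint and query; SPARQLGraph builds queries like this visually
# and submits them to a chosen public endpoint on the user's behalf.
ENDPOINT = "https://example.org/sparql"
QUERY = """
PREFIX up: <http://purl.uniprot.org/core/>
SELECT ?protein ?name
WHERE {
  ?protein a up:Protein ;
           up:recommendedName/up:fullName ?name .
}
LIMIT 10
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["protein"]["value"], row["name"]["value"])
```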
Cloud-based image sharing network for collaborative imaging diagnosis and consultation
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Gu, Yiping; Wang, Mingqing; Sun, Jianyong; Li, Ming; Zhang, Weiqiang; Zhang, Jianguo
2018-03-01
In this presentation, we present a new approach to designing a cloud-based image sharing network for collaborative imaging diagnosis and consultation over the Internet, which enables radiologists, specialists, and physicians located at different sites to collaboratively and interactively perform imaging diagnosis or consultation for difficult or emergency cases. The designed network combines a regional RIS, grid-based image distribution management, an integrated video conferencing system, and multi-platform interactive image display devices, together with secured messaging and data communication. There are three kinds of components in the network: edge servers, a grid-based imaging document registry and repository, and multi-platform display devices. The network has been deployed on a public cloud platform of Alibaba since March 2017 and is used for small lung nodule and early-stage lung cancer diagnosis services between the Radiology departments of Huadong Hospital in Shanghai and the First Hospital of Jiaxing in Zhejiang Province.
Processing structure in language and music: a case for shared reliance on cognitive control.
Slevc, L Robert; Okada, Brooke M
2015-06-01
The relationship between structural processing in music and language has received increasing interest in the past several years, spurred by the influential Shared Syntactic Integration Resource Hypothesis (SSIRH; Patel, Nature Neuroscience, 6, 674-681, 2003). According to this resource-sharing framework, music and language rely on separable syntactic representations but recruit shared cognitive resources to integrate these representations into evolving structures. The SSIRH is supported by findings of interactions between structural manipulations in music and language. However, other recent evidence suggests that such interactions also can arise with nonstructural manipulations, and some recent neuroimaging studies report largely nonoverlapping neural regions involved in processing musical and linguistic structure. These conflicting results raise the question of exactly what shared (and distinct) resources underlie musical and linguistic structural processing. This paper suggests that one shared resource is prefrontal cortical mechanisms of cognitive control, which are recruited to detect and resolve conflict that occurs when expectations are violated and interpretations must be revised. By this account, musical processing involves not just the incremental processing and integration of musical elements as they occur, but also the incremental generation of musical predictions and expectations, which must sometimes be overridden and revised in light of evolving musical input.
Knowledge Management: A Conceptual Platform for the Sharing of Ideas.
ERIC Educational Resources Information Center
Mahdjoubi, Darius; Harmon, Glynn
2001-01-01
The concept of the learning organization and intellectual capital were instrumental in the beginning stage of knowledge management, about 1995. From the spontaneous combination of these two fields, the modern concept of knowledge management as a conceptual platform emerged. The seven main fields that are so far most intimately connected to…
MetaCoMET: a web platform for discovery and visualization of the core microbiome
USDA-ARS?s Scientific Manuscript database
A key component of the analysis of microbiome datasets is the identification of OTUs shared between multiple experimental conditions, commonly referred to as the core microbiome. Results: We present a web platform named MetaCoMET that enables the discovery and visualization of the core microbiome an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-29
... activities are to enable effective sharing, integration, standardization, and analysis of heterogeneous data from collaborative translational research by mobilizing the tranSMART open- source and open-data...: (a) Establish and sustain tranSMART as the preferred data sharing and analytics platform for...
Many of us nowadays invest significant amounts of time in sharing our activities and opinions with friends and family via social networking tools such as Facebook, Twitter or other related websites. However, despite the availability of many platforms for scientists to connect and...
Resource Sharing in an Electronic Age: Past, Present, and Future.
ERIC Educational Resources Information Center
Jones, Adrian
Librarians' work has become more challenging and complex over the past 15 years. Fifteen years ago, the telephone was a librarian's most used and most effective instrument, and librarians mostly relied on the resources within their own walls. In that era, resource sharing placed substantial burdens on larger libraries, and the resources of smaller…
Studying interregional wildland fire engine assignments for large fire suppression
Erin J. Belval; Yu Wei; David E. Calkin; Crystal S. Stonesifer; Matthew P. Thompson; John R. Tipton
2017-01-01
One crucial component of large fire response in the United States (US) is the sharing of wildland firefighting resources between regions: resources from regions experiencing low fire activity supplement resources in regions experiencing high fire activity. An important step towards improving the efficiency of resource sharing and related policies is to develop a better...
Improving Evidence Dissemination and Accessibility through a Mobile-based Resource Platform.
Zhu, Zheng; Xing, Weijie; Hu, Yan; Zhou, Yingfeng; Gu, Ying
2018-05-28
Current mobile information technologies fundamentally influence evidence dissemination from the perspective of both evidence seekers and evidence providers. However, no related study has tried using a mobile-based platform to disseminate evidence in China. The main objective of this study was to develop a mobile-based evidence resource platform and to evaluate its effects on improving nurses' access to evidence-based practice resources and on meeting users' demands. The mobile-based evidence resource platform was developed in 2014. A cross-sectional study was conducted over a period of 2 months between December 2015 and January 2016 to evaluate user experiences of and preferences regarding the platform. Descriptive analysis was adopted to analyze information and its communication effects from December 2014 to March 2017. A total of 472 participants met the inclusion criteria and responded to the survey. High scores were received for the overall rating (4.34 ± 0.67), evidence section (4.30 ± 0.63), learning materials section (4.26 ± 0.65), news section (4.27 ± 0.66), readability (4.38 ± 0.63), design and structure (4.38 ± 0.63), and interactivity (3.58 ± 0.84). As of March 31, 2017, the total number of followers was 28,954. The total number of readings was 584,834. The most current WCI value was 388.72. Our study demonstrated that the mobile-based platform for evidence transfer can promote the accessibility of evidence and meet users' demands. This mobile-based platform is currently available in the WeChat application environment. It is a sound option for healthcare professionals to learn about evidence-based practice (EBP) and to disseminate evidence in China.
Sharing Ideas: Tough Times Encourage Colleges to Collaborate
ERIC Educational Resources Information Center
Fain, Paul; Blumenstyk, Goldie; Sander, Libby
2009-01-01
Tough times are encouraging colleges to share resources in a variety of areas, including campus security, research, and degree programs. Despite its veneer of cooperation, higher education is a competitive industry, where resource sharing is eyed warily. But the recession is chipping away at that reluctance, and institutions are pursuing…
Luo, Jake; Apperson-Hansen, Carolyn; Pelfrey, Clara M; Zhang, Guo-Qiang
2014-11-30
Cross-institutional cross-disciplinary collaboration has become a trend as researchers move toward building more productive and innovative teams for scientific research. Research collaboration is significantly changing the organizational structure and strategies used in the clinical and translational science domain. However, due to the obstacles of diverse administrative structures, differences in area of expertise, and communication barriers, establishing and managing a cross-institutional research project is still a challenging task. We address these challenges by creating an integrated informatics platform to reduce the barriers to biomedical research collaboration. The Request Management System (RMS) is an informatics infrastructure designed to transform a patchwork of expertise and resources into an integrated support network. The RMS facilitates investigators' initiation of new collaborative projects and supports the management of the collaboration process. In RMS, experts and their knowledge areas are categorized and managed structurally to provide consistent service. A role-based collaborative workflow is tightly integrated with domain experts and services to streamline and monitor the life-cycle of a research project. The RMS has so far tracked over 1,500 investigators with over 4,800 tasks. The research network based on the data collected in RMS illustrated that the investigators' collaborative projects increased close to 3 times from 2009 to 2012. Our experience with RMS indicates that the platform reduces barriers for cross-institutional collaboration of biomedical research projects. Building a new generation of infrastructure to enhance cross-disciplinary and multi-institutional collaboration has become an important yet challenging task. In this paper, we share the experience of developing and utilizing a collaborative project management system. The results of this study demonstrate that a web-based integrated informatics platform can facilitate and increase research interactions among investigators.
VOLTTRON™: An Agent Platform for Integrating Electric Vehicles and Smart Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haack, Jereme N.; Akyol, Bora A.; Tenney, Nathan D.
2013-12-06
The VOLTTRON™ platform provides a secure environment for the deployment of intelligent applications in the smart grid. VOLTTRON design is based on the needs of control applications running on small form factor devices, namely security and resource guarantees. Services such as resource discovery, secure agent mobility, and interacting with smart and legacy devices are provided by the platform to ease the development of control applications and accelerate their deployment. VOLTTRON platform has been demonstrated in several different domains that influenced and enhanced its capabilities. This paper will discuss the features of VOLTTRON and highlight its usage to coordinate electric vehicle charging with home energy usage.
30 CFR 250.1725 - When do I have to remove platforms and other facilities?
Code of Federal Regulations, 2011 CFR
2011-07-01
... production and transportation, as well as other energy-related or marine-related uses (including LNG) for... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When do I have to remove platforms and other facilities? 250.1725 Section 250.1725 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND...
STEMEdhub: Supporting STEM Education Initiatives via the HUBzero Platform
ERIC Educational Resources Information Center
Lehman, James D.; Ertmer, Peggy A.; Bessenbacher, Ann M.
2015-01-01
Built as one of 60+ hubs on the HUBzero platform, STEMEdhub was developed in 2011 as a resource for research, education, and collaboration in STEM education. The hub currently supports 82 different groups. In this article, the authors describe two specific groups (SLED and AAU) that are taking advantage of numerous communication and resource tools…
The International Space Station human life sciences experiment implementation process
NASA Technical Reports Server (NTRS)
Miller, L. J.; Haven, C. P.; McCollum, S. G.; Lee, A. M.; Kamman, M. R.; Baumann, D. K.; Anderson, M. E.; Buderer, M. C.
2001-01-01
The selection, definition, and development phases of a Life Sciences flight research experiment have been consistent throughout the past decade. The implementation process, however, has changed significantly within the past two years. This change is driven primarily by the shift from highly integrated, dedicated research missions on platforms with well-defined processes to self-contained experiments with stand-alone operations on platforms which are being concurrently designed. For experiments manifested on the International Space Station (ISS) and/or on short duration missions, the more modular, streamlined, and independent the individual experiment is, the more likely it is to be successfully implemented before the ISS assembly is completed. During the assembly phase of the ISS, science operations are lower in priority than the construction of the station. After the station has been completed, it is expected that more resources will be available to perform research. The complexity of implementing investigations increases with the logistics needed to perform the experiment. Examples of logistics issues include: hardware unique to the experiment; large up and down mass and volume needs; access to crew and hardware during the ascent or descent phases; maintenance of hardware and supplies with a limited shelf life; baseline data collection schedules with lengthy sessions or sessions close to the launch or landing; onboard stowage availability, particularly cold stowage; and extensive training where highly proficient skills must be maintained. As the ISS processes become better defined, experiment implementation will meet new challenges due to distributed management, on-orbit resource sharing, and adjustments to crew availability pre- and post-increment. © 2001 Elsevier Science Ltd. All rights reserved.
Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing
NASA Astrophysics Data System (ADS)
Chine, Karim
The UK, through the e-Science program, the US through the NSF-funded cyber infrastructure and the European Union through the ICT Calls aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge".1 The Grid (Foster, 2002; Foster; Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.
Chang, Chia-Jung; Osoegawa, Kazutoyo; Milius, Robert P; Maiers, Martin; Xiao, Wenzhong; Fernandez-Viňa, Marcelo; Mack, Steven J
2018-02-01
For over 50 years, the International HLA and Immunogenetics Workshops (IHIW) have advanced the fields of histocompatibility and immunogenetics (H&I) via community sharing of technology, experience and reagents, and the establishment of ongoing collaborative projects. Held in the fall of 2017, the 17th IHIW focused on the application of next generation sequencing (NGS) technologies for clinical and research goals in the H&I fields. NGS technologies have the potential to allow dramatic insights and advances in these fields, but the scope and sheer quantity of data associated with NGS raise challenges for their analysis, collection, exchange and storage. The 17th IHIW adopted a centralized approach to these issues, and we developed the tools, services and systems to create an effective system for capturing and managing these NGS data. We worked with NGS platform and software developers to define a set of distinct but equivalent NGS typing reports that record NGS data in a uniform fashion. The 17th IHIW database applied our standards, tools and services to collect, validate and store those structured, multi-platform data in an automated fashion. We have created community resources to enable exploration of the vast store of curated sequence and allele-name data in the IPD-IMGT/HLA Database, with the goal of creating a long-term community resource that integrates these curated data with new NGS sequence and polymorphism data, for advanced analyses and applications. Copyright © 2017 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
A web-based online collaboration platform for formulating engineering design projects
NASA Astrophysics Data System (ADS)
Varikuti, Sainath
Effective communication and collaboration among students, faculty and industrial sponsors play a vital role while formulating and solving engineering design projects. With the advent in the web technology, online platforms and systems have been proposed to facilitate interactions and collaboration among different stakeholders in the context of senior design projects. However, there are noticeable gaps in the literature with respect to understanding the effects of online collaboration platforms for formulating engineering design projects. Most of the existing literature is focused on exploring the utility of online platforms on activities after the problem is defined and teams are formed. Also, there is a lack of mechanisms and tools to guide the project formation phase in senior design projects, which makes it challenging for students and faculty to collaboratively develop and refine project ideas and to establish appropriate teams. In this thesis a web-based online collaboration platform is designed and implemented to share, discuss and obtain feedback on project ideas and to facilitate collaboration among students and faculty prior to the start of the semester. The goal of this thesis is to understand the impact of an online collaboration platform for formulating engineering design projects, and how a web-based online collaboration platform affects the amount of interactions among stakeholders during the early phases of design process. A survey measuring the amount of interactions among students and faculty is administered. Initial findings show a marked improvement in the students' ability to share project ideas and form teams with other students and faculty. Students found the online platform simple to use. The suggestions for improving the tool generally included features that were not necessarily design specific, indicating that the underlying concept of this collaborative platform provides a strong basis and can be extended for future online platforms. Although the platform was designed to promote collaboration, adoption of the collaborative platform by students and faculty has been slow. While the platform appears to be very useful for collaboration, more time is required for it to be widely used by all the stakeholders and to fully convert from email communication to the use of the online collaboration platform.
NASA Astrophysics Data System (ADS)
Pedrozo-Acuña, A.; Magos-Hernández, J. A.; Sánchez-Peralta, J. A.; Blanco-Figueroa, J.; Breña-Naranjo, J. A.
2017-12-01
This contribution presents a real-time system for issuing warnings of intense precipitation events during major storms, developed for Mexico City, Mexico. The system is based on high-temporal-resolution (Δt = 1 min) measurements of precipitation at 10 different points within the city, which report variables such as intensity, number of raindrops, raindrop size, kinetic energy, fall velocity, etc. Each of these stations comprises an optical disdrometer to measure the size and fall velocity of hydrometeors, a solar panel to guarantee an uninterrupted power supply, wireless broadband Internet access, and a resource-constrained device (a Raspberry Pi 3) for the processing, storage and sharing of the sensor data over the world wide web. The custom-developed platform follows a component-based system paradigm, allowing users to implement custom algorithms and models depending on application requirements. The system has been in place since July 2016, and continuous measurements of rainfall in real time are published over the internet through the webpage www.oh-iiunam.mx. Additionally, the developed platform for data collection and management interacts with the social network Twitter to enable real-time warnings of precipitation events. A key contribution of this development is the design and implementation of a scalable, easy-to-use, interoperable platform that facilitates the development of real-time precipitation sensor networks and warnings. The system is easy to implement and could be used as a prototype for systems in other regions of the world.
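A minimal sketch of the kind of threshold check such a station might run is given below, assuming an intensity threshold and a generic publishing callback; the station identifier, threshold value, and the actual Twitter integration are assumptions, not details taken from the paper.

```python
from datetime import datetime, timezone

INTENSITY_THRESHOLD_MM_H = 20.0  # assumed warning threshold, mm/h

def check_and_warn(station_id: str, intensity_mm_h: float, post) -> None:
    """Issue a warning through `post` (e.g. a wrapper around a Twitter client) when the threshold is exceeded."""
    if intensity_mm_h >= INTENSITY_THRESHOLD_MM_H:
        stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
        post(f"[{stamp}] Station {station_id}: intense rainfall "
             f"({intensity_mm_h:.1f} mm/h) detected. Take precautions.")

# Example with a stand-in publisher; the deployed platform pushes messages to Twitter.
check_and_warn("STATION-03", 27.4, post=print)
```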
Publishing Platform for Aerial Orthophoto Maps, the Complete Stack
NASA Astrophysics Data System (ADS)
Čepický, J.; Čapek, L.
2016-06-01
When creating a set of orthophoto maps from mosaic compositions using airborne systems such as popular drones, we need to publish the results of the work to users. Several steps need to be performed in order to get large-scale raster data published. As a first step, the data have to be shared as services (OGC WMS as a view service, OGC WCS as a download service). For some applications, OGC WMTS is handy as well, for faster viewing of the data. Finally, the data have to become part of a web mapping application, so that they can be used and evaluated by non-technical users. In this talk, we present an automated pipeline of those steps, in which the user supplies an orthophoto image and, as a result, OGC Open Web Services are published together with a web mapping application containing the data. The web mapping application can serve as a standard presentation platform for this type of big raster data for generic users. The publishing platform - the Geosense online map information system - can also be used to combine data from various resources, to create unique map compositions, and as input for better interpretation of the photographed phenomena. The whole process has been successfully tested with an eBee drone, with raster data resolution of 1.5-4 cm/px, in many areas, and the results are also used to create derived datasets, usually suited for property management - records of roads, pavements, traffic signs, public lighting, sewage systems, grave locations, and others.
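Once the orthophoto is exposed as an OGC WMS view service, any client can retrieve rendered map images over HTTP. The sketch below issues a WMS 1.3.0 GetMap request in Python; the service URL, layer name, bounding box and CRS are assumptions, not the Geosense platform's actual configuration.

```python
import requests

# Hypothetical WMS endpoint and layer published by a pipeline like the one described above.
WMS_URL = "https://example.org/geoserver/ows"
params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "drone:orthophoto_2016",
    "crs": "EPSG:3857",
    "bbox": "1835000,6305000,1836000,6306000",
    "width": 1024,
    "height": 1024,
    "format": "image/png",
}
resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("orthophoto_preview.png", "wb") as fh:
    fh.write(resp.content)  # rendered map tile for quick evaluation by non-technical users
```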
Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.
2008-07-30
As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user's familiar environment without preventing researchers from using traditional tools and methods. Our desire is to share these experiences to serve as an example for effectively analyzing data intensive large scale simulation data.
Silicon Era of Carbon-Based Life: Application of Genomics and Bioinformatics in Crop Stress Research
Li, Man-Wah; Qi, Xinpeng; Ni, Meng; Lam, Hon-Ming
2013-01-01
Abiotic and biotic stresses lead to massive reprogramming of different life processes and are the major limiting factors hampering crop productivity. Omics-based research platforms allow for a holistic and comprehensive survey on crop stress responses and hence may bring forth better crop improvement strategies. Since high-throughput approaches generate considerable amounts of data, bioinformatics tools will play an essential role in storing, retrieving, sharing, processing, and analyzing them. Genomic and functional genomic studies in crops still lag far behind similar studies in humans and other animals. In this review, we summarize some useful genomics and bioinformatics resources available to crop scientists. In addition, we also discuss the major challenges and advancements in the “-omics” studies, with an emphasis on their possible impacts on crop stress research and crop improvement. PMID:23759993
Prefixed-threshold real-time selection method in free-space quantum key distribution
NASA Astrophysics Data System (ADS)
Wang, Wenyuan; Xu, Feihu; Lo, Hoi-Kwong
2018-03-01
Free-space quantum key distribution allows two parties to share a random key with unconditional security, between ground stations, between mobile platforms, and even in satellite-ground quantum communications. Atmospheric turbulence causes fluctuations in transmittance, which further affect the quantum bit error rate and the secure key rate. Previous postselection methods to combat atmospheric turbulence require a threshold value determined after all quantum transmission. In contrast, here we propose a method where we predetermine the optimal threshold value even before quantum transmission. Therefore, the receiver can discard useless data immediately, thus greatly reducing data storage requirements and computing resources. Furthermore, our method can be applied to a variety of protocols, including, for example, not only single-photon BB84 but also asymptotic and finite-size decoy-state BB84, which can greatly increase its practicality.
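A minimal numerical sketch of threshold postselection on sampled transmittance values is given below; the log-normal turbulence model, the threshold value, and all other numbers are assumptions for illustration, not the optimization derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated per-slot channel transmittance under turbulence (log-normal is a common assumption).
transmittance = rng.lognormal(mean=np.log(0.1), sigma=0.6, size=100_000)
transmittance = np.clip(transmittance, 0.0, 1.0)

ETA_THRESHOLD = 0.08  # threshold fixed before transmission (value assumed)

kept = transmittance >= ETA_THRESHOLD  # slots below the threshold are discarded immediately
print(f"fraction of slots kept: {kept.mean():.3f}")
print(f"mean transmittance of kept slots: {transmittance[kept].mean():.3f} "
      f"vs overall {transmittance.mean():.3f}")
```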
A Virtual Hosting Environment for Distributed Online Gaming
NASA Astrophysics Data System (ADS)
Brossard, David; Prieto Martinez, Juan Luis
With enterprise boundaries becoming fuzzier, it’s become clear that businesses need to share resources, expose services, and interact in many different ways. In order to achieve such a distribution in a dynamic, flexible, and secure way, we have designed and implemented a virtual hosting environment (VHE) which aims at integrating business services across enterprise boundaries and virtualising the ICT environment within which these services operate in order to exploit economies of scale for the businesses as well as achieve shorter concept-to-market time scales. To illustrate the relevance of the VHE, we have applied it to the online gaming world. Online gaming is an early adopter of distributed computing and more than 30% of gaming developer companies, being aware of the shift, are focusing on developing high performance platforms for the new online trend.
Design and Implementation of a Secure Wireless Mote-Based Medical Sensor Network
Malasri, Kriangsiri; Wang, Lan
2009-01-01
A medical sensor network can wirelessly monitor vital signs of humans, making it useful for long-term health care without sacrificing patient comfort and mobility. For such a network to be viable, its design must protect data privacy and authenticity given that medical data are highly sensitive. We identify the unique security challenges of such a sensor network and propose a set of resource-efficient mechanisms to address these challenges. Our solution includes (1) a novel two-tier scheme for verifying the authenticity of patient data, (2) a secure key agreement protocol to set up shared keys between sensor nodes and base stations, and (3) symmetric encryption/decryption for protecting data confidentiality and integrity. We have implemented the proposed mechanisms on a wireless mote platform, and our results confirm their feasibility. PMID:22454585
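As a hedged illustration of the symmetric encryption/decryption step, the Python sketch below protects a sensor reading with the cryptography package's Fernet recipe; the payload fields are invented, and the key here is generated locally rather than derived from the mote/base-station key agreement protocol described above.

```python
import json
from cryptography.fernet import Fernet

# In a deployed network the key would come from the key agreement between mote and base station;
# here it is generated locally purely for illustration.
key = Fernet.generate_key()
cipher = Fernet(key)

reading = {"patient_id": "P-017", "hr_bpm": 72, "spo2_pct": 98, "t_s": 1700000000}
token = cipher.encrypt(json.dumps(reading).encode("utf-8"))   # confidentiality plus integrity tag
restored = json.loads(cipher.decrypt(token).decode("utf-8"))
assert restored == reading
```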
ERIC Educational Resources Information Center
National Library of Australia, Canberra.
The proceedings of this 1979 conference on library cooperation begin with proposals for the promotion of resource sharing among the national libraries of Asia and Oceania, the text of a policy statement on the role of national and international systems as approved at a 1976 meeting of directors of national libraries held in Lausanne, and a summary…
A national-scale authentication infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, R.; Engert, D.; Foster, I.
2000-12-01
Today, individuals and institutions in science and industry are increasingly forming virtual organizations to pool resources and tackle a common goal. Participants in virtual organizations commonly need to share resources such as data archives, computer cycles, and networks - resources usually available only with restrictions based on the requested resource's nature and the user's identity. Thus, any sharing mechanism must have the ability to authenticate the user's identity and determine if the user is authorized to request the resource. Virtual organizations tend to be fluid, however, so authentication mechanisms must be flexible and lightweight, allowing administrators to quickly establish and change resource-sharing arrangements. However, because virtual organizations complement rather than replace existing institutions, sharing mechanisms cannot change local policies and must allow individual institutions to maintain control over their own resources. Our group has created and deployed an authentication and authorization infrastructure that meets these requirements: the Grid Security Infrastructure. GSI offers secure single sign-ons and preserves site control over access policies and local security. It provides its own versions of common applications, such as FTP and remote login, and a programming interface for creating secure applications.
Bernhard, Gerda; Mahler, Cornelia; Seidling, Hanna Marita; Stützle, Marion; Ose, Dominik; Baudendistel, Ines; Wensing, Michel; Szecsenyi, Joachim
2018-03-27
Information technology tools such as shared patient-centered, Web-based medication platforms hold promise to support safe medication use by strengthening patient participation, enhancing patients' knowledge, helping patients to improve self-management of their medications, and improving communication on medications among patients and health care professionals (HCPs). However, the uptake of such platforms remains a challenge, also due to inadequate user involvement in the development process. Employing a user-centered design (UCD) approach is therefore critical to ensure that users' adoption is optimal. The purpose of this study was to identify what patients with type 2 diabetes mellitus (T2DM) and their HCPs regard as necessary requirements in terms of functionalities and usability of a shared patient-centered, Web-based medication platform for patients with T2DM. This qualitative study included focus groups with purposeful samples of patients with T2DM (n=25), general practitioners (n=13), and health care assistants (n=10) recruited from regional health care settings in southwestern Germany. In total, 8 semistructured focus groups were conducted. Sessions were audio- and video-recorded, transcribed verbatim, and subjected to a computer-aided qualitative content analysis. Appropriate security and access methods, supported data entry, printing, and sending information electronically, and tracking medication history were perceived as the essential functionalities. Although patients wanted automatic interaction checks and safety alerts, HCPs, in contrast, were concerned that unspecific alerts confuse patients and lead to nonadherence. Furthermore, HCPs were opposed to patients' ability to withhold or restrict access to information in the platform. To optimize usability, there was consensus among participants to display information in a structured, chronological format, to provide information in lay language, to use visual aids and customize information content, and align the platform to users' workflow. By employing a UCD, this study provides insight into the desired functionalities and usability of patients and HCPs regarding a shared patient-centered, Web-based medication platform, thus increasing the likelihood to achieve a functional and useful system. Substantial and ongoing engagement by all intended user groups is necessary to reconcile differences in requirements of patients and HCPs, especially regarding medication safety alerts and access control. Moreover, effective training of patients and HCPs on medication self-management (support) and optimal use of the tool will be a prerequisite to unfold the platform's full potential. ©Gerda Bernhard, Cornelia Mahler, Hanna Marita Seidling, Marion Stützle, Dominik Ose, Ines Baudendistel, Michel Wensing, Joachim Szecsenyi. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 27.03.2018.
CROPPER: a metagene creator resource for cross-platform and cross-species compendium studies.
Paananen, Jussi; Storvik, Markus; Wong, Garry
2006-09-22
Current genomic research methods provide researchers with enormous amounts of data. Combining data from different high-throughput research technologies commonly available in biological databases can lead to novel findings and increase research efficiency. However, combining data from different heterogeneous sources is often a very arduous task. These sources can be different microarray technology platforms, genomic databases, or experiments performed on various species. Our aim was to develop a software program that could facilitate the combining of data from heterogeneous sources, and thus allow researchers to perform genomic cross-platform/cross-species studies and to use existing experimental data for compendium studies. We have developed a web-based software resource, called CROPPER that uses the latest genomic information concerning different data identifiers and orthologous genes from the Ensembl database. CROPPER can be used to combine genomic data from different heterogeneous sources, allowing researchers to perform cross-platform/cross-species compendium studies without the need for complex computational tools or the requirement of setting up one's own in-house database. We also present an example of a simple cross-platform/cross-species compendium study based on publicly available Parkinson's disease data derived from different sources. CROPPER is a user-friendly and freely available web-based software resource that can be successfully used for cross-species/cross-platform compendium studies.
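The core operation CROPPER supports, mapping platform-specific identifiers onto a shared gene key before combining datasets, can be approximated with a pandas join. The sketch below is illustrative only; the probe identifiers, gene identifiers, and column names are invented.

```python
import pandas as pd

# Invented probe-to-gene mapping (e.g. derived from Ensembl identifier/orthologue tables)
# and two expression tables coming from different microarray platforms.
mapping = pd.DataFrame({
    "probe_id": ["A_23_P100001", "A_23_P200002", "1007_s_at", "1053_at"],
    "gene_key": ["GENE_A",       "GENE_B",       "GENE_A",    "GENE_B"],
})
platform_a = pd.DataFrame({"probe_id": ["A_23_P100001", "A_23_P200002"], "expr_a": [7.8, 3.2]})
platform_b = pd.DataFrame({"probe_id": ["1007_s_at", "1053_at"], "expr_b": [5.1, 6.4]})

# Translate each platform's probes to the shared gene key, then combine.
a = platform_a.merge(mapping, on="probe_id")[["gene_key", "expr_a"]]
b = platform_b.merge(mapping, on="probe_id")[["gene_key", "expr_b"]]
combined = a.merge(b, on="gene_key", how="outer")  # one row per gene across both platforms
print(combined)
```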
ExpressionDB: An open source platform for distributing genome-scale datasets.
Hughes, Laura D; Lewis, Scott A; Hughes, Michael E
2017-01-01
RNA-sequencing (RNA-seq) and microarrays are methods for measuring gene expression across the entire transcriptome. Recent advances have made these techniques practical and affordable for essentially any laboratory with experience in molecular biology. A variety of computational methods have been developed to decrease the amount of bioinformatics expertise necessary to analyze these data. Nevertheless, many barriers persist which discourage new labs from using functional genomics approaches. Since high-quality gene expression studies have enduring value as resources to the entire research community, it is of particular importance that small labs have the capacity to share their analyzed datasets with the research community. Here we introduce ExpressionDB, an open source platform for visualizing RNA-seq and microarray data accommodating virtually any number of different samples. ExpressionDB is based on Shiny, a customizable web application which allows data sharing locally and online with customizable code written in R. ExpressionDB allows intuitive searches based on gene symbols, descriptions, or gene ontology terms, and it includes tools for dynamically filtering results based on expression level, fold change, and false-discovery rates. Built-in visualization tools include heatmaps, volcano plots, and principal component analysis, ensuring streamlined and consistent visualization to all users. All of the scripts for building an ExpressionDB with user-supplied data are freely available on GitHub, and the Creative Commons license allows fully open customization by end-users. We estimate that a demo database can be created in under one hour with minimal programming experience, and that a new database with user-supplied expression data can be completed and online in less than one day.
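ExpressionDB itself is implemented with R/Shiny; as a language-neutral illustration of the dynamic filtering it describes (expression level, fold change, false-discovery rate), the following pandas sketch applies assumed cutoffs to an invented results table.

```python
import pandas as pd

# Invented differential-expression results; column names and cutoffs are assumptions.
results = pd.DataFrame({
    "gene": ["Arntl", "Dbp", "Gapdh", "Nr1d1"],
    "mean_expr": [55.2, 240.1, 9800.0, 130.4],
    "log2_fold_change": [2.1, 3.4, 0.05, -2.8],
    "fdr": [0.001, 0.0004, 0.91, 0.002],
})

MIN_EXPR, MIN_ABS_LFC, MAX_FDR = 10.0, 1.0, 0.05  # slider values a user might set
hits = results[
    (results["mean_expr"] >= MIN_EXPR)
    & (results["log2_fold_change"].abs() >= MIN_ABS_LFC)
    & (results["fdr"] <= MAX_FDR)
]
print(hits)
```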
Fryer, Ashley-Kay; Doty, Michelle M; Audet, Anne-Marie J
2011-03-01
Most Americans get their health care in small physician practices. Yet, small practice settings are often unable to provide the same range of services or participate in quality improvement initiatives as large practices because they lack the staff, information technology, and office systems. One promising strategy is to share clinical support services and information systems with other practices. New findings from the 2009 Commonwealth Fund International Health Policy Survey of Primary Care Physicians suggest smaller practices that share resources are more likely than those without shared resources to have advanced electronic medical records and health information technology, routinely track and manage patient information, have after-hours care arrangements, and engage in quality monitoring and benchmarking. This issue brief highlights strategies that can increase resources among small- and medium-sized practices and efforts supported by states, the private sector, and the Affordable Care Act that encourage the expansion of shared-resource models.
Medical students' use of Facebook for educational purposes.
Ali, Anam
2016-06-01
Medical students use Facebook to interact with one another both socially and educationally. This study investigates how medical students in a UK medical school use Facebook to support their learning. In particular, it identifies the nature of their educational activities, and details their experiences of using an educational Facebook group. Twenty-four medical students who self-identified as being Facebook users were invited to focus groups to attain a general overview of Facebook use within an educational context. A textual analysis was then conducted on a small group of intercalating medical students who used a self-created Facebook group to supplement their learning. Five of these students participated in semi-structured interviews. Six common themes were generated. These included 'collaborative learning', 'strategic uses for the preparation for assessment', 'sharing experiences and providing support', 'creating and maintaining connections', 'personal planning and practical organization' and 'sharing and evaluating educational resources'. Evidence from this study shows that medical students are using Facebook informally to enhance their learning and undergraduate lives. Facebook has enabled students to create a supportive learning community amongst their peers. Medical educators wishing to capitalize on Facebook, as a platform for formal educational initiatives, should remain cautious of intruding on this peer online learning community.
Nishimura, Toshihide; Kawamura, Takeshi; Sugihara, Yutaka; Bando, Yasuhiko; Sakamoto, Shigeru; Nomura, Masaharu; Ikeda, Norihiko; Ohira, Tatsuo; Fujimoto, Junichiro; Tojo, Hiromasa; Hamakubo, Takao; Kodama, Tatsuhiko; Andersson, Roland; Fehniger, Thomas E; Kato, Harubumi; Marko-Varga, György
2014-12-01
The Tokyo Medical University Hospital in Japan and the Lund University Hospital in Sweden have recently initiated a research program with the objective of impacting patient treatment through clinical disease stage characterization (phenotyping), utilizing proteomics sequencing platforms. By sharing clinical experiences, patient treatment principles, and biobank strategies, our respective clinical teams in Japan and Sweden will aid in the development of predictive and drug-related protein biomarkers. Data from joint lung cancer studies are presented in which protein expression from neuroendocrine lung cancer (LCNEC) phenotype patients can be separated from small cell (SCLC) and large cell lung cancer (LCC) patients by deep sequencing and spectral counting analysis. LCNEC, a subtype of large cell carcinoma (LCC), is characterized by neuroendocrine differentiation, which small cell lung carcinoma (SCLC) shares. Pre-therapeutic histological distinction between LCNEC and SCLC has so far been problematic, leading to adverse clinical outcomes. Establishing protein targets characteristic of LCNEC is helpful for deciding the optimal therapeutic strategy when diagnosing individual patients. Proteoform annotation and clinical biobanking are part of the HUPO initiative (http://www.hupo.org) within the chromosome 10 and chromosome 19 consortia.
Everware toolkit. Supporting reproducible science and challenge-driven education.
NASA Astrophysics Data System (ADS)
Ustyuzhanin, A.; Head, T.; Babuschkin, I.; Tiunov, A.
2017-10-01
Modern science clearly demands a higher level of reproducibility and collaboration. To make research fully reproducible, one has to take care of several aspects: research protocol description, data access, environment preservation, workflow pipeline, and analysis script preservation. Version control systems like git help with the workflow and analysis scripts part. Virtualization techniques like Docker or Vagrant can help deal with environments. Jupyter notebooks are a powerful platform for conducting research in a collaborative manner. We present project Everware, which seamlessly integrates git repository management systems such as GitHub or GitLab, Docker, and Jupyter, helping with a) sharing the results of real research and b) boosting education activities. With the help of Everware one can share not only the final artifacts of research but the full depth of the research process. This has been shown to be extremely helpful during the organization of several data analysis hackathons and machine learning schools. Using Everware, participants could start from an existing solution instead of starting from scratch, and could begin contributing immediately. Everware allows its users to make use of their own computational resources to run the workflows they are interested in, which leads to higher scalability of the toolkit.
ERIC Educational Resources Information Center
Tennessee Univ., Knoxville. Center for Literacy Studies.
The Arizona Adult Literacy and Technology Resource Center and the University of Tennessee's Center for Literacy Studies undertook a collaborative project to explore the feasibility and effectiveness of regional sharing of resources and expertise in field of adult education and literacy education. The project's goals were as follows: involve a…
Krause, Denise D.
2015-01-01
Background: There are a variety of challenges to developing strategies to improve access to health care, but access to data is critical for effective evidence-based decision-making. Many agencies and organizations throughout Mississippi have been collecting quality health data for many years. However, those data have historically resided in data silos and have not been readily shared. A strategy was developed to build and coordinate infrastructure, capacity, tools, and resources to facilitate health workforce and population health planning throughout the state. Objective: Realizing data as the foundation upon which to build, the primary objective was to develop the capacity to collect, store, maintain, visualize, and analyze data from a variety of disparate sources -- with the ultimate goal of improving access to health care. Specific aims were to: 1) build a centralized data repository and scalable informatics platform, 2) develop a data management solution for this platform and then, 3) derive value from this platform by facilitating data visualization and analysis. Methods: A managed data lake was designed and constructed for health data from disparate sources throughout the state of Mississippi. A data management application was developed to log and track all data sources, maps and geographies, and data marts. With this informatics platform as a foundation, a variety of tools are used to visualize and analyze data. To illustrate, a web mapping application was developed to examine the health workforce geographically and attractive data visualizations and dynamic dashboards were created to facilitate health planning and research. Results: Samples of data visualizations that aim to inform health planners and policymakers are presented. Many agencies and organizations throughout the state benefit from this platform. Conclusion: The overarching goal is that by providing timely, reliable information to stakeholders, Mississippians in general will experience improved access to quality care. PMID:26834938
Yang, Yaojin; Ahtinen, Aino; Lahteenmaki, Jaakko; Nyman, Petri; Paajanen, Henrik; Peltoniemi, Teijo; Quiroz, Carlos
2007-01-01
System integration is one of the major challenges in building wellbeing- or healthcare-related information systems. In this paper, we share our experiences in designing a service platform, the Nuadu service platform, for providing integrated services in occupational health promotion and health risk management through two heterogeneous systems. Our design aims for a light integration covering the layers from data through service up to presentation, while maintaining the integrity of the underlying systems.
30 CFR 280.73 - Will MMS share data and information with coastal States?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Data Requirements Protections § 280.73 Will MMS share data and information with coastal States? (a) We... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Will MMS share data and information with coastal States? 280.73 Section 280.73 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE...
30 CFR 580.73 - Will BOEM share data and information with coastal States?
Code of Federal Regulations, 2012 CFR
2012-07-01
... CONTINENTAL SHELF Data Requirements Protections § 580.73 Will BOEM share data and information with coastal... 30 Mineral Resources 2 2012-07-01 2012-07-01 false Will BOEM share data and information with coastal States? 580.73 Section 580.73 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF...
30 CFR 580.73 - Will BOEM share data and information with coastal States?
Code of Federal Regulations, 2014 CFR
2014-07-01
... CONTINENTAL SHELF Data Requirements Protections § 580.73 Will BOEM share data and information with coastal... 30 Mineral Resources 2 2014-07-01 2014-07-01 false Will BOEM share data and information with coastal States? 580.73 Section 580.73 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF...
ERIC Educational Resources Information Center
Ronnie, Mary; And Others
1980-01-01
Describes four library resource sharing projects in (1) New Zealand, (2) Papua New Guinea, (3) Australia, and (4) Fiji. Numerous shared services are discussed, including national bibliographies, publications exchanges, staff exchanges, clearing centers for duplicates, library planning, and national collections. (LLS)
30 CFR 580.73 - Will BOEM share data and information with coastal States?
Code of Federal Regulations, 2013 CFR
2013-07-01
... CONTINENTAL SHELF Data Requirements Protections § 580.73 Will BOEM share data and information with coastal... 30 Mineral Resources 2 2013-07-01 2013-07-01 false Will BOEM share data and information with coastal States? 580.73 Section 580.73 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF...
2006-09-01
Keywords: Control Force Agility; Shared Situational Awareness; Attentional Demand; Interoperability; Network Based Operations; Effect Based Operations; Speed of...Command; Self Synchronization; Reach Back; Reach Forward; Information Superiority; Increased Mission Effectiveness; Humansystems® Team Modelling. ...communication effectiveness and Distributed Mission Training (DMT) effectiveness. The NASA Ames Centre - Distributed Research Facilities platform could
Cancer Digital Slide Archive (CDSA) | Informatics Technology for Cancer Research (ITCR)
The CDSA is a web-based platform to support the sharing, management and analysis of digital pathology data. The Emory instance currently hosts over 23,000 images from The Cancer Genome Atlas, and the software is being developed within the ITCR grant to be deployable as a digital pathology platform for other labs and Cancer Institutes.
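As a hedged sketch of how a lab might query such a platform programmatically, the snippet below requests image metadata from a REST endpoint and pages through the results. The base URL, query parameters, and response fields are assumptions for illustration, not the CDSA's documented API.

```python
import requests

BASE_URL = "https://pathology.example.org/api/v1"  # placeholder, not the real CDSA endpoint

def list_slide_images(collection: str, limit: int = 50):
    """Yield image metadata records for a named collection, page by page."""
    offset = 0
    while True:
        resp = requests.get(
            f"{BASE_URL}/images",
            params={"collection": collection, "limit": limit, "offset": offset},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        yield from batch
        offset += limit

# Print the name and size of each image in a hypothetical "TCGA" collection.
for record in list_slide_images("TCGA"):
    print(record.get("name"), record.get("sizeBytes"))
```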
Information Services in New Zealand and the Pacific.
ERIC Educational Resources Information Center
Ronnie, Mary A.
This paper examines information services and resource sharing within New Zealand with a view to future participation in a Pacific resource sharing network. Activities of the National Library, the New Zealand Library Resources Committee, and the Information Services Committee are reviewed over a 40-year period, illustrating library cooperative…
Sharing Resources in the Small School.
ERIC Educational Resources Information Center
Uxer, John E.
Improved strategies for sharing resources are absolutely essential to the survival of small schools. Although not all, or even a major portion, of school programs should be provided by a cooperative delivery system, a discerning superintendent and board will mobilize every resource available to them in conducting their educational programs.…
Valuing Local Knowledge: Indigenous People and Intellectual Property Rights.
ERIC Educational Resources Information Center
Brush, Stephen B., Ed.; Stabinsky, Doreen, Ed.
Intellectual property enables individuals to gain financially from sharing unique and useful knowledge. Compensating indigenous people for sharing their knowledge and resources might both validate and be an equitable reward for indigenous knowledge of biological resources, and might promote the conservation of those resources. This book contains…
All inequality is not equal: children correct inequalities using resource value
Shaw, Alex; Olson, Kristina R.
2013-01-01
Fairness concerns guide children's judgments about how to share resources with others. However, it is unclear from past research if children take extant inequalities or the value of resources involved in an inequality into account when sharing with others; these questions are the focus of the current studies. In all experiments, children saw an inequality between two recipients—one had two more resources than another. What varied between conditions was the value of the resources that the child could subsequently distribute. When the resources were equal in value to those involved in the original inequality, children corrected the previous inequality by giving two resources to the child with fewer resources (Experiment 1). However, as the value of the resources increased relative to those initially shared by the experimenter, children were more likely to distribute the two high value resources equally between the two recipients, presumably to minimize the overall inequality in value (Experiments 1 and 2). We found that children specifically use value, not just size, when trying to equalize outcomes (Experiment 3) and further found that children focus on the relative rather than absolute value of the resources they share—when the experimenter had unequally distributed the same high value resource that the child would later share, children corrected the previous inequality by giving two high value resources to the person who had received fewer high value resources. These results illustrate that children attempt to correct past inequalities and try to maintain equality not just in the count of resources but also by using the value of resources. PMID:23882227
Siberian Platform: Geology and Natural Bitumen Resources
Meyer, Richard F.; Freeman, P.A.
2006-01-01
Summary: The Siberian platform is located between the Yenisey River on the west and the Lena River on the south and east. The Siberian platform is vast in size and inhospitable in its climate. This report is concerned principally with the setting, formation, and potential volumes of natural bitumen. In this report the volumes of maltha and asphalt referred to in the Russian literature are combined to represent natural bitumen. The generation of hydrocarbons and formation of hydrocarbon accumulations are discussed. The sedimentary basins of the platform are described in terms of the Klemme basin classification system and the conditions controlling formation of natural bitumen. Estimates of in-place bitumen resources are reviewed and evaluated. If the bitumen volume estimate is confined to parts of identified deposits where field observations have verified rock and bitumen grade values, the bitumen resource amounts to about 62 billion barrels of oil in place. However, estimates an order of magnitude larger can be obtained if additional speculative and unverified rock volumes and grade measures are included.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I submit an initial platform removal application and what must it include? 250.1726 Section 250.1726 Mineral Resources BUREAU OF OCEAN ENERGY... disposal plans; (d) Plans to protect marine life and the environment during decommissioning operations...
Drake II, Ronald M.; Hatch, Joseph R.; Schenk, Christopher J.; Charpentier, Ronald R.; Klett, Timothy R.; Le, Phuong A.; Leathers, Heidi M.; Brownfield, Michael E.; Gaswirth, Stephanie B.; Marra, Kristen R.; Pitman, Janet K.; Potter, Christopher J.; Tennyson, Marilyn E.
2015-09-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean volumes of undiscovered, technically recoverable resources of 463 million barrels of oil, 11.2 trillion cubic feet of gas, and 35 million barrels of natural gas liquids in the Cherokee Platform Province area of Kansas, Oklahoma, and Missouri.
NASA Astrophysics Data System (ADS)
Lenhardt, W. C.; Krishnamurthy, A.; Blanton, B.; Conway, M.; Coposky, J.; Castillo, C.; Idaszak, R.
2017-12-01
An integrated science cyberinfrastructure platform is fast becoming a norm in science, particularly where access to distributed resources, compute, data management tools, and collaboration tools is available to the end-user scientist without the need to spin up these services on their own. These platforms carry various labels, ranging from data commons to science-as-a-service, and they tend to share common features, as outlined above. What tends to distinguish them, however, is their affinity for particular domains: NanoHub for nanomaterials, iPlant for plant biology, Hydroshare for hydrology, and so on. The challenge remains how to enable these platforms to be adopted more easily by other domains. This paper will provide an overview of RENCI's approach to creating a science platform that can be adopted more easily by new communities while also endeavoring to accelerate their research. At RENCI, we started with Hydroshare, but have now worked to generalize the methodology for application to other domains. This new effort is called xDCi, or cross-disciplinary Data CyberInfrastructure. We have adopted a broader approach to the challenge of domain adoption that includes two key elements in addition to the technology component. The first is how development is operationalized: RENCI implements a DevOps model of continuous development and deployment, which greatly increases the speed with which a new platform can come online and be refined to meet domain needs. DevOps also allows for migration over time, i.e., sustainability. The second element is a concierge model: in addition to the technical elements and the more responsive development process, RENCI supports domain adoption of the platform by providing a concierge service, that is, dedicated expertise in Information Technology, Sustainable Software, Data Science, and Sustainability. The success of the RENCI methodology is illustrated by the adoption of the approach, in conjunction with its release, by two domains: neurobiology and an advanced care planning information system. In addition to the overview of the approach, this paper will describe the existing integrations in the Earth and environmental science domains as well as illustrations of how the technology may be adopted for other related research.
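A minimal sketch of the continuous-deployment loop implied by such a DevOps model might look like the following. The repository directory, compose setup, and health-check URL are placeholders, and a production deployment would add rollback, secrets handling, and notifications; this is not RENCI's actual tooling.

```python
import subprocess
import sys
import urllib.request

REPO_DIR = "xdci-platform"                      # placeholder checkout of the platform code
HEALTH_URL = "http://localhost:8000/healthz"    # placeholder health-check endpoint

def run(cmd):
    """Run a command inside the platform checkout, failing loudly on error."""
    subprocess.run(cmd, cwd=REPO_DIR, check=True)

# Pull the latest changes and rebuild/restart the affected services.
run(["git", "pull", "--ff-only"])
run(["docker", "compose", "build"])
run(["docker", "compose", "up", "-d"])

# Verify the redeployed platform answers its health check before declaring success.
try:
    with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
        ok = resp.status == 200
except OSError:
    ok = False

if not ok:
    print("health check failed; operators should roll back", file=sys.stderr)
    sys.exit(1)
print("deployment healthy")
```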
LMS Projects: A Platform for Intergenerational E-Learning Collaboration
ERIC Educational Resources Information Center
Lyashenko, Maria Sergeyevna; Frolova, Natalja Hidarovna
2014-01-01
Intergenerational learning (IGL) is the process of bringing seniors and juniors together in a collaborative space. Universities have been known to create a stimulating context for generations to share and acquire skills. The purpose of this paper is to present the results of research in the field of intergenerational learning and skills sharing.…
Creating Micro-Videos to Demonstrate Technology Learning
ERIC Educational Resources Information Center
Frydenberg, Mark; Andone, Diana
2016-01-01
Short videos, also known as micro-videos, have emerged as a platform for sharing ideas, experiences, and life events on online social networks. This paper shares preliminary results of a study involving students from two universities who created six-second videos using the Vine mobile app to explain or illustrate technology concepts. An analysis…
Quigley, Matthew; Dillon, Michael P; Fatone, Stefania
2018-02-01
Shared decision making is a consultative process designed to encourage patient participation in decision making by providing accurate information about the treatment options and supporting deliberation with clinicians about those options. The process can be supported by resources such as decision aids and discussion guides designed to inform and facilitate often difficult conversations. As this process increases in use, there is opportunity to raise awareness of shared decision making and the international standards used to guide the development of quality resources for use in areas of prosthetic/orthotic care. The aim of this article is to describe the process used to develop shared decision-making resources, using an illustrative example focused on decisions about the level of dysvascular partial foot amputation or transtibial amputation. Development process: The International Patient Decision Aid Standards were used to guide the development of the decision aid and discussion guide focused on decisions about the level of dysvascular partial foot amputation or transtibial amputation. Examples from these shared decision-making resources help illuminate the stages of development, including scoping and design, research synthesis, iterative development of a prototype, and preliminary testing with patients and clinicians not involved in the development process. Lessons learnt through the process, such as using the International Patient Decision Aid Standards checklist and development guidelines, may help inform others wanting to develop similar shared decision-making resources given the applicability of shared decision making to many areas of prosthetic-/orthotic-related practice. Clinical relevance: Shared decision making is a process designed to guide conversations that help patients make an informed decision about their healthcare. Raising awareness of shared decision making and the international standards for development of high-quality decision aids and discussion guides is important as the approach is introduced in prosthetic-/orthotic-related practice.
When Personal Tracking Becomes Social: Examining the Use of Instagram for Healthy Eating
Chung, Chia-Fang; Agapie, Elena; Schroeder, Jessica; Mishra, Sonali; Fogarty, James; Munson, Sean A.
2017-01-01
Many people appropriate social media and online communities in their pursuit of personal health goals, such as healthy eating or increased physical activity. However, people struggle with impression management, and with reaching the right audiences when they share health information on these platforms. Instagram, a popular photo-based social media platform, has attracted many people who post and share their food photos. We aim to inform the design of tools to support healthy behaviors by understanding how people appropriate Instagram to track and share food data, the benefits they obtain from doing so, and the challenges they encounter. We interviewed 16 women who consistently record and share what they eat on Instagram. Participants tracked to support themselves and others in their pursuit of healthy eating goals. They sought social support for their own tracking and healthy behaviors and strove to provide that support for others. People adapted their personal tracking practices to better receive and give this support. Applying these results to the design of health tracking tools has the potential to help people better access social support. PMID:28516174
Numerical cognition explains age-related changes in third-party fairness.
Chernyak, Nadia; Sandham, Beth; Harris, Paul L; Cordes, Sara
2016-10-01
Young children share fairly and expect others to do the same. Yet little is known about the underlying cognitive mechanisms that support fairness. We investigated whether children's numerical competencies are linked with their sharing behavior. Preschoolers (aged 2.5-5.5) participated in third-party resource allocation tasks in which they split a set of resources between 2 puppets. Children's numerical competence was assessed using the Give-N task (Sarnecka & Carey, 2008; Wynn, 1990). Numerical competence-specifically knowledge of the cardinal principle-explained age-related changes in fair sharing. Although many subset-knowers (those without knowledge of the cardinal principle) were still able to share fairly, they invoked turn-taking strategies and did not remember the number of resources they shared. These results suggest that numerical cognition serves as an important mechanism for fair sharing behavior, and that children employ different sharing strategies (division or turn-taking) depending on their numerical competence. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Does a House Divided Stand? Kinship and the Continuity of Shared Living Arrangements
Glick, Jennifer E.; Van Hook, Jennifer
2011-01-01
Shared living arrangements can provide housing, economies of scale, and other instrumental support and may become an important resource in times of economic constraint. But the extent to which such living arrangements experience continuity or rapid change in composition is unclear. Previous research on extended-family households tended to focus on factors that trigger the onset of coresidence, including life course events or changes in health status and related economic needs. Relying on longitudinal data from 9,932 households in the Survey of Income and Program Participation (SIPP), the analyses demonstrate that the distribution of economic resources in the household also influences the continuity of shared living arrangements. The results suggest that multigenerational households of parents and adult children experience greater continuity in composition when one individual or couple has a disproportionate share of the economic resources in the household. Other coresidential households, those shared by other kin or nonkin, experience greater continuity when resources are more evenly distributed. PMID:22259218
Decentralized Real-Time Scheduling
1990-08-01
must provide several alternative resource management policies, including FIFO and deadline queueing for shared resources that are not available. ...When demand exceeds the supply of shared resources (even within a single switch), some calls cannot be completed. In that case, a call’s priority...associated chiefly with the need to manage resources in a timely and decentralized fashion. The Alpha programming model permits the convenient expression of
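This fragment contrasts FIFO with deadline queueing for contended shared resources. The sketch below illustrates the difference with a simple earliest-deadline-first queue; it is a generic illustration, not the Alpha system's implementation.

```python
import heapq
from collections import deque

class FIFOQueue:
    """Grant a shared resource in arrival order."""
    def __init__(self):
        self._q = deque()
    def enqueue(self, request):
        self._q.append(request)
    def next_grant(self):
        return self._q.popleft()

class DeadlineQueue:
    """Grant the shared resource to the pending request with the earliest deadline."""
    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps ordering stable for equal deadlines
    def enqueue(self, request, deadline):
        heapq.heappush(self._heap, (deadline, self._counter, request))
        self._counter += 1
    def next_grant(self):
        return heapq.heappop(self._heap)[2]

# Two requests arrive in the order A then B, but B has the tighter deadline.
fifo, edf = FIFOQueue(), DeadlineQueue()
fifo.enqueue("call A")
fifo.enqueue("call B")
edf.enqueue("call A", deadline=50)
edf.enqueue("call B", deadline=10)
print(fifo.next_grant())  # call A (arrival order)
print(edf.next_grant())   # call B (earliest deadline)
```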
NASA Astrophysics Data System (ADS)
Gan, T.; Tarboton, D. G.; Dash, P. K.; Gichamo, T.; Horsburgh, J. S.
2017-12-01
Web-based apps, web services, and online data and model sharing technology are becoming increasingly available to support research. This promises benefits in terms of collaboration, platform independence, transparency, and reproducibility of modeling workflows and results. However, challenges still exist in the real application of these capabilities and in the programming skills researchers need to use them. In this research we combined hydrologic modeling web services with an online data and model sharing system to develop functionality to support reproducible hydrologic modeling work. We used HydroDS, a system that provides web services for input data preparation and execution of a snowmelt model, and HydroShare, a hydrologic information system that supports the sharing of hydrologic data, model and analysis tools. To make the web services easy to use, we developed a HydroShare app (based on the Tethys platform) to serve as a browser-based user interface for HydroDS. In this integration, HydroDS receives web requests from the HydroShare app to process the data and execute the model. HydroShare supports storage and sharing of the results generated by HydroDS web services. The snowmelt modeling example served as a use case to test and evaluate this approach. We show that, after the integration, users can prepare model inputs or execute the model through the web user interface of the HydroShare app without writing program code. The model input/output files and metadata describing the model instance are stored and shared in HydroShare. These files include a Python script that is automatically generated by the HydroShare app to document and reproduce the model input preparation workflow. Once stored in HydroShare, inputs and results can be shared with other users, or published so that other users can directly discover, repeat or modify the modeling work. This approach provides a collaborative environment that integrates hydrologic web services with a data and model sharing system to enable model development and execution. The entire system, comprising the HydroShare app, HydroShare, and HydroDS web services, is open source and contributes to the capability for web-based modeling research.
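A rough sketch of the kind of Python workflow script described here might call a HydroDS-style web service and then push the outputs to a HydroShare-style data sharing system. The endpoint URLs, parameter names, and resource fields below are assumptions for illustration rather than the documented HydroDS or HydroShare APIs.

```python
import requests

HYDRODS_URL = "https://hydrods.example.org/api/snowmelt"        # placeholder service endpoint
HYDROSHARE_URL = "https://hydroshare.example.org/hsapi/resource/"  # placeholder sharing endpoint
AUTH = ("username", "password")                                 # placeholder credentials

# 1. Ask the modeling web service to prepare inputs and run the snowmelt model
#    for a watershed and time window (parameter names are hypothetical).
job = requests.post(
    HYDRODS_URL,
    json={"watershed": "logan_river", "start": "2009-10-01", "end": "2010-06-30"},
    auth=AUTH,
    timeout=60,
).json()

# 2. Download the model output produced by the service (field name is hypothetical).
output = requests.get(job["output_url"], auth=AUTH, timeout=60)
with open("snowmelt_output.nc", "wb") as f:
    f.write(output.content)

# 3. Store the result as a new shared resource so collaborators can discover and reuse it.
with open("snowmelt_output.nc", "rb") as f:
    resp = requests.post(
        HYDROSHARE_URL,
        data={"title": "Snowmelt model output", "resource_type": "CompositeResource"},
        files={"file": f},
        auth=AUTH,
        timeout=120,
    )
print("created resource:", resp.status_code)
```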
Allocation of Resources to Collaborators and Free-Riders in 3-Year-Olds
ERIC Educational Resources Information Center
Melis, Alicia P.; Altrichter, Kristin; Tomasello, Michael
2013-01-01
Recent studies have shown that in situations where resources have been acquired collaboratively, children at around 3 years of age share mostly equally. We investigated 3-year-olds' sharing behavior with a collaborating partner and a free-riding partner who explicitly expressed her preference not to collaborate. Children shared more equally with…