Science.gov

Sample records for open source geospatial

  1. A Framework for an Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecasted to have enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have an increasing significance in the geospatial and IT arena as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  2. Free and Open Source Software for Geospatial in the field of planetary science

    NASA Astrophysics Data System (ADS)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of OpenData and data from collaborative mapping projects has increased interest in tools, procedures and methods to handle spatially-related information. Free and Open Source Software projects devoted to geospatial data handling are gaining considerable success, as the use of interoperable formats and protocols allows users to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software landscape to their specific problem. In particular, the Free and Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way for several institutions to interact. When it comes to planetary sciences, geospatial Free and Open Source Software is gaining a key role in projects that commonly involve different disciplines in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are being distributed along with the source code, and the interaction between user and developer is often very close, creating a continuum between these two roles. A widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free and Open Source GIS, open GIS formats and network protocols allow existing tools and methods developed to solve Earth-based problems to be extended to the study of solar system bodies. A day in the working life of a researcher using Free and Open Source Software for geospatial will be presented, as well as benefits and
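
    A minimal illustration of the GDAL support mentioned above, using GDAL's Python bindings to open a planetary raster product; the file name is hypothetical and a GDAL build with the PDS/VICAR drivers is assumed.

    ```python
    # Sketch: read basic metadata and band statistics from a planetary raster
    # product via GDAL. The file name below is a placeholder.
    from osgeo import gdal

    gdal.UseExceptions()
    dataset = gdal.Open("example_pds_product.img")  # hypothetical PDS/VICAR file
    print("size:", dataset.RasterXSize, "x", dataset.RasterYSize,
          "bands:", dataset.RasterCount)

    band = dataset.GetRasterBand(1)
    stats = band.GetStatistics(True, True)  # (approx_ok, force) -> min, max, mean, stddev
    print("min/max/mean/stddev:", stats)
    ```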

  3. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon Web Services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaborative geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming-language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services are running on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  4. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java* for EDUCATION

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Kuehnel, F.

    2006-12-01

    NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it for education, science, research, business, or government. The benefits to understanding of information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curricula. The National Guard uses World Wind for emergency response activities, and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  5. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java*

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Coughlan, J.

    2006-12-01

    NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it for education, science, research, business, or government. The benefits to understanding of information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curricula. The National Guard uses World Wind for emergency response activities, and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  6. Tools for open geospatial science

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Mitasova, H.

    2017-12-01

    Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, the scientists face the challenge of learning new unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For the novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and a lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.
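
    As a concrete illustration of the scripting approach described above, the sketch below calls GRASS GIS from Python through its grass.script interface; it assumes an already-initialized GRASS session and an existing raster named "elevation", both of which are illustrative assumptions rather than part of the course material.

    ```python
    # Sketch: a reproducible geoprocessing step scripted in Python inside an
    # active GRASS GIS session (assumed); raster names are placeholders.
    import grass.script as gs

    # Compute slope and aspect from the (assumed) elevation raster
    gs.run_command("r.slope.aspect", elevation="elevation",
                   slope="slope", aspect="aspect", overwrite=True)

    # Print machine-readable univariate statistics, e.g. for a notebook or CI check
    print(gs.read_command("r.univar", map="slope", flags="g"))
    ```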

  7. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    NASA Astrophysics Data System (ADS)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends, and spotting anomalies. Although there are a number of open-source spatial analysis libraries like geopandas and shapely available today, most of them have been restricted to manipulation and analysis of geometric objects with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform called Bluemix. Working in-database reduces network overhead, as the complete dataset need not be replicated into the user's local system; only a subset of the entire dataset is fetched into memory at any one time. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features, such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4. The basic architecture of the package consists of three main components: 1) a connection to dashDB represented by the IdaDataBase instance, which uses a middleware API, namely pypyodbc or jaydebeapi, to establish the database connection via
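
    A minimal sketch of the push-down workflow described above, using the IdaDataBase and IdaDataFrame classes from ibmdbpy; the DSN and table name are assumptions, and the spatial operations added by the ibmdbpy-spatial extension are omitted for brevity.

    ```python
    # Sketch: connect to dashDB and work with a table lazily; operations are
    # translated to SQL and executed in-database, so only small results are
    # pulled back locally. DSN and table name are placeholders.
    from ibmdbpy import IdaDataBase, IdaDataFrame

    idadb = IdaDataBase(dsn="DASHDB")                      # ODBC/JDBC connection
    counties = IdaDataFrame(idadb, "SAMPLES.GEO_COUNTY")   # hypothetical table

    print(counties.shape)    # row/column count computed in the database
    print(counties.head())   # only a handful of rows are fetched into memory
    idadb.close()
    ```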

  8. Building a multi-scaled geospatial temporal ecology database from disparate data sources: fostering open science and data reuse.

    PubMed

    Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km(2)). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated

  9. Building a multi-scaled geospatial temporal ecology database from disparate data sources: Fostering open science through data reuse

    USGS Publications Warehouse

    Soranno, Patricia A.; Bissell, E.G.; Cheruvelil, Kendra S.; Christel, Samuel T.; Collins, Sarah M.; Fergus, C. Emi; Filstrup, Christopher T.; Lapierre, Jean-Francois; Lotting, Noah R.; Oliver, Samantha K.; Scott, Caren E.; Smith, Nicole J.; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A.; Gries, Corinna; Henry, Emily N.; Skaff, Nick K.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km2). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated

  10. Open Technology Approaches to Geospatial Interface Design

    NASA Astrophysics Data System (ADS)

    Crevensten, B.; Simmons, D.; Alaska Satellite Facility

    2011-12-01

    What problems do you not want your software developers to be solving? Choosing open technologies across the entire stack of software development, from low-level shared libraries to high-level user interaction implementations, is a way to help ensure that customized software yields innovative and valuable tools for Earth scientists. This demonstration will review developments in web application technologies and the recurring patterns of interaction design regarding exploration and discovery of geospatial data through Vertex, ASF's data portal interface, a project utilizing current open web application standards and technologies including HTML5, jQueryUI, Backbone.js and the Jasmine unit testing framework.

  11. Deductive Coordination of Multiple Geospatial Knowledge Sources

    NASA Astrophysics Data System (ADS)

    Waldinger, R.; Reddy, M.; Culy, C.; Hobbs, J.; Jarvis, P.; Dungan, J. L.

    2002-12-01

    Deductive inference is applied to choreograph the cooperation of multiple knowledge sources to respond to geospatial queries. When no one source can provide an answer, the response may be deduced from pieces of the answer provided by many sources. Examples of sources include (1) The Alexandria Digital Library Gazetteer, a repository that gives the locations for almost six million place names, (2) The CIA World Factbook, an online almanac with basic information about more than 200 countries, (3) The SRI TerraVision 3D Terrain Visualization System, which displays a flight-simulator-like interactive display of geographic data held in a database, (4) The NASA GDACC WebGIS client for searching satellite and other geographic data available through OpenGIS Consortium (OGC) Web Map Servers, and (5) The Northern Arizona University Latitude/Longitude Distance Calculator. Queries are phrased in English and are translated into logical theorems by the Gemini Natural Language Parser. The theorems are proved by SNARK, a first-order-logic theorem prover, in the context of an axiomatic geospatial theory. The theory embodies a representational scheme that takes into account the fact that the same place may have many names, and the same name may refer to many places. SNARK has built-in procedures (RCC8 and the Allen calculus, respectively) for reasoning about spatial and temporal concepts. External knowledge sources may be consulted by SNARK as the proof is in progress, so that most knowledge need not be stored axiomatically. The Open Agent Architecture (OAA) facilitates communication between sources that may be implemented on different machines in different computer languages. An answer to the query, in the form of text or an image, is extracted from the proof. Currently, three-dimensional images are displayed by TerraVision but other displays are possible. The combined system is called Geo-Logica. Some example queries that can be handled by Geo-Logica include: (1) show the

  12. The Wildland Fire Emissions Information System: Providing information for carbon cycle studies with open source geospatial tools

    NASA Astrophysics Data System (ADS)

    French, N. H.; Erickson, T.; McKenzie, D.

    2008-12-01

    A major goal of the North American Carbon Program is to resolve uncertainties in understanding and managing the carbon cycle of North America. As carbon modeling tools become more comprehensive and spatially oriented, accurate datasets to spatially quantify carbon emissions from fire are needed, and these data resources need to be accessible to users for decision-making. Under a new NASA Carbon Cycle Science project, Drs. Nancy French and Tyler Erickson of Michigan Technological University's Michigan Tech Research Institute (MTRI) are teaming with specialists from the USDA Forest Service Fire and Environmental Research Applications (FERA) team to provide information for mapping fire-derived carbon emissions to users. The project focus includes development of a web-based system to provide spatially resolved fire emissions estimates for North America in a user-friendly environment. The web-based Decision Support System will be based on a variety of open source technologies. The Fuel Characteristic Classification System (FCCS) raster map of fuels and MODIS-derived burned area vector maps will be processed using the Geospatial Data Abstraction Library (GDAL) and the OGR Simple Features Library. Tabular and spatial project data will be stored in PostgreSQL/PostGIS, a spatially enabled relational database server. The browser-based user interface will be created using the Django web framework to allow user input for the decision support system. The OpenLayers mapping framework will be used to provide users with interactive maps within the browser. In addition, the data products will be made available in standard open data formats such as KML, to allow for easy integration into other spatial models and data systems.
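
    A small sketch of the kind of format conversion step described above (burned-area vector data exported to KML) using the GDAL/OGR Python bindings; the input and output file names are hypothetical.

    ```python
    # Sketch: convert a burned-area vector layer to KML with GDAL/OGR so it can
    # be loaded into other spatial tools. File names are placeholders.
    from osgeo import gdal

    gdal.UseExceptions()
    gdal.VectorTranslate("burned_area_2008.kml",        # output KML file
                         "modis_burned_area_2008.shp",  # hypothetical input shapefile
                         format="KML")
    ```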

  13. Delivery of Forecasted Atmospheric Ozone and Dust for the New Mexico Environmental Public Health Tracking System - An Open Source Geospatial Solution

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Sanchez-Silva, R.; Cavner, J. A.

    2010-12-01

    New Mexico's Environmental Public Health Tracking System (EPHTS), funded by the Centers for Disease Control (CDC) Environmental Public Health Tracking Network (EPHTN), aims to improve health awareness and services by linking health effects data with levels and frequency of environmental exposure. As a public health decision-support system, EPHTS systems include: state-of-the-art statistical analysis tools; geospatial visualization tools; data discovery, extraction, and delivery tools; and environmental/public health linkage information. As part of its mandate, EPHTS issues public health advisories and forecasts of environmental conditions that have consequences for human health. Through a NASA-funded partnership between the University of New Mexico and the University of Arizona, NASA Earth Science results are fused into two existing models (the Dust Regional Atmospheric Model (DREAM) and the Community Multiscale Air Quality (CMAQ) model) in order to improve forecasts of atmospheric dust, ozone, and aerosols. The results and products derived from the outputs of these models are made available to an Open Source mapping component of the New Mexico EPHTS. In particular, these products are integrated into a Django content management system using GeoDjango, GeoAlchemy, and other OGC-compliant geospatial libraries written in the Python and C++ programming languages. Capabilities of the resultant mapping system include indicator-based thematic mapping, data delivery, and analytical capabilities. DREAM and CMAQ outputs can be inspected, via REST calls, through temporal and spatial subsetting of the atmospheric concentration data across analytical units employed by the public health community. This paper describes details of the architecture and integration of NASA Earth Science into the EPHTS decision-support system.
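
    To make the GeoDjango integration described above concrete, the sketch below defines a hypothetical model for storing forecast output per analysis unit; the model and field names are illustrative assumptions, not the actual EPHTS schema.

    ```python
    # Sketch: a GeoDjango model for spatially and temporally subsettable forecast
    # output; requires a spatial database backend such as PostGIS. Names are
    # placeholders, not the EPHTS schema.
    from django.contrib.gis.db import models

    class OzoneForecast(models.Model):
        valid_time = models.DateTimeField()
        concentration_ppb = models.FloatField()
        analysis_unit = models.PolygonField(srid=4326)  # e.g. county or ZIP-code area
    ```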

  14. Audiovisual heritage preservation in Earth and Space Science Informatics: Videos from Free and Open Source Software for Geospatial (FOSS4G) conferences in the TIB|AV-Portal.

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Marín Arraiza, Paloma; Plank, Margret

    2016-04-01

    The influence of Free and Open Source Software (FOSS) projects on Earth and Space Science Informatics (ESSI) continues to grow, particularly in the emerging context of Data Science and Open Science. The scientific significance and heritage of FOSS projects is covered only to a limited extent by traditional scientific journal articles: audiovisual conference recordings contain significant information for analysis, reference and citation. In the context of data-driven research, this audiovisual content needs to be accessible through effective search capabilities, enabling the content to be searched in depth and retrieved. This also ensures that content producers receive credit for their efforts within their respective communities. For Geoinformatics and ESSI, one distinguished driver is the OSGeo Foundation (OSGeo), founded in 2006 to support and promote the interdisciplinary collaborative development of open geospatial technologies and data. The organisational structure is based on software projects that have successfully passed the OSGeo incubation process, proving their compliance with FOSS licence models. This quality assurance is crucial for transparent and unhindered application in (Open) Science. The main communication channels within and between the OSGeo-hosted community projects for face-to-face meetings are conferences on national, regional and global scales. Video recordings have complemented the scientific proceedings since 2006. During the last decade, the growing body of OSGeo videos has been negatively affected by content loss, obsolescence of video technology and dependence on commercial video portals. Even worse, the distributed storage and lack of metadata do not guarantee concise and efficient access to the content. This limits the retrospective analysis of video content from past conferences. But it also indicates a need for reliable, standardized, comparable audiovisual repositories for the future, as the number of OSGeo projects

  15. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
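
    The sketch below shows how a shared model published through WPS could be discovered from Python with OWSLib; the service URL and process identifier are hypothetical, and execution of the process is omitted.

    ```python
    # Sketch: discover WPS-published model services and inspect one process.
    # Endpoint URL and process identifier are placeholders.
    from owslib.wps import WebProcessingService

    wps = WebProcessingService("https://example.org/wps")
    wps.getcapabilities()
    for process in wps.processes:
        print(process.identifier, "-", process.title)

    description = wps.describeprocess("wetland_hydrology_model")  # hypothetical
    for wps_input in description.dataInputs:
        print("input:", wps_input.identifier)
    ```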

  16. Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from OGC

    NASA Astrophysics Data System (ADS)

    Percivall, George; Simonis, Ingo

    2016-06-01

    The necessity of open standards for effective sharing and use of remote sensing continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for the collaboration of geospatial data/solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research and industrial organizations, continues to advance the state of geospatial standards for full use of photogrammetry and remote sensing.

  17. Multi-source Geospatial Data Analysis with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
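
    A minimal sketch of the Earth Engine Python API usage pattern implied above (filter a catalog collection, compose a deferred computation, and trigger server-side evaluation); the asset ID and coordinates are illustrative assumptions, and authentication is assumed to be configured.

    ```python
    # Sketch: deferred, server-side processing of a public catalog collection.
    # The asset ID and coordinates are placeholders.
    import ee

    ee.Initialize()  # assumes prior Earth Engine authentication

    point = ee.Geometry.Point([-122.26, 37.87])
    collection = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")  # assumed asset ID
                  .filterBounds(point)
                  .filterDate("2014-01-01", "2014-12-31"))

    composite = collection.median()          # still a deferred, server-side object
    print(collection.size().getInfo())       # .getInfo() triggers evaluation
    ```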

  18. Increasing the value of geospatial informatics with open approaches for Big Data

    NASA Astrophysics Data System (ADS)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases: Collection and Ingest (remote sensed data processing; data stream processing); Prepare and Structure (SQL and NoSQL databases; data linking; feature identification); Analytics and Visualization (spatial-temporal analytics; machine learning; data exploration); and Modeling and Prediction (integrated environmental models; urban 4D models). Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following: Open Cloud Computing: avoid vendor lock-in through API interoperability and application portability. Open Source Extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial Data Representations: schemas to improve processing and analysis using geospatial concepts (Features, Coverages, DGGS); use geospatial encodings like NetCDF and GeoPackage. Big Linked Geodata: use linked data methods scaled to big geodata. Analysis Ready Data: support "Download as last resort" and "Analytics as a service"; promote elements common to "datacubes."

  19. Open-Source GIS

    SciTech Connect

    Vatsavai, Raju; Burk, Thomas E; Lime, Steve

    2012-01-01

    The components making up an Open Source GIS are explained in this chapter. A map server (Sect. 30.1) can broadly be defined as a software platform for dynamically generating spatially referenced digital map products. The University of Minnesota MapServer (UMN Map Server) is one such system. Its basic features are visualization, overlay, and query. Section 30.2 names and explains many of the geospatial open source libraries, such as GDAL and OGR. The other libraries are FDO, JTS, GEOS, JCS, MetaCRS, and GPSBabel. The application examples include derived GIS software and data format conversions. Quantum GIS, its origin and its applications, are explained in detail in Sect. 30.3. The features include a rich GUI, attribute tables, vector symbols, labeling, editing functions, projections, georeferencing, GPS support, analysis, and Web Map Server functionality. Future developments will address mobile applications, 3-D, and multithreading. The origins of PostgreSQL are outlined and PostGIS is discussed in detail in Sect. 30.4. It extends PostgreSQL by implementing the Simple Feature standard. Section 30.5 details the most important open source licenses such as the GPL, the LGPL, the MIT License, and the BSD License, as well as the role of the Creative Commons.
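
    As a small companion to the PostGIS discussion (Sect. 30.4), the sketch below queries Simple Feature geometries from Python with psycopg2; the connection parameters and the table are hypothetical.

    ```python
    # Sketch: query a PostGIS-enabled database for geometries and areas.
    # Connection parameters and the "parcels" table are placeholders.
    import psycopg2

    conn = psycopg2.connect(dbname="gisdb", user="gis", password="secret", host="localhost")
    cur = conn.cursor()
    cur.execute("""
        SELECT name,
               ST_AsText(geom),
               ST_Area(geom::geography) / 1e6 AS area_km2
        FROM parcels          -- hypothetical Simple Features table
        ORDER BY area_km2 DESC
        LIMIT 5;
    """)
    for name, wkt, area_km2 in cur.fetchall():
        print(name, round(area_km2, 2), "km2")
    cur.close()
    conn.close()
    ```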

  20. PlanetSense: A Real-time Streaming and Spatio-temporal Analytics Platform for Gathering Geo-spatial Intelligence from Open Source Data

    SciTech Connect

    Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O

    Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections only rely on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and to adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to it the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; and iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.

  1. Searching and exploitation of distributed geospatial data sources via the Naval Research Lab's Geospatial Information Database (GIDB) Portal System

    NASA Astrophysics Data System (ADS)

    McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.

    2005-05-01

    The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to now include an extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, thin client and a PDA client. As the GIDB Portal System has rapidly grown over the last two years (adding hundreds of geospatial sources), the obvious requirement has arisen to more effectively mine the interconnected sources in near real-time. How the GIDB Search addresses this issue is the prime focus of this paper.

  2. OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Koziol, B. W.; Rood, R. B.

    2011-12-01

    The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF, grib) while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
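
    The sketch below illustrates the underlying idea of clipping gridded climate data to a vector area of interest using netCDF4 and Shapely directly; it is not the OpenClimateGIS API, and the file name, variable names, and polygon are assumptions.

    ```python
    # Sketch: keep only grid cells whose centers fall inside an area of interest,
    # then summarize them. File/variable names and the polygon are placeholders.
    import numpy as np
    import netCDF4
    from shapely.geometry import Point, Polygon

    aoi = Polygon([(-90.0, 42.0), (-82.0, 42.0), (-82.0, 47.0), (-90.0, 47.0)])

    ds = netCDF4.Dataset("tas_example.nc")       # hypothetical CF-compliant file
    lats = ds.variables["lat"][:]
    lons = ds.variables["lon"][:]
    tas = ds.variables["tas"][0, :, :]           # first time step, shape (lat, lon)

    mask = np.array([[aoi.contains(Point(lon, lat)) for lon in lons] for lat in lats])
    print("area-of-interest mean:", tas[mask].mean())
    ds.close()
    ```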

  3. Open cyberGIS software for geospatial research and education in the big data era

    NASA Astrophysics Data System (ADS)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

    CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new-generation GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) to serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  4. Commercial observation satellites: broadening the sources of geospatial data

    NASA Astrophysics Data System (ADS)

    Baker, John C.; O'Connell, Kevin M.; Venzor, Jose A.

    2002-09-01

    Commercial observation satellites promise to broaden substantially the sources of imagery data available to potential users of geospatial data and related information products. We examine the new trend toward private firms acquiring and operating high-resolution imagery satellites. These commercial observation satellites build on the substantial experience in Earth observation operations provided by government-owned imaging satellites for civilian and military purposes. However, commercial satellites will require governments and companies to reconcile public and private interests in allowing broad public access to high-resolution satellite imagery data without creating national security risks or placing the private firms at a disadvantage compared with other providers of geospatial data.

  5. Brokered virtual hubs for facilitating access and use of geospatial Open Data

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Latre, Miguel; Kamali, Nargess; Brumana, Raffaella; Braumann, Stefan; Nativi, Stefano

    2016-04-01

    Open Data is a major trend in the current information technology scenario and is often publicised as one of the pillars of the information society in the near future. In particular, geospatial Open Data also have huge potential for the Earth Sciences, through the enablement of innovative applications and services integrating heterogeneous information. However, open does not mean usable. As was recognized at the very beginning of the Web revolution, many different degrees of openness exist: from simple sharing in a proprietary format to advanced sharing in standard formats including semantic information. Therefore, to fully unleash the potential of geospatial Open Data, advanced infrastructures are needed to increase the degree of data openness, enhancing their usability. In October 2014, the ENERGIC OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". The ENERGIC OD Virtual Hubs aim to facilitate the use of geospatial Open Data by lowering and possibly removing the main barriers which hamper geo-information (GI) usage by end-users and application developers. Data and services heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It forces end-users and developers to spend a lot of effort in accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer specific challenges and to address specific use-cases. Thus

  6. OpenSearch technology for geospatial resources discovery

    NASA Astrophysics Data System (ADS)

    Papeschi, Fabrizio; Boldrini, Enrico; Mazzetti, Paolo

    2010-05-01

    In 2005, the term Web 2.0 was coined by Tim O'Reilly to describe a quickly growing set of Web-based applications that share a common philosophy of "mutually maximizing collective intelligence and added value for each participant by formalized and dynamic information sharing". Around this same period, OpenSearch, a new Web 2.0 technology, was developed. More properly, OpenSearch is a collection of technologies that allow publishing of search results in a format suitable for syndication and aggregation. It is a way for websites and search engines to publish search results in a standard and accessible format. Due to its strong impact on the way the Web is perceived by users and also due to its relevance for businesses, Web 2.0 has attracted the attention of both the mass media and the scientific community. This explosive growth in the popularity of Web 2.0 technologies like OpenSearch, and the practical application of Service Oriented Architecture (SOA), resulted in increased interest in the similarities, convergence, and potential synergy of these two concepts. SOA is considered the philosophy of encapsulating application logic in services with a uniformly defined interface and making these publicly available via discovery mechanisms. Service consumers may then retrieve these services, and compose and use them according to their current needs. A great degree of similarity between SOA and Web 2.0 may be leading to a convergence between the two paradigms. They also expose divergent elements, such as Web 2.0's support for human interaction in opposition to the typically machine-to-machine interaction of SOA. In line with these considerations, the Geospatial Information (GI) domain is also taking its first steps towards a new approach to data publishing and discovery, in particular taking advantage of OpenSearch technology. A specific GI niche is represented by the OGC Catalog Service for Web (CSW), which is part of the OGC Web Services (OWS) specifications suite, which provides a
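
    A minimal sketch of the OpenSearch interaction pattern mentioned above: fetch a description document, take its Atom URL template, and substitute the {searchTerms} parameter. The endpoint is hypothetical, and real templates usually carry more parameters.

    ```python
    # Sketch: OpenSearch description document -> templated query -> Atom results.
    # The endpoint URL is a placeholder.
    import requests
    import xml.etree.ElementTree as ET

    OSDD_URL = "https://example.org/opensearch/description.xml"  # hypothetical
    OS_NS = {"os": "http://a9.com/-/spec/opensearch/1.1/"}
    ATOM = "{http://www.w3.org/2005/Atom}"

    description = ET.fromstring(requests.get(OSDD_URL, timeout=30).content)
    template = description.find("os:Url[@type='application/atom+xml']", OS_NS).get("template")

    # Fill in the template's query parameters (simplified)
    query_url = template.replace("{searchTerms}", "landsat").replace("{startIndex?}", "1")
    results = ET.fromstring(requests.get(query_url, timeout=30).content)
    print(len(results.findall(ATOM + "entry")), "entries returned")
    ```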

  7. Open Source Software Development

    DTIC Science & Technology

    2011-01-01

    Software, 2002, 149(1), 3-17. 3. DiBona, C., Cooper, D., and Stone, M. (Eds.), Open Sources 2.0, 2005, O'Reilly Media, Sebastopol, CA. Also see C. DiBona, S. Ockman, and M. Stone (Eds.), Open Sources: Voices from the Open Source Revolution, 1999, O'Reilly Media, Sebastopol, CA. 4. Ducheneaut, N

  8. Assessing the socioeconomic impact and value of open geospatial information

    USGS Publications Warehouse

    Pearlman, Francoise; Pearlman, Jay; Bernknopf, Richard; Coote, Andrew; Craglia, Massimo; Friedl, Lawrence; Gallo, Jason; Hertzfeld, Henry; Jolly, Claire; Macauley, Molly K.; Shapiro, Carl; Smart, Alan

    2016-03-10

    The workshop included 68 participants coming from international organizations, the U.S. public and private sectors, nongovernmental organizations, and academia. Participants included policy makers and analysts, financial analysts, economists, information scientists, geospatial practitioners, and other discipline experts.

  9. Creating Open Source Conversation

    ERIC Educational Resources Information Center

    Sheehan, Kate

    2009-01-01

    Darien Library, where the author serves as head of knowledge and learning services, launched a new website on September 1, 2008. The website is built with Drupal, an open source content management system (CMS). In this article, the author describes how she and her colleagues overhauled the library's website to provide an open source content…

  10. Open Source Vision

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    Increasingly, colleges and universities are turning to open source as a way to meet their technology infrastructure and application needs. Open source has changed life for visionary CIOs and their campus communities nationwide. The author discusses what these technologists see as the benefits--and the considerations.

  11. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  12. WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets

    NASA Astrophysics Data System (ADS)

    Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.

    2014-08-01

    Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept where metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analyzed, and finally visualised through open service interfaces to produce user-driven analysis products for targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damage, assessment of volumetric changes in an open pit mine, and 3D city model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses are possible with the proposed approach and current technology; for instance, a landowner could easily assess the number of fallen trees within their property boundaries after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.

  13. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
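
    In the spirit of the WMS-based tool described above, the sketch below retrieves a forecast map image through the OGC Web Map Service interface using OWSLib; the service URL and layer name are hypothetical.

    ```python
    # Sketch: fetch a map image from a WMS endpoint. URL and layer name are
    # placeholders, not the Mission Support System's actual service.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/forecast/wms", version="1.1.1")
    response = wms.getmap(layers=["temperature_850hPa"],     # hypothetical layer
                          srs="EPSG:4326",
                          bbox=(-30.0, 30.0, 40.0, 70.0),    # lon/lat extent
                          size=(800, 600),
                          format="image/png",
                          transparent=True)
    with open("forecast_850hPa.png", "wb") as out:
        out.write(response.read())
    ```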

  14. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  15. Open-Source Colorimeter

    PubMed Central

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032

  16. Open Source in Education

    ERIC Educational Resources Information Center

    Lakhan, Shaheen E.; Jhunjhunwala, Kavita

    2008-01-01

    Educational institutions have rushed to put their academic resources and services online, beginning the global community onto a common platform and awakening the interest of investors. Despite continuing technical challenges, online education shows great promise. Open source software offers one approach to addressing the technical problems in…

  17. Evaluating Open Source Portals

    ERIC Educational Resources Information Center

    Goh, Dion; Luyt, Brendan; Chua, Alton; Yee, See-Yong; Poh, Kia-Ngoh; Ng, How-Yeu

    2008-01-01

    Portals have become indispensable for organizations of all types trying to establish themselves on the Web. Unfortunately, there have only been a few evaluative studies of portal software and even fewer of open source portal software. This study aims to add to the available literature in this important area by proposing and testing a checklist for…

  18. Open-source colorimeter.

    PubMed

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-04-19

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.

  19. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    The proposed validation goal of 0.9 in intra-class correlation coefficient was reached with the results of this study. With the obtained results we consider that the developed software (RombergLab) is a validated balance assessment software. The reliability of this software depends on the technical specifications of the force platform used. The objective was to develop and validate posturography software and share its source code in open source terms. Prospective non-randomized validation study: 20 consecutive adults underwent two balance assessment tests; six-condition posturography was performed using clinically approved software and a force platform, and the same conditions were measured using the newly developed open source software with a low-cost force platform. The intra-class correlation index of the sway area obtained from the center-of-pressure variations in both devices for the six conditions was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94). A Bland and Altman concordance plot was also obtained. The source code used to develop RombergLab was published in open source terms.

  20. Open Source, Openness, and Higher Education

    ERIC Educational Resources Information Center

    Wiley, David

    2006-01-01

    In this article David Wiley provides an overview of how the general expansion of open source software has affected the world of education in particular. In doing so, Wiley not only addresses the development of open source software applications for teachers and administrators, he also discusses how the fundamental philosophy of the open source…

  1. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    NASA Astrophysics Data System (ADS)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of the construction of a geo portal, which is designed to provide online geoinformation services for government, enterprise and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of acquiring data. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo portal, providing online geoinformation services over the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data come from these nodes, and the different datasets are heterogeneous. Based on the analysis of these heterogeneous datasets, we first define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. The technical procedure is then worked out, and a method for processing different categories of features such as roads, railways, boundaries, rivers, settlements and buildings is proposed based on these principles. A case study in Jiangsu province demonstrated the applicability of the principles, procedure and method of multi-source geospatial data integration.

  2. How Is Open Source Special?

    ERIC Educational Resources Information Center

    Kapor, Mitchell

    2005-01-01

    Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…

  3. Open source clustering software.

    PubMed

    de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S

    2004-06-12

    We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
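
    As a minimal sketch of the k-means routine the library exposes, the following calls kcluster on a toy matrix through the Pycluster bindings (the same routines ship with Biopython as Bio.Cluster); the function and argument names follow the published Pycluster documentation and may differ slightly between versions.

      # Minimal sketch: k-means via the C Clustering Library's Python bindings.
      import numpy as np

      try:
          from Pycluster import kcluster          # standalone Pycluster package
      except ImportError:
          from Bio.Cluster import kcluster        # same routines via Biopython

      # Toy expression-like matrix: 6 items (rows) x 4 observations (columns).
      data = np.array([
          [1.1, 1.2, 0.9, 1.0],
          [1.0, 1.1, 1.0, 0.9],
          [5.0, 5.2, 4.9, 5.1],
          [5.1, 4.8, 5.0, 5.2],
          [9.0, 9.1, 8.9, 9.2],
          [8.8, 9.0, 9.1, 8.9],
      ])

      # Partition the rows into 3 clusters; npass repeats the search with random starts.
      clusterid, error, nfound = kcluster(data, nclusters=3, npass=10)
      print("cluster assignments:", list(clusterid))
      print("within-cluster error:", error, "| best solution found", nfound, "times")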

  4. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level (BTS/DOT), the state level (VDOT), and in industry (Intergraph). CEOSR develops WFS solutions using Intergraph software; relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project: 1) develops and deploys operational OGC WFS 1.1 interfaces at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) builds WFS services that return data conforming to the drafted ANSI/INCITS L1 standard (when available) for each identified theme, in the format given by OGC Geography Markup Language (GML) version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; and 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS
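
    To make the protocol concrete, the following is a minimal sketch of a WFS 1.1.0 GetFeature request sent as a Web "POST", in the general style the abstract describes. The endpoint URL and the feature type name ("roads") are placeholders for illustration only, not the project's actual services; the XML follows the generic WFS/Filter Encoding structure rather than any schema specific to this project.

      # Illustrative sketch: WFS 1.1.0 GetFeature over HTTP POST.
      # The endpoint and typeName are placeholders, not real services.
      import urllib.request

      WFS_ENDPOINT = "https://example.org/wfs"   # placeholder endpoint

      getfeature_xml = """<?xml version="1.0" encoding="UTF-8"?>
      <wfs:GetFeature service="WFS" version="1.1.0"
          xmlns:wfs="http://www.opengis.net/wfs"
          xmlns:ogc="http://www.opengis.net/ogc">
        <wfs:Query typeName="roads">
          <ogc:Filter>
            <ogc:BBOX>
              <ogc:PropertyName>geometry</ogc:PropertyName>
              <gml:Envelope xmlns:gml="http://www.opengis.net/gml" srsName="EPSG:4326">
                <gml:lowerCorner>-77.5 38.5</gml:lowerCorner>
                <gml:upperCorner>-77.0 39.0</gml:upperCorner>
              </gml:Envelope>
            </ogc:BBOX>
          </ogc:Filter>
        </wfs:Query>
      </wfs:GetFeature>"""

      req = urllib.request.Request(
          WFS_ENDPOINT,
          data=getfeature_xml.encode("utf-8"),
          headers={"Content-Type": "text/xml"},
      )
      with urllib.request.urlopen(req) as resp:
          gml = resp.read()          # a GML feature collection, per the abstract
      print(gml[:200])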

  5. Toward Open Science at the European Scale: Geospatial Semantic Array Programming for Integrated Environmental Modelling

    NASA Astrophysics Data System (ADS)

    de Rigo, Daniele; Corti, Paolo; Caudullo, Giovanni; McInerney, Daniel; Di Leo, Margherita; San-Miguel-Ayanz, Jesús

    2013-04-01

    Interfacing science and policy raises challenging issues when large spatial-scale (regional, continental, global) environmental problems need transdisciplinary integration within a context of modelling complexity and multiple sources of uncertainty [1]. This is characteristic of science-based support for environmental policy at European scale [1], and key aspects have also long been investigated by European Commission transnational research [2-5]. Approaches (either of computational science or of policy-making) suitable at a given domain-specific scale may not be appropriate for wide-scale transdisciplinary modelling for environment (WSTMe) and corresponding policy-making [6-10]. In WSTMe, the characteristic heterogeneity of available spatial information and the complexity of the required data-transformation modelling (D-TM) appeal for a paradigm shift in how computational science supports such peculiarly extensive integration processes. In particular, emerging wide-scale integration requirements of typical currently available domain-specific modelling strategies may include increased robustness and scalability along with enhanced transparency and reproducibility [11-15]. This challenging shift toward open data [16] and reproducible research [11] (open science) is also strongly suggested by the potential - sometimes neglected - huge impact of cascading effects of errors [1,14,17-19] within the impressively growing interconnection among domain-specific computational models and frameworks. From a computational science perspective, transdisciplinary approaches to integrated natural resources modelling and management (INRMM) [20] can exploit advanced geospatial modelling techniques with an awesome battery of free scientific software [21,22] for generating new information and knowledge from the plethora of composite data [23-26]. From the perspective

  6. An Open Source Tool to Test Interoperability

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, quality-controlled processed data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of these interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and management of errors. Testing these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time needed to develop new software is reduced. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within its Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test suite has more than 400 test assertions. Some of these assertions include conformance of HTTP responses; conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction to how to test via the OGC testing web site and
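
    The actual suites are written in CTL and TestNG; as a rough, hedged analogue only, the sketch below shows in Python the kind of assertions such a test makes against a WFS GetCapabilities response (correct HTTP status, well-formed XML, expected root element). The service URL is a placeholder, not an OGC or CITE endpoint.

      # Hedged illustration (not CTL/TestNG): compliance-style assertions
      # against a WFS GetCapabilities response. The endpoint is a placeholder.
      import urllib.request
      import xml.etree.ElementTree as ET

      url = ("https://example.org/wfs"            # placeholder service
             "?service=WFS&request=GetCapabilities&version=1.0.0")

      with urllib.request.urlopen(url) as resp:
          assert resp.status == 200, "HTTP response conformance failed"
          body = resp.read()

      root = ET.fromstring(body)                   # must be well-formed XML
      assert root.tag.endswith("WFS_Capabilities"), (
          "expected a WFS_Capabilities document, got %s" % root.tag)
      print("basic capabilities assertions passed")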

  7. Monitoring of In-Field Variability for Site Specific Crop Management Through Open Geospatial Information

    NASA Astrophysics Data System (ADS)

    Řezník, T.; Lukas, V.; Charvát, K.; Charvát, K., Jr.; Horáková, Š.; Křivánek, Z.; Herman, L.

    2016-06-01

    The agricultural sector is in a unique position due to its strategic importance around the world. It is crucial for both citizens (consumers) and the economy (both regional and global), which, ideally, should ensure that the whole sector is a network of interacting organisations. It is important to develop new tools, management methods, and applications to improve the management and logistic operations of agricultural producers (farms) and agricultural service providers. From a geospatial perspective, this involves identifying cost optimization pathways, reducing transport, reducing environmental loads, and improving the energy balance, while maintaining production levels, etc. This paper describes the benefits of, and open issues arising from, the development of the Open Farm Management Information System. Emphasis is placed on descriptions of available remote sensing and other geospatial data, and their harmonization, processing, and presentation to users. At the same time, the FOODIE platform also offers a novel approach to yield potential estimation. Validation on one farm of 1,284 hectares demonstrated a 70% success rate when comparing actual yield results with the results of a theoretical model of yield potential. The presented Open Farm Management Information System has already been successfully registered under Phase 8 of the Global Earth Observation System of Systems (GEOSS) Architecture Implementation Pilot in order to support the wide variety of demands that are primarily aimed at agriculture and water pollution monitoring by means of remote sensing.

  8. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  9. Benefits of using Open Geo-spatial Data for valorization of Cultural Heritage: GeoPan app

    NASA Astrophysics Data System (ADS)

    Cuca, Branka; Previtali, Mattia; Barazzetti, Luigi; Brumana, Raffaella

    2017-04-01

    Experts consider spatial data to be one of the categories of Public Sector Information (PSI) whose exchange is particularly important. At the same time, an initiative with a great vision such as the Digital Agenda for Europe emphasizes intelligent processing of information as an essential factor for tackling the challenges of contemporary society. In this context, Open Data are considered crucial in addressing environmental pressures, energy efficiency issues, land use and climate change, pollution and traffic management. Furthermore, Open Data are thought to have an important impact on more informed decision making and policy creation in multiple domains that could be addressed even through the "apps" on our smart devices. Activities performed in the ENERGIC OD project - "European NEtwork for Redistributing Geospatial Information to user Communities - Open Data" - have led to some first conclusions on the use and re-use of geo-spatial Open Data by means of Virtual Hubs, an innovative method for brokering geo-spatial information. This paper illustrates some main benefits of using open geo-spatial data for the valorisation of cultural heritage through the case of an innovative app called "GeoPan Atl@s". GeoPan, set in the dynamic policy context described above, aims to provide all information valuable for sustainable territorial development in a common platform, in particular material that concerns the history and changes of the cultural landscapes in the Lombardy region. Furthermore, this innovative app is used as a test-bed to facilitate and encourage a more active exchange and exploitation of open geo-spatial information for the valorisation of cultural heritage and landscapes. The aim of this practice is also to achieve a more active participation of experts, VGI communities and citizens, and a higher awareness of the multiple use-possibilities of historic and contemporary geo-spatial information for smarter decision making.

  10. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

    The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL with its geospatial extension PostGIS can be used below the three components mentioned above. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research which is targeted towards custom analytical tools, prototyping and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools that are available especially in GDAL and the Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object
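
    The Geoinformatica stack itself drives GDAL from Perl; purely as an illustration of the kind of raster access GDAL provides, the following minimal sketch uses GDAL's Python bindings instead. The input file name "dem.tif" is hypothetical.

      # Minimal sketch of GDAL raster access (Python bindings; "dem.tif" is hypothetical).
      from osgeo import gdal

      gdal.UseExceptions()

      ds = gdal.Open("dem.tif")                 # any of GDAL's many supported formats
      band = ds.GetRasterBand(1)
      arr = band.ReadAsArray()                  # raster values as a NumPy array

      print("size:", ds.RasterXSize, "x", ds.RasterYSize)
      print("geotransform:", ds.GetGeoTransform())
      print("min/max value:", float(arr.min()), float(arr.max()))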

  11. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address

  12. Hydrologic Geospatial Fabric as Community Cyberinfrastructure: International standardization best practices and the U.S. Open Water Data Initiative implementation.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.

    2016-12-01

    Recent prolonged droughts, catastrophic flooding, and the need to protect and restore aquatic ecosystems have increased the emphasis on information sharing in the water resources science and engineering domains. Internationally, the joint World Meteorological Organization (WMO) and Open Geospatial Consortium (OGC) Hydrology Domain Working Group (HDWG) has been working toward a comprehensive system of standards and best practices for the hydrology domain. In the U.S., the multi-agency-led and open-to-all Advisory Committee on Water Information (ACWI) was tasked with implementing an Open Water Data Initiative (OWDI) "that will integrate currently fragmented water information into a connected, national water data framework" [1]. The status of both will be presented, with a focus on a community hydrologic geospatial fabric. Standardization of hydrology observations data was the emphasis of the first 5 years of the HDWG. This work included WaterML 2.0 Part 1 (time series) and Part 2 (ratings and gagings). In 2016, the first of two new hydrographic feature models, GroundwaterML2, was completed and the second, for surface water features, was in active development. The WMO Commission for Hydrology is considering adoption of all these standards, and their adoption is central to the U.S. OWDI. OWDI participants have produced a special collection in the Journal of the American Water Resources Association, and several initiative working groups have concluded their activities. One early deliverable from the OWDI was a new, easier-to-use structure for the NHDPlus dataset. Building on this, a project to create a national Network Linked Data Index (NLDI) is being undertaken as an open-source community endeavor. The NLDI centralizes river network data, network navigation tools, crawlers that index data to the network, and utilities to register or remove data from the network. Research that informed the design of the NLDI will be presented along with recent development and findings of the project

  13. PLANNING QUALITY IN GEOSPATIAL PROJECTS

    EPA Science Inventory

    This presentation will briefly review some legal drivers and present a structure for the writing of geospatial Quality Assurance Project Plans. In addition, the Geospatial Quality Council's geospatial information life-cycle and sources-of-error flowchart will be reviewed.

  14. An Open Source Simulation System

    NASA Technical Reports Server (NTRS)

    Slack, Thomas

    2005-01-01

    An investigation into the current state of the art of open source real-time programming practices. This document covers what technologies are available, how easy it is to obtain, configure, and use them, and some performance measures made on the different systems. A matrix of vendors and their products is included as part of this investigation, but this is not an exhaustive list, and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches are investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, source-available proprietary software provided by the vendor for our evaluation.

  15. THE OPEN SOURCING OF EPANET

    EPA Science Inventory

    A proposal was made at the 2009 EWRI Congress in Kansas City, MO to establish an Open Source Project (OSP) for the widely used EPANET pipe network analysis program. This would be an ongoing collaborative effort among a group of geographically dispersed advisors and developers, wo...

  16. Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness

    ERIC Educational Resources Information Center

    Committee for Economic Development, 2006

    2006-01-01

    Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…

  17. AKM in Open Source Communities

    NASA Astrophysics Data System (ADS)

    Stamelos, Ioannis; Kakarontzas, George

    Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed in comparison with CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects. FLOSS projects simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.

  18. Open Access to Multi-Domain Collaborative Analysis of Geospatial Data Through the Internet

    NASA Astrophysics Data System (ADS)

    Turner, A.

    2009-12-01

    The internet has provided us with a high bandwidth, low latency, globally connected network in which to rapidly share realtime data from sensors, reports, and imagery. In addition, this data has become even easier to obtain, consume and analyze. Another aspect of the internet has been the increased approachability of complex systems through lightweight interfaces - with additional complex services able to provide more advanced connections into data services. These analyses and discussions have primarily been siloed within single domains, or kept out of the reach of amateur scientists and interested citizens. However, through more open access to analytical tools and data, experts can collaborate with citizens to gather information, provide interfaces for experimenting and querying results, and help make improved insights and feedback for further investigation. For example, farmers in Uganda are able to use their mobile phones to query, analyze, and be alerted to banana crop disease based on agricultural and climatological data. In the U.S., local groups use online social media sharing sites to gather data on storm-water runoff and stream siltation in order to alert wardens and environmental agencies. This talk will present various web-based geospatial visualization and analysis techniques and tools, such as Google Earth and GeoCommons, that have emerged to provide for collaboration between experts of various domains as well as between experts, government, and citizen scientists. Through increased communication and the sharing of data and tools, it is possible to gain broad insight and develop joint, working solutions to a variety of difficult scientific and policy related questions.

  19. Integrating Remote Sensing Data with Directional Two- Dimensional Wavelet Analysis and Open Geospatial Techniques for Efficient Disaster Monitoring and Management.

    PubMed

    Lin, Yun-Bin; Lin, Yu-Pin; Deng, Dong-Po; Chen, Kuan-Wei

    2008-02-19

    In Taiwan, earthquakes have long been recognized as a major cause of landslides, which are spread widely by the floods brought by the typhoons that follow. Distinguishing between landslide spatial patterns in different disturbance regimes is fundamental for disaster monitoring, management, and land-cover restoration. To circumscribe landslides, this study adopts the normalized difference vegetation index (NDVI), which can be determined by simply applying mathematical operations to near-infrared and visible-red spectral data immediately after remotely sensed data is acquired. In real-time disaster monitoring, the NDVI is more effective than using land-cover classifications generated from remotely sensed data, as land-cover classification tasks are extremely time consuming. Directional two-dimensional (2D) wavelet analysis has an advantage over traditional spectrum analysis in that it determines localized variations along a specific direction when identifying dominant modes of change, and where those modes are located in multi-temporal remotely sensed images. Open geospatial techniques comprise a series of solutions developed based on Open Geospatial Consortium specifications that can be applied to encode data for interoperability and to develop open geospatial services for sharing data. This study presents a novel approach and framework that uses directional 2D wavelet analysis of real-time NDVI images to effectively identify landslide patterns and share the resulting patterns via open geospatial techniques. As a case study, this study analyzed NDVI images derived from SPOT HRV images before and after the Chi-Chi earthquake (7.3 on the Richter scale) that hit the Chenyulan basin in Taiwan, as well as images after two large typhoons (Xangsane and Toraji), to delineate the spatial patterns of landslides caused by major disturbances. Disturbed spatial patterns of landslides that followed these events were successfully delineated using 2D wavelet analysis, and the results of pattern recognitions of landslides were
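
    As a minimal sketch of the per-pixel arithmetic the abstract refers to, NDVI = (NIR - Red) / (NIR + Red); the small arrays below are stand-ins for near-infrared and visible-red reflectance bands, not SPOT HRV data.

      # Sketch of the NDVI arithmetic: NDVI = (NIR - Red) / (NIR + Red), per pixel.
      import numpy as np

      nir = np.array([[0.45, 0.50], [0.30, 0.05]])   # near-infrared reflectance (toy values)
      red = np.array([[0.10, 0.12], [0.25, 0.04]])   # visible-red reflectance (toy values)

      ndvi = (nir - red) / (nir + red + 1e-12)        # small epsilon avoids divide-by-zero
      print(ndvi)   # near +1: dense vegetation; near 0 or negative: bare or disturbed ground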

  20. Newspaper archives + text mining = rich sources of historical geo-spatial data

    NASA Astrophysics Data System (ADS)

    Yzaguirre, A.; Smit, M.; Warren, R.

    2016-04-01

    Newspaper archives are rich sources of cultural, social, and historical information. These archives, even when digitized, are typically unstructured and organized by date rather than by subject or location, and require substantial manual effort to analyze. The effort of journalists to be accurate and precise means that there is often rich geo-spatial data embedded in the text, alongside text describing events that editors considered to be of sufficient importance to the region or the world to merit column inches. A regional newspaper can add over 100,000 articles to its database each year, and extracting information from this data for even a single country would pose a substantial Big Data challenge. In this paper, we describe a pilot study on the construction of a database of historical flood events (location(s), date, cause, magnitude) to be used in flood assessment projects, for example to calibrate models, estimate frequency, establish high water marks, or plan for future events in contexts ranging from urban planning to climate change adaptation. We then present a vision for extracting and using the rich geospatial data available in unstructured text archives, and suggest future avenues of research.
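
    As a hedged, toy-scale illustration of the kind of first-pass filtering that precedes proper text mining and geoparsing, the sketch below flags sentences mentioning flooding and pulls out simple date strings; the sample text and patterns are invented and much simpler than the pipeline the paper describes.

      # Toy sketch only: keyword filtering and date extraction from article text.
      import re

      article = ("Heavy rain on 14 March 1936 caused the river to overflow its banks "
                 "near the mill. The town fair continued despite the weather.")

      date_pattern = re.compile(r"\b\d{1,2}\s+\w+\s+\d{4}\b")
      flood_terms = re.compile(r"\b(flood\w*|overflow\w*|high water)\b", re.IGNORECASE)

      for sentence in re.split(r"(?<=[.!?])\s+", article):
          if flood_terms.search(sentence):
              print({"sentence": sentence, "dates": date_pattern.findall(sentence)})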

  1. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo

    2010-05-01

    integration on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in the INSPIRE Implementing Rules, the GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) projects, is widely spread in Europe and beyond, proving its high scalability, and it is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Therefore convergence between OWS and gLite technologies would be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, to achieve this harmonization there are some obstacles to overcome. Firstly, a semantics mismatch must be addressed: gLite handles low-level (e.g. close to the machine) concepts like "file", "data", "instruments", "job", etc., while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", "model", etc. Secondly, an architectural mismatch must be addressed: OWS implements a Web Service-Oriented Architecture which is stateless, synchronous and without embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture which is stateful, asynchronous (though not fully event-based) and with strong embedded security (based on the VO paradigm). In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWS. Just to mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of the OWS Phase 6 (OWS-6); (iii) several national, European and

  2. NASA's Earth Imagery Service as Open Source Software

    NASA Astrophysics Data System (ADS)

    De Cesare, C.; Alarcon, C.; Huang, T.; Roberts, J. T.; Rodriguez, J.; Cechini, M. F.; Boller, R. A.; Baynes, K.

    2016-12-01

    The NASA Global Imagery Browse Service (GIBS) is a software system that provides access to an archive of historical and near-real-time Earth imagery from NASA-supported satellite instruments. The imagery itself is open data, and is accessible via standards such as the Open Geospatial Consortium (OGC)'s Web Map Tile Service (WMTS) protocol. GIBS includes three core software projects: The Imagery Exchange (TIE), OnEarth, and the Meta Raster Format (MRF) project. These projects are developed using a variety of open source software, including: Apache HTTPD, GDAL, Mapserver, Grails, Zookeeper, Eclipse, Maven, git, and Apache Commons. TIE has recently been released for open source, and is now available on GitHub. OnEarth, MRF, and their sub-projects have been on GitHub since 2014, and the MRF project in particular receives many external contributions from the community. Our software has been successful beyond the scope of GIBS: the PO.DAAC State of the Ocean and COVERAGE visualization projects reuse components from OnEarth. The MRF source code has recently been incorporated into GDAL, which is a core library in many widely-used GIS software such as QGIS and GeoServer. This presentation will describe the challenges faced in incorporating open software and open data into GIBS, and also showcase GIBS as a platform on which scientists and the general public can build their own applications.
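
    Since the imagery is served over the OGC WMTS protocol, a short, hedged sketch of a client interaction follows: it fetches the WMTS capabilities document and lists a few layer identifiers. The endpoint URL reflects the public GIBS service as commonly documented and should be verified against current GIBS documentation before use.

      # Hedged sketch: list a few GIBS WMTS layers from the capabilities document.
      import urllib.request
      import xml.etree.ElementTree as ET

      CAPS_URL = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
                  "wmts.cgi?SERVICE=WMTS&REQUEST=GetCapabilities&VERSION=1.0.0")

      with urllib.request.urlopen(CAPS_URL) as resp:
          tree = ET.fromstring(resp.read())

      ns = {"ows": "http://www.opengis.net/ows/1.1",
            "wmts": "http://www.opengis.net/wmts/1.0"}

      layers = tree.findall(".//wmts:Layer/ows:Identifier", ns)
      print("first few layers:", [el.text for el in layers[:5]])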

  3. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and license, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  4. The HYPE Open Source Community

    NASA Astrophysics Data System (ADS)

    Strömbäck, L.; Pers, C.; Isberg, K.; Nyström, K.; Arheimer, B.

    2013-12-01

    The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model. It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided into up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes), considering turn-over and transformation on the way towards the sea. HYPE has been successfully used in many hydrological applications at SMHI. We currently have three different models for Europe: the S-HYPE model for Sweden, the BALT-HYPE model for the Baltic Sea, and the E-HYPE model for the whole of Europe. These models simulate hydrological conditions and nutrients for their respective areas and are used for characterization, forecasts, and scenario analyses. Model data can be downloaded from hypeweb.smhi.se. In addition, we provide models for the Arctic region, the Arab (Middle East and Northern Africa) region, India, the Niger River basin, and the La Plata Basin. This demonstrates the applicability of the HYPE model for large scale modeling in different regions of the world. An important goal of our work is to make our data and tools available as open data and services. To this end we created the HYPE Open Source Community (OSC), which makes the source code of HYPE available to anyone interested in further development of HYPE. The HYPE OSC (hype.sourceforge.net) is an open source initiative under the GNU Lesser General Public License taken by SMHI to strengthen international collaboration in hydrological modeling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. The code is transparent and can be changed

  5. International outreach for promoting open geoscience content in Finnish university libraries - libraries as the advocates of citizen science awareness on emerging open geospatial data repositories in Finnish society

    NASA Astrophysics Data System (ADS)

    Rousi, A. M.; Branch, B. D.; Kong, N.; Fosmire, M.

    2013-12-01

    In its Finnish National Spatial Strategy 2010-2015, Finland's Ministry of Agriculture and Forestry stated, among other things, that spatial data skills should support citizens' everyday activities and facilitate decision-making and the participation of citizens. Studies also predict that open data, and particularly open spatial data, would, when their potential is fully realized, create a 15% increase in the turnover of Finnish private-sector companies. Finnish libraries have a long tradition of serving at the heart of the Finnish information society. However, very few initiatives have been made to exploit the emerging possibilities of educating their users on open spatial data. The National Land Survey of Finland opened its data in 2012. Finnish technology university libraries, such as Aalto University Library, are open environments for all citizens and seem well suited to being among the first entities to educate citizens on open geospatial data. There are, however, many obstacles to overcome, such as lack of knowledge about policies, lack of understanding of geospatial data services and insufficient know-how of GIS software among the personnel. This framework examines the benefits derived from an international collaboration between Purdue University Libraries and Aalto University Library to create local strategies for implementing open spatial data education initiatives in Aalto University Library's context. The results of this international collaboration are explicated for the benefit of the field as a whole.

  6. Water sources and mixing in riparian wetlands revealed by tracers and geospatial analysis.

    PubMed

    Lessels, Jason S; Tetzlaff, Doerthe; Birkel, Christian; Dick, Jonathan; Soulsby, Chris

    2016-01-01

    Mixing of waters within riparian zones has been identified as an important influence on runoff generation and water quality. Improved understanding of the controls on the spatial and temporal variability of water sources and how they mix in riparian zones is therefore of both fundamental and applied interest. In this study, we have combined topographic indices derived from a high-resolution Digital Elevation Model (DEM) with repeated spatially high-resolution synoptic sampling of multiple tracers to investigate such dynamics of source water mixing. We use geostatistics to estimate concentrations of three different tracers (deuterium, alkalinity, and dissolved organic carbon) across an extended riparian zone in a headwater catchment in NE Scotland, to identify spatial and temporal influences on mixing of source waters. The various biogeochemical tracers and stable isotopes helped constrain the sources of runoff and their temporal dynamics. Results show that spatial variability in all three tracers was evident in all sampling campaigns, but more pronounced in warmer dryer periods. The extent of mixing areas within the riparian area reflected strong hydroclimatic controls and showed large degrees of expansion and contraction that was not strongly related to topographic indices. The integrated approach of using multiple tracers, geospatial statistics, and topographic analysis allowed us to classify three main riparian source areas and mixing zones. This study underlines the importance of the riparian zones for mixing soil water and groundwater and introduces a novel approach how this mixing can be quantified and the effect on the downstream chemistry be assessed.

  7. Open-source software: not quite endsville.

    PubMed

    Stahl, Matthew T

    2005-02-01

    Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.

  8. Introduction to geospatial semantics and technology workshop handbook

    USGS Publications Warehouse

    Varanka, Dalia E.

    2012-01-01

    The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.
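
    As a hedged illustration of the kind of semantic query such hands-on exercises involve, the sketch below builds a tiny in-memory RDF graph and runs a SPARQL query with rdflib; the namespace and triples are invented for illustration and are not The National Map's actual vocabulary.

      # Illustrative sketch: a tiny RDF graph and SPARQL query (rdflib).
      from rdflib import Graph, Literal, Namespace, RDF

      EX = Namespace("http://example.org/hydro#")   # hypothetical vocabulary

      g = Graph()
      g.add((EX.reach42, RDF.type, EX.StreamReach))
      g.add((EX.reach42, EX.name, Literal("Clear Creek")))
      g.add((EX.reach42, EX.flowsInto, EX.reach43))
      g.add((EX.reach43, RDF.type, EX.StreamReach))
      g.add((EX.reach43, EX.name, Literal("Muddy River")))

      query = """
      PREFIX ex: <http://example.org/hydro#>
      SELECT ?up ?down WHERE {
          ?upReach ex:flowsInto ?downReach .
          ?upReach ex:name ?up .
          ?downReach ex:name ?down .
      }"""

      for row in g.query(query):
          print(f"{row.up} flows into {row.down}")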

  9. The HYPE Open Source Community

    NASA Astrophysics Data System (ADS)

    Strömbäck, Lena; Arheimer, Berit; Pers, Charlotta; Isberg, Kristina

    2013-04-01

    The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model (Lindström et al., 2010). It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided into up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes) considering turn-over and transformation on the way towards the sea. In Sweden, the model is used by water authorities to fulfil the Water Framework Directive and the Marine Strategy Framework Directive. It is used for characterization, forecasts, and scenario analyses. Model data can be downloaded for free from three different HYPE applications: Europe (www.smhi.se/e-hype), Baltic Sea basin (www.smhi.se/balt-hype), and Sweden (vattenweb.smhi.se). The HYPE OSC (hype.sourceforge.net) is an open source initiative under the GNU Lesser General Public License taken by SMHI to strengthen international collaboration in hydrological modelling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. The code is transparent and can be changed and learnt from. New versions of the main code will be delivered frequently. The main objective of the HYPE OSC is to provide public access to a state-of-the-art operational hydrological model and to encourage hydrologic expertise from different parts of the world to contribute to model improvement. HYPE OSC is open to everyone interested in hydrology, hydrological modelling and code development - e.g. scientists, authorities, and consultancies. The HYPE Open Source Community was initiated in November 2011 by a kick-off and workshop with 50 eager participants

  10. ESMPy and OpenClimateGIS: Python Interfaces for High Performance Grid Remapping and Geospatial Dataset Manipulation

    NASA Astrophysics Data System (ADS)

    O'Kuinghttons, Ryan; Koziol, Benjamin; Oehmke, Robert; DeLuca, Cecelia; Theurich, Gerhard; Li, Peggy; Jacob, Joseph

    2016-04-01

    The Earth System Modeling Framework (ESMF) Python interface (ESMPy) supports analysis and visualization in Earth system modeling codes by providing access to a variety of tools for data manipulation. ESMPy started as a Python interface to the ESMF grid remapping package, which provides mature and robust high-performance and scalable grid remapping between 2D and 3D logically rectangular and unstructured grids and sets of unconnected data. ESMPy now also interfaces with OpenClimateGIS (OCGIS), a package that performs subsetting, reformatting, and computational operations on climate datasets. ESMPy exposes a subset of ESMF grid remapping utilities. This includes bilinear, finite element patch recovery, first-order conservative, and nearest neighbor grid remapping methods. There are also options to ignore unmapped destination points, mask points on source and destination grids, and provide grid structure in the polar regions. Grid remapping on the sphere takes place in 3D Cartesian space, so the pole problem is not an issue as it can be with other grid remapping software. Remapping can be done between any combination of 2D and 3D logically rectangular and unstructured grids with overlapping domains. Grid pairs where one side of the regridding is represented by an appropriate set of unconnected data points, as is commonly found with observational data streams, are also supported. There is a developing interoperability layer between ESMPy and OpenClimateGIS (OCGIS). OCGIS is a pure Python, open source package designed for geospatial manipulation, subsetting, and computation on climate datasets stored in local NetCDF files or accessible remotely via the OPeNDAP protocol. Interfacing with OCGIS has brought GIS-like functionality to ESMPy (i.e. subsetting, coordinate transformations) as well as additional file output formats (i.e. CSV, ESRI Shapefile). ESMPy is distinguished by its strong emphasis on open source, community governance, and distributed development. The user

  11. The Commercial Open Source Business Model

    NASA Astrophysics Data System (ADS)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  12. Openly Published Environmental Sensing (OPEnS) | Advancing Open-Source Research, Instrumentation, and Dissemination

    NASA Astrophysics Data System (ADS)

    Udell, C.; Selker, J. S.

    2017-12-01

    The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.

  13. Open Source Cloud-Based Technologies for Bim

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere, so they can view online 3D models using browsers. Nowadays, Cloud computing is increasingly engaged in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups involved in complicated Architectural, Engineering and Construction (AEC) projects. In addition, the development of Open Source Software (OSS) has been growing rapidly and its use is becoming more unified. Although BIM and Cloud technologies are widely known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort only open source tools are used, from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages and will be distributed freely to a large community of professionals for their use. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  14. The 2017 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year’s theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest. PMID:29118973

  15. The 2017 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year's theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest.

  16. Free for All: Open Source Software

    ERIC Educational Resources Information Center

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  17. Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape

    NASA Astrophysics Data System (ADS)

    Yu, T.; Xu, K.; Yuan, Z.

    2017-09-01

    Based on a geospatial analysis model, this paper analyzes the relationship between the source and sink landscape patterns in urban areas and atmospheric haze pollution. First, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into square grid cells with a side length of 6 km, and the class-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and the AOD of each cell are calculated. The source and sink landscapes of atmospheric haze pollution are then identified from the correlation between the landscape indices and AOD. Next, to make the subsequent analysis more efficient, the set of indices is reduced by examining the correlation coefficients among them. Finally, because the data exhibit spatial dependency and spatial heterogeneity, spatial autoregressive models and a geographically weighted regression (GWR) model are used to analyze the haze effect of source and sink landscapes at the global and local level. The results show that the source landscape of atmospheric haze pollution is the built-up (building) landscape, while the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the haze effect of source and sink landscapes. Comparing the models, the fit of the SLM, SEM and GWR models is significantly better than that of the OLS model, and the SLM model outperforms the SEM model in this study. Although the GWR model fits less well than the SLM, it expresses more clearly how the degree of influence of the explanatory factors varies across space. From these results the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could lower aerosol optical thickness, and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze
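    As an illustration of the OLS versus spatial-lag-model comparison described above, the following sketch uses the open source PySAL ecosystem (libpysal and spreg). The shapefile name, the AOD column and the landscape-index columns are placeholders, not the study's actual data or code.

```python
# Sketch of the OLS vs. spatial lag model (SLM) comparison described above,
# using open source PySAL components. File and column names are placeholders.
import geopandas as gpd
from libpysal.weights import Queen
from spreg import OLS, ML_Lag

grid = gpd.read_file("haze_grid_6km.shp")            # hypothetical 6 km grid cells
y = grid[["AOD"]].values                              # aerosol optical thickness per cell
X = grid[["PLAND", "PD", "COHESION"]].values          # selected landscape indices

w = Queen.from_dataframe(grid)                        # contiguity-based spatial weights
w.transform = "r"                                     # row-standardise the weights

ols = OLS(y, X, w=w, spat_diag=True,
          name_y="AOD", name_x=["PLAND", "PD", "COHESION"])
slm = ML_Lag(y, X, w=w,
             name_y="AOD", name_x=["PLAND", "PD", "COHESION"])

# Compare global fit; the paper reports the SLM fitting better than OLS.
print("OLS  R2       :", round(ols.r2, 3))
print("SLM  pseudo R2:", round(slm.pr2, 3))
print("SLM  rho (spatial lag coefficient):", round(float(slm.rho), 3))
```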

  18. Open-source hardware for medical devices.

    PubMed

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  19. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  20. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    NASA Technical Reports Server (NTRS)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
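    The federated search API mentioned above is exposed by each index node as a simple HTTP endpoint. The following sketch queries one node for CMIP5 datasets; the node URL and facet names are typical examples used for illustration, not a prescription from the paper.

```python
# Sketch of querying the ESGF federated search API (RESTful, Solr-backed).
# The node URL and search facets below are illustrative assumptions.
import requests

SEARCH_URL = "https://esgf-node.llnl.gov/esg-search/search"  # any ESGF index node

params = {
    "project": "CMIP5",          # example facet: project name
    "variable": "tas",           # near-surface air temperature
    "time_frequency": "mon",     # monthly means
    "type": "Dataset",
    "format": "application/solr+json",
    "limit": 5,
}

response = requests.get(SEARCH_URL, params=params, timeout=30)
response.raise_for_status()
docs = response.json()["response"]["docs"]

for doc in docs:
    # Each record carries the dataset id and the node hosting the data.
    print(doc["id"])
```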

  1. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data

    SciTech Connect

    Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca

    2013-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  2. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data

    SciTech Connect

    Cinquini, Luca; Crichton, Daniel; Miller, Neill

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  3. Open Genetic Code: on open source in the life sciences.

    PubMed

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question of whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  4. The 2016 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.

  5. The 2016 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science. PMID:27781083

  6. Interim Open Source Software (OSS) Policy

    EPA Pesticide Factsheets

    This interim Policy establishes a framework to implement the requirements of the Office of Management and Budget's (OMB) Federal Source Code Policy to achieve efficiency, transparency and innovation through reusable and open source software.

  7. 7 Questions to Ask Open Source Vendors

    ERIC Educational Resources Information Center

    Raths, David

    2012-01-01

    With their budgets under increasing pressure, many campus IT directors are considering open source projects for the first time. On the face of it, the savings can be significant. Commercial emergency-planning software can cost upward of six figures, for example, whereas the open source Kuali Ready might run as little as $15,000 per year when…

  8. Open Source 2010: Reflections on 2007

    ERIC Educational Resources Information Center

    Wheeler, Brad

    2007-01-01

    Colleges and universities and commercial firms have demonstrated great progress in realizing the vision proffered for "Open Source 2007," and 2010 will mark even greater progress. Although much work remains in refining open source for higher education applications, the signals are now clear: the collaborative development of software can provide…

  9. "Open-Sourcing" Personal Learning

    ERIC Educational Resources Information Center

    Fiedler, Sebastian H.D.

    2014-01-01

    This article offers a critical reflection on the contemporary Open Educational Resource (OER) movement, its unquestioned investment in a collective "content fetish" and an educational "problem description" that focuses on issues of scarcity, access, and availability of quality materials. It also argues that OER proponents fail…

  10. Identifying and characterizing major emission point sources as a basis for geospatial distribution of mercury emissions inventories

    NASA Astrophysics Data System (ADS)

    Steenhuisen, Frits; Wilson, Simon J.

    2015-07-01

    Mercury is a global pollutant that poses threats to ecosystem and human health. Due to its global transport, mercury contamination is found in regions of the Earth that are remote from major emissions areas, including the Polar regions. Global anthropogenic emission inventories identify important sectors and industries responsible for emissions at a national level; however, to be useful for air transport modelling, more precise information on the locations of emission is required. This paper describes the methodology applied, and the results of work that was conducted to assign anthropogenic mercury emissions to point sources as part of geospatial mapping of the 2010 global anthropogenic mercury emissions inventory prepared by AMAP/UNEP. Major point-source emission sectors addressed in this work account for about 850 tonnes of the emissions included in the 2010 inventory. This work allocated more than 90% of these emissions to some 4600 identified point source locations, including significantly more point source locations in Africa, Asia, Australia and South America than had been identified during previous work to geospatially-distribute the 2005 global inventory. The results demonstrate the utility and the limitations of using existing, mainly public domain resources to accomplish this work. Assumptions necessary to make use of selected online resources are discussed, as are artefacts that can arise when these assumptions are applied to assign (national-sector) emissions estimates to point sources in various countries and regions. Notwithstanding the limitations of the available information, the value of this procedure over alternative methods commonly used to geo-spatially distribute emissions, such as use of 'proxy' datasets to represent emissions patterns, is illustrated. Improvements in information that would facilitate greater use of these methods in future work to assign emissions to point-sources are discussed. These include improvements to both national
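    The paper describes assigning national-sector emission totals to identified point sources. A common, simple approach, shown here purely as an illustration with made-up numbers, is to distribute each national sector total across its known facilities in proportion to a surrogate such as plant capacity.

```python
# Illustrative example (fabricated numbers): distribute national-sector mercury
# emission totals across identified point sources in proportion to capacity.
import pandas as pd

# National emission estimates per country and sector (tonnes Hg/year).
national = pd.DataFrame({
    "country": ["A", "A", "B"],
    "sector":  ["cement", "coal_power", "coal_power"],
    "emission_t": [12.0, 30.0, 8.0],
})

# Identified point sources with a capacity surrogate and coordinates.
points = pd.DataFrame({
    "country":  ["A", "A", "A", "B", "B"],
    "sector":   ["cement", "coal_power", "coal_power", "coal_power", "coal_power"],
    "capacity": [1.0, 600.0, 1200.0, 300.0, 300.0],
    "lat":      [10.1, 10.5, 11.2, 48.3, 49.0],
    "lon":      [20.0, 21.3, 20.8,  2.1,  2.9],
})

# Share of each facility within its country/sector group.
points["share"] = points["capacity"] / points.groupby(
    ["country", "sector"])["capacity"].transform("sum")

# Join the national totals and allocate proportionally to each point source.
allocated = points.merge(national, on=["country", "sector"])
allocated["emission_t"] = allocated["emission_t"] * allocated["share"]

print(allocated[["country", "sector", "lat", "lon", "emission_t"]])
```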

  11. The Efficient Utilization of Open Source Information

    SciTech Connect

    Baty, Samuel R.

    These are a set of slides on the efficient utilization of open source information. Open source information consists of a vast set of information from a variety of sources. Not only does the quantity of open source information pose a problem, the quality of such information can hinder efforts. To show this, two case studies are mentioned: Iran and North Korea, in order to see how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information has no guarantee of accuracy. Open source information can provide keymore » insights either directly or indirectly: looking at supporting factors (flow of scientists, products and waste from mines, government budgets, etc.); direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows for a more complete picture to be formed. Overlapping sources allow for more precise bounds on times, weights, temperatures, yields or other issues of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual, but rather requires the utilization of a wide range of skill sets held by a team of people.« less

  12. GEOSPATIAL QA

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  13. Geospatial Engineering

    DTIC Science & Technology

    2017-02-22

    manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards

  14. A clinic compatible, open source electrophysiology system.

    PubMed

    Hermiz, John; Rogers, Nick; Kaestner, Erik; Ganji, Mehran; Cleary, Dan; Snider, Joseph; Barba, David; Dayeh, Shadi; Halgren, Eric; Gilja, Vikash

    2016-08-01

    Open source electrophysiology (ephys) recording systems have several advantages over commercial systems, such as customization and affordability, enabling more researchers to conduct ephys experiments. Notable open source ephys systems include Open-Ephys, NeuroRighter and, more recently, Willow, all of which have high channel counts (64+), scalability, and advanced software to develop on top of. However, little work has been done to build an open source ephys system that is clinic compatible, particularly in the operating room where acute human electrocorticography (ECoG) research is performed. We developed an affordable (under $10,000) and open system for research purposes that features power isolation for patient safety, compact and water-resistant enclosures, and 256 recording channels sampled at up to 20 ksamples/s with 16-bit resolution. The system was validated by recording ECoG with a high-density, thin-film device during an acute, awake craniotomy study at the UC San Diego Thornton Hospital operating room.

  15. Freeing Worldview's development process: Open source everything!

    NASA Astrophysics Data System (ADS)

    Gunnoe, T.

    2016-12-01

    Freeing your code and your project are important steps for creating an inviting environment for collaboration, with the added side effect of keeping a good relationship with your users. NASA Worldview's codebase was released with the open source NOSA (NASA Open Source Agreement) license in 2014, but this is only the first step. We also have to free our ideas, empower our users by involving them in the development process, and open channels that lead to the creation of a community project. There are many highly successful examples of Free and Open Source Software (FOSS) projects of which we can take note: the Linux kernel, Debian, GNOME, etc. These projects owe much of their success to having a passionate mix of developers/users with a great community and a common goal in mind. This presentation will describe the scope of this openness and how Worldview plans to move forward with a more community-inclusive approach.

  16. Open Source and Open Standard based decision support system: the example of lake Verbano floods management.

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Antonovic, Milan; Pozzoni, Maurizio; Graf, Andrea

    2015-04-01

    The Locarno area (Switzerland, Canton Ticino) is exposed to lake floods with a return period of about 7-8 years. The risk is of particular concern because the area lies in a floodplain that has seen, over recent decades, a large increase in settlement and in the value of real estate. Moreover, small differences in lake level can produce a significant increase in the flooded area because of the very low average slope of the terrain. While fatalities are generally not recorded, important economic costs arise, e.g. damage to real estate, interruption of activities, evacuation and relocation, and environmental damage. After major events registered in 1978, 1993, 2000, 2002 and 2014, local stakeholders invested time and money in setting up an up-to-date decision support system that allows for the reduction of risks. Thanks to impressive technological advances, the visionary concept of the Digital Earth (Gore 1992, 1998) is being realized: geospatial coverages and monitoring-system data are increasingly available on the Web and, more importantly, in standard formats. As a result, it is now possible to develop innovative decision support systems (Molinari et al. 2013) that mash up several information sources and offer dedicated features for evaluating risk scenarios. In line with this view, the authors have recently developed a new Web system designed according to the Service Oriented Architecture pattern. Open source software (e.g. GeoServer, PostGIS, OpenLayers) is used throughout the system, and open geospatial standards (e.g. SOS, WMS, WFS) are the pillars it relies on. SITGAP 2.0, implemented in collaboration with the Civil Protection of Locarno e Vallemaggia, combines a number of data sources such as the Federal Register of Buildings and Dwellings, the Cantonal Register of Residents, the cadastral survey, the cantonal hydro-meteorological monitoring observations, the MeteoSwiss weather forecasts, and
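    A system built on these open standards can be consumed with generic clients. As a minimal sketch, assuming a hypothetical service URL, offering and observed property rather than the actual SITGAP 2.0 endpoints, the code below issues a key-value-pair SOS GetObservation request for recent lake-level observations.

```python
# Minimal sketch of consuming an OGC Sensor Observation Service (SOS) with a
# key-value-pair GetObservation request; URL, offering and property are placeholders.
import requests

SOS_URL = "https://example.org/sos/monitoring"       # hypothetical SOS endpoint

params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "lake_stations",                      # placeholder offering name
    "observedProperty": "water:level",                # placeholder observed property
    "eventTime": "2014-11-10T00:00:00Z/2014-11-15T00:00:00Z",
    "responseFormat": 'text/xml;subtype="om/1.0.0"',
}

response = requests.get(SOS_URL, params=params, timeout=30)
response.raise_for_status()

# The response is an Observations & Measurements XML document with the requested
# time series; here we only report its size rather than parsing it fully.
print(len(response.content), "bytes of O&M XML received")
```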

  17. GIS-Based Noise Simulation Open Source Software: N-GNOIS

    NASA Astrophysics Data System (ADS)

    Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh

    2015-12-01

    Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate noise scenarios from point and mobile sources, considering the impact of geographical features and meteorological parameters. These are addressed in the software through attenuation modules for the atmosphere, vegetation and barriers. N-GNOIS is a user-friendly, platform-independent and Open Geospatial Consortium (OGC)-compliant software. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features such as the cumulative impact of point and mobile sources, building structures and honking due to traffic. Honking is a very common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to limit the propagation of noise, and acts as a decision-making tool for planning and managing the noise component of environmental impact assessment (EIA) studies.
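    The attenuation logic of N-GNOIS itself is not spelled out in the abstract. As a simplified stand-in, not the actual N-GNOIS algorithm, the sketch below computes the received level from a point source using spherical spreading plus user-supplied atmospheric, vegetation and barrier attenuation terms.

```python
# Simplified point-source noise propagation (NOT the actual N-GNOIS model):
# free-field spherical spreading plus lumped attenuation terms for atmosphere,
# vegetation and barriers. All numeric values are illustrative.
import math

def received_level(source_power_db: float, distance_m: float,
                   atm_db: float = 0.0, veg_db: float = 0.0,
                   barrier_db: float = 0.0) -> float:
    """Sound pressure level at a receiver, in dB.

    source_power_db : sound power level Lw of the point source (dB re 1 pW)
    distance_m      : source-receiver distance in metres
    *_db            : additional attenuation from atmosphere, vegetation, barriers
    """
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    spreading = 20.0 * math.log10(distance_m) + 11.0   # spherical divergence term
    return source_power_db - spreading - atm_db - veg_db - barrier_db

# Example: a 100 dB source heard 50 m away behind a 10 dB barrier.
print(round(received_level(100.0, 50.0, atm_db=0.5, veg_db=1.5, barrier_db=10.0), 1))
```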

  18. An Open Source Model for Open Access Journal Publication

    PubMed Central

    Blesius, Carl R.; Williams, Michael A.; Holzbach, Ana; Huntley, Arthur C.; Chueh, Henry

    2005-01-01

    We describe an electronic journal publication infrastructure that allows a flexible publication workflow, academic exchange around different forms of user submissions, and the exchange of articles between publishers and archives using a common XML based standard. This web-based application is implemented on a freely available open source software stack. This publication demonstrates the Dermatology Online Journal's use of the platform for non-biased independent open access publication. PMID:16779183

  19. Meteorological Error Budget Using Open Source Data

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7831, September 2016, US Army Research Laboratory: Meteorological Error Budget Using Open-Source Data, by J Cogan, J Smith, and P Haines.

  20. A GeoNode-Based Multiscale Platform For Management, Visualization And Integration Of DInSAR Data With Different Geospatial Information Sources

    NASA Astrophysics Data System (ADS)

    Buonanno, Sabatino; Fusco, Adele; Zeni, Giovanni; Manunta, Michele; Lanari, Riccardo

    2017-04-01

    This work describes the implementation of an efficient system for managing, viewing, analyzing and updating remotely sensed data, with special reference to Differential Interferometric Synthetic Aperture Radar (DInSAR) data. DInSAR products measure Earth surface deformation in both space and time, producing deformation maps and time series [1,2]. The use of these data in research or operational contexts requires tools that can handle temporal and spatial variability with high efficiency. To this end we present an implementation based on a Spatial Data Infrastructure (SDI) for data integration, management and interchange using standard protocols [3]. SDI tools typically provide access to static datasets and handle only spatial variability. In this paper we use the open source project GeoNode as a framework that extends SDI functionality to ingest DInSAR deformation maps and deformation time series very efficiently. GeoNode makes it possible to build a comprehensive, distributed infrastructure for remote sensing data management, analysis and integration that follows the standards of the Open Geospatial Consortium (OGC) [4,5]. We explain the methodology used to manage data complexity and data integration with GeoNode. The solution presented here for the ingestion of DInSAR products is a very promising starting point for future development of an OGC-compliant, semi-automatic remote sensing data processing chain. [1] Berardino, P., Fornaro, G., Lanari, R., & Sansosti, E. (2002). A new algorithm for surface deformation monitoring based on small baseline differential SAR interferograms. IEEE Transactions on Geoscience and Remote Sensing, 40(11), pp. 2375-2383. [2] Lanari, R., Casu, F., Manzo, M., Zeni, G., Berardino, P., Manunta, M., & Pepe, A. (2007). An overview of the Small Baseline Subset algorithm: a DInSAR technique for surface deformation analysis, Pure Appl. Geophys., 164

  1. Maps and the Geospatial Revolution: Teaching a Massive Open Online Course (MOOC) in Geography

    ERIC Educational Resources Information Center

    Robinson, Anthony C.; Kerski, Joseph; Long, Erin C.; Luo, Heng; DiBiase, David; Lee, Angela

    2015-01-01

    The massive open online course (MOOC) is a new approach for teaching online. MOOCs stand apart from traditional online classes in that they support thousands of learners through content and assessment mechanisms that can scale. A reason for their size is that MOOCs are free for anyone to take. Here we describe the design, development, and teaching…

  2. NCI's Distributed Geospatial Data Server

    NASA Astrophysics Data System (ADS)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch-processing the data and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. The system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under
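    To give a sense of the "flexible user-defined function" idea, the sketch below applies a simple aggregation, a temporal mean over a bounding box, to a gridded dataset with xarray. The file name, variable and coordinate names are placeholders; this is not NCI's actual service interface.

```python
# Sketch of a user-defined aggregation over gridded geospatial data using xarray.
# The file, variable and coordinate names are placeholders, not NCI's API.
import xarray as xr

def mean_over_box(path: str, var: str, lat_range, lon_range) -> xr.DataArray:
    """Temporal mean of `var` over a lat/lon bounding box."""
    ds = xr.open_dataset(path)
    subset = ds[var].sel(lat=slice(*lat_range), lon=slice(*lon_range))
    return subset.mean(dim="time")

if __name__ == "__main__":
    # Example call with a hypothetical monthly sea surface temperature file.
    result = mean_over_box("sst_monthly.nc", "sst",
                           lat_range=(-44.0, -10.0), lon_range=(110.0, 155.0))
    print(result)
```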

  3. Open source tools for fluorescent imaging.

    PubMed

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. Open source bioimage informatics for cell biology

    PubMed Central

    Swedlow, Jason R.; Eliceiri, Kevin W.

    2009-01-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes that make an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery. PMID:19833518

  5. Open source bioimage informatics for cell biology.

    PubMed

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes that make an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery.

  6. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by a high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment of the physicians involved and thus to reluctance in conducting multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multi-centre clinical trial. Using a design science research approach, we developed a web-based, multi-centre clinical trial management system on Drupal, an open source software framework distributed under the terms of the General Public License. The system was evaluated by user testing, has successfully supported several completed and ongoing clinical trials, and is available for free download. Open source clinical trial management systems are capable of supporting multi-centre clinical trials by enhancing efficiency, quality of data management and collaboration.

  7. Maximizing the use of EO products: how to leverage the potential of open geospatial service architectures

    NASA Astrophysics Data System (ADS)

    Usländer, Thomas

    2012-10-01

    The demand for the rapid provision of EO products with well-defined characteristics in terms of temporal, spatial, image-specific and thematic criteria is increasing. Examples are products to support near real-time damage assessment after a natural disaster event, e.g. an earthquake. However, beyond the organizational and economic questions, there are technological and systemic barriers to enabling a comfortable search, order, delivery or even combination of EO products. Most portals of space agencies and EO product providers require sophisticated satellite and product knowledge and, even worse, are all different and not interoperable. This paper gives an overview of the use cases and the architectural solutions that aim at an open and flexible EO mission infrastructure with application-oriented user interfaces and well-defined service interfaces based upon open standards. It presents corresponding international initiatives such as INSPIRE (Infrastructure for Spatial Information in the European Community), GMES (Global Monitoring for Environment and Security), GEOSS (Global Earth Observation System of Systems) and HMA (Heterogeneous Missions Accessibility) and their associated infrastructure approaches. The paper presents a corresponding analysis and design methodology and two examples of how such architectures are already successfully used in early warning systems for geo-hazards and toolsets for environmentally induced health risks. Finally, the paper concludes with an outlook on how these ideas relate to the vision of the Future Internet.

  8. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  9. An Evolving Worldview: Making Open Source Easy

    NASA Astrophysics Data System (ADS)

    Rice, Z.

    2017-12-01

    NASA Worldview is an interactive interface for browsing full-resolution, global satellite imagery. Worldview supports an open data policy so that academia, private industries and the general public can use NASA's satellite data to address Earth science related issues. Worldview was open sourced in 2014. By shifting to an open source approach, the Worldview application has evolved to better serve end-users. Project developers are able to have discussions with end-users and community developers to understand issues and develop new features. Community developers are able to track upcoming features, collaborate on them and make their own contributions. Developers who discover issues are able to address those issues and submit a fix. This reduces the time it takes for a project developer to reproduce an issue or develop a new feature. Getting new developers to contribute to the project has been one of the most important and difficult aspects of open sourcing Worldview. After witnessing potential outside contributors struggle, a focus has been made on making the installation of Worldview simple to reduce the initial learning curve and make contributing code easy. One way we have addressed this is through a simplified setup process. Our setup documentation includes a set of prerequisites and a set of straightforward commands to clone, configure, install and run. This presentation will emphasize our focus to simplify and standardize Worldview's open source code so that more people are able to contribute. The more people who contribute, the better the application will become over time.

  10. An Evolving Worldview: Making Open Source Easy

    NASA Technical Reports Server (NTRS)

    Rice, Zachary

    2017-01-01

    NASA Worldview is an interactive interface for browsing full-resolution, global satellite imagery. Worldview supports an open data policy so that academia, private industries and the general public can use NASA's satellite data to address Earth science related issues. Worldview was open sourced in 2014. By shifting to an open source approach, the Worldview application has evolved to better serve end-users. Project developers are able to have discussions with end-users and community developers to understand issues and develop new features. New developers are able to track upcoming features, collaborate on them and make their own contributions. Getting new developers to contribute to the project has been one of the most important and difficult aspects of open sourcing Worldview. A focus has been made on making the installation of Worldview simple to reduce the initial learning curve and make contributing code easy. One way we have addressed this is through a simplified setup process. Our setup documentation includes a set of prerequisites and a set of straightforward commands to clone, configure, install and run. This presentation will emphasize our focus on simplifying and standardizing Worldview's open source code so that more people are able to contribute. The more people who contribute, the better the application will become over time.

  11. OSIRIX: open source multimodality image navigation software

    NASA Astrophysics Data System (ADS)

    Rosset, Antoine; Pysher, Lance; Spadola, Luca; Ratib, Osman

    2005-04-01

    The goal of our project is to develop a completely new software platform that will allow users to efficiently and conveniently navigate through large sets of multidimensional data without the need for high-end expensive hardware or software. We also elected to develop our system on new open source software libraries, allowing other institutions and developers to contribute to this project. OsiriX is a free and open-source imaging software designed to manipulate and visualize large sets of medical images: http://homepage.mac.com/rossetantoine/osirix/

  12. Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on

    PubMed Central

    2011-01-01

    Background The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to development of many useful resources freely available to the chemistry community. PMID:21999342

  13. The SAMI2 Open Source Project

    NASA Astrophysics Data System (ADS)

    Huba, J. D.; Joyce, G.

    2001-05-01

    In the past decade, the Open Source Model for software development has gained popularity and has had numerous major achievements: emacs, Linux, the Gimp, and Python, to name a few. The basic idea is to provide the source code of the model or application, a tutorial on its use, and a feedback mechanism with the community so that the model can be tested, improved, and archived. Given the success of the Open Source Model, we believe it may prove valuable in the development of scientific research codes. With this in mind, we are `Open Sourcing' the low to mid-latitude ionospheric model that has recently been developed at the Naval Research Laboratory: SAMI2 (Sami2 is Another Model of the Ionosphere). The model is comprehensive and uses modern numerical techniques. The structure and design of SAMI2 make it relatively easy to understand and modify: the numerical algorithms are simple and direct, and the code is reasonably well-written. Furthermore, SAMI2 is designed to run on personal computers; prohibitive computational resources are not necessary, thereby making the model accessible and usable by virtually all researchers. For these reasons, SAMI2 is an excellent candidate to explore and test the open source modeling paradigm in space physics research. We will discuss various topics associated with this project. Research supported by the Office of Naval Research.

  14. Open Source, Open Standards, and Health Care Information Systems

    PubMed Central

    2011-01-01

    Recognition of the improvements in patient safety, quality of patient care, and efficiency that health care information systems have the potential to bring has led to significant investment. Globally the sale of health care information systems now represents a multibillion dollar industry. As policy makers, health care professionals, and patients, we have a responsibility to maximize the return on this investment. To this end we analyze alternative licensing and software development models, as well as the role of standards. We describe how licensing affects development. We argue for the superiority of open source licensing to promote safer, more effective health care information systems. We claim that open source licensing in health care information systems is essential to rational procurement strategy. PMID:21447469

  15. Open source, open standards, and health care information systems.

    PubMed

    Reynolds, Carl J; Wyatt, Jeremy C

    2011-02-17

    Recognition of the improvements in patient safety, quality of patient care, and efficiency that health care information systems have the potential to bring has led to significant investment. Globally the sale of health care information systems now represents a multibillion dollar industry. As policy makers, health care professionals, and patients, we have a responsibility to maximize the return on this investment. To this end we analyze alternative licensing and software development models, as well as the role of standards. We describe how licensing affects development. We argue for the superiority of open source licensing to promote safer, more effective health care information systems. We claim that open source licensing in health care information systems is essential to rational procurement strategy.

  16. Application of Open Source Technologies for Oceanographic Data Analysis

    NASA Astrophysics Data System (ADS)

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

    NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core of an oceanographic anomaly detection service and web portal, which we call OceanXtremes.
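    As a rough sketch of the tiled, horizontally scaled analysis style described above, the code below computes a spatial mean per time step over a collection of tiles with PySpark. The in-memory tile structure is invented for illustration and is not NEXUS's actual storage schema.

```python
# Rough sketch of chunk/tile-parallel time-series computation in PySpark.
# The in-memory "tiles" below stand in for NEXUS's actual tile store.
import numpy as np
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("tile-timeseries-sketch").getOrCreate()
sc = spark.sparkContext

# Each tile: (time_index, 2-D array of sea surface temperature values).
tiles = [(t, np.random.rand(16, 16)) for t in range(10) for _ in range(4)]

def tile_stats(tile):
    """Reduce a tile to (time, (sum, count)) so means can be combined later."""
    t, data = tile
    return t, (float(np.nansum(data)), int(np.isfinite(data).sum()))

means = (
    sc.parallelize(tiles)
      .map(tile_stats)
      .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))   # combine partial sums
      .mapValues(lambda sum_count: sum_count[0] / sum_count[1])  # sum / count = mean
      .sortByKey()
      .collect()
)

for t, mean in means:
    print(f"time step {t}: mean = {mean:.3f}")

spark.stop()
```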

  17. Open Source and These United States

    DTIC Science & Technology

    1999-04-01

    the ability of all participants to freely access the source code and keep abreast of progress. There can be no information hoarding on an open source... developed in this way depends upon ready and reliable communications. Just as the internet has increased the ability of people to exchange information...investment is maximized through long use and reuse. This process results in systems which harnesses the collaborative abilities of its user developers

  18. Web Server Security on Open Source Environments

    NASA Astrophysics Data System (ADS)

    Gkoutzelis, Dimitrios X.; Sardis, Manolis S.

    Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks in the hands of attackers. Until now this kind of defense was a privilege of the few: out-budgeted, low-cost solutions left the defender vulnerable to the rise of innovative attack methods. Luckily, the digital revolution of the past decade left its mark, changing the way we face security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never have imagined fifteen years ago. The online security of large corporations, the military and government bodies is more and more handled by open source applications, thus driving the technological trend of the 21st century towards adopting open solutions to e-commerce and privacy issues. This paper describes substantial security precautions for facing privacy and authentication issues in a totally open source web environment. Our goal is to state and face the most common problems in data handling and consequently propose the most appealing techniques to meet these challenges through an open solution.

  19. Geospatial Technology

    ERIC Educational Resources Information Center

    Reed, Philip A.; Ritz, John

    2004-01-01

    Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…

  20. Open Source Software and the Intellectual Commons.

    ERIC Educational Resources Information Center

    Dorman, David

    2002-01-01

    Discusses the Open Source Software method of software development and its relationship to control over information content. Topics include digital library resources; reference services; preservation; the legal and economic status of information; technical standards; access to digital data; control of information use; and copyright and patent laws.…

  1. Open source OCR framework using mobile devices

    NASA Astrophysics Data System (ADS)

    Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan

    2008-02-01

    Mobile phones have evolved from passive one-to-one communication devices to powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet and much more. Exciting new social applications are emerging on the mobile landscape, such as business card readers, sign detectors and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. However, with all these advancements we find very little open source software available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none are available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, built from existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community. This includes the popular open source OCR engine Tesseract for text detection and recognition, and the Flite speech synthesis module for adding text-to-speech ability.
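    The original framework runs Tesseract and Flite on the handset. As a desktop-side illustration of the same image-to-speech pipeline (the Python bindings and the pyttsx3 TTS engine here are stand-ins, not the mobile implementation described in the paper), the sketch below recognises text in an image with Tesseract and speaks it aloud.

```python
# Desktop illustration of the OCR-to-speech pipeline (image -> Tesseract -> TTS).
# pytesseract wraps the Tesseract engine; pyttsx3 stands in for Flite here.
from PIL import Image
import pytesseract
import pyttsx3

def read_image_aloud(image_path: str) -> str:
    """Recognise text in an image and speak it, returning the recognised text."""
    text = pytesseract.image_to_string(Image.open(image_path))
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()
    return text

if __name__ == "__main__":
    print(read_image_aloud("business_card.png"))   # hypothetical input image
```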

  2. Open-Source Syringe Pump Library

    PubMed Central

    Wijnen, Bas; Hunt, Emily J.; Anzalone, Gerald C.; Pearce, Joshua M.

    2014-01-01

    This article explores a new open-source method for developing and manufacturing high-quality scientific equipment suitable for use in virtually any laboratory. A syringe pump was designed using freely available open-source computer aided design (CAD) software and manufactured using an open-source RepRap 3-D printer and readily available parts. The design, bill of materials and assembly instructions are globally available to anyone wishing to use them. Details are provided covering the use of the CAD software and the RepRap 3-D printer. The use of an open-source Raspberry Pi computer as a wireless control device is also illustrated. Performance of the syringe pump was assessed and the methods used for assessment are detailed. The cost of the entire system, including the controller and web-based control interface, is on the order of 5% or less of what one would expect to pay for a commercial syringe pump having similar performance. The design should suit the needs of a given research activity requiring a syringe pump, including carefully controlled dosing of reagents, pharmaceuticals, and delivery of viscous 3-D printer media, among other applications. PMID:25229451

  3. Communal Resources in Open Source Software Development

    ERIC Educational Resources Information Center

    Spaeth, Sebastian; Haefliger, Stefan; von Krogh, Georg; Renzl, Birgit

    2008-01-01

    Introduction: Virtual communities play an important role in innovation. The paper focuses on the particular form of collective action in virtual communities underlying Open Source software development projects. Method: Building on resource mobilization theory and private-collective innovation, we propose a theory of collective action in…

  4. Of Birkenstocks and Wingtips: Open Source Licenses

    ERIC Educational Resources Information Center

    Gandel, Paul B.; Wheeler, Brad

    2005-01-01

    The notion of collaborating to create open source applications for higher education is rapidly gaining momentum. From course management systems to ERP financial systems, higher education institutions are working together to explore whether they can in fact build a better mousetrap. As Lois Brooks, of Stanford University, recently observed, the…

  5. The role of national and international geospatial data sources in the management of natural disasters

    NASA Astrophysics Data System (ADS)

    Kayi, A.; Erdogan, M.; Yilmaz, A.

    2014-11-01

    An earthquake occurred in Van City on 23 October 2011 at 13:41 local time. The local magnitude, moment magnitude and depth of the earthquake were Ml 6.7, Mw 7.0 and 19.07 km, respectively. The Van city centre and its surrounding villages were affected by this destructive earthquake. Many buildings were ruined and approximately 600 people died. The acquisition and use of geospatial data is crucial for the management of such natural disasters. In this paper, the role of national and international geospatial data in the management of the Van earthquake is investigated. Through international collaboration with the Charter, pre- and post-earthquake satellite images were acquired within 24 hours of the earthquake. The General Command of Mapping (GCM), the national mapping agency of Turkey, also produced high resolution multispectral orthophotos of the region. The Charter delivered the orthophotos between 26 and 28 October 2011. Immediately after the earthquake, with a quick reaction, GCM planned the flight over the 1296 km2 disaster area to acquire aerial photos. The aerial photos were acquired on 24 October 2011 (one day after the earthquake) with an UltraCamX large format digital aerial camera. 152 images were taken at 30 cm ground sample distance (GSD) with 30% sidelap and 60% overlap. On the evening of the same flight day, orthophotos were produced without ground control points by direct georeferencing, and GCM supplied the orthophotos to the disaster management authorities. Archive orthophotos at 45 cm GSD, acquired in 2010, were also used as a reference in order to assess the effects of the disaster. The subjects written here do not represent the ideas of the Turkish Armed Forces.

  6. OpenFOAM: Open source CFD in research and industry

    NASA Astrophysics Data System (ADS)

    Jasak, Hrvoje

    2009-12-01

    The current focus of development in industrial Computational Fluid Dynamics (CFD) is the integration of CFD into computer-aided product development, geometrical optimisation, robust design and similar. On the other hand, CFD research aims to extend the boundaries of practical engineering use into "non-traditional" areas. The requirements of computational flexibility and code integration are contradictory: a change of coding paradigm, with object orientation, library components, and equation mimicking, is proposed as a way forward. This paper describes OpenFOAM, a C++ object-oriented library for Computational Continuum Mechanics (CCM) developed by the author. Efficient and flexible implementation of complex physical models is achieved by mimicking the form of partial differential equations in software, with code functionality provided in library form. The Open Source deployment and development model allows the user to achieve the desired versatility in physical modelling without sacrificing complex geometry support and execution efficiency.

  7. Open Access, Open Source and Digital Libraries: A Current Trend in University Libraries around the World

    ERIC Educational Resources Information Center

    Krishnamurthy, M.

    2008-01-01

    Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…

  8. Computer Forensics Education - the Open Source Approach

    NASA Astrophysics Data System (ADS)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of open source software tools in computer forensics education at the tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain an in-depth understanding and appreciation of the computer forensic process, as opposed to familiarity with one software product, however complex and multi-functional. With access to all source programs, the students become more than just consumers of the tools: as future forensic investigators, they can also examine the code, understand the relationship between the binary images and the relevant data structures, and in the process gain the necessary background to become future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that, without exception, more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain the confidence to use a variety of tools, not just a single product they are familiar with.

  9. From open source communications to knowledge

    NASA Astrophysics Data System (ADS)

    Preece, Alun; Roberts, Colin; Rogers, David; Webberley, Will; Innes, Martin; Braines, Dave

    2016-05-01

    Rapid processing and exploitation of open source information, including social media sources, in order to shorten decision-making cycles, has emerged as an important issue in intelligence analysis in recent years. Through a series of case studies and natural experiments, focussed primarily upon policing and counter-terrorism scenarios, we have developed an approach to information foraging and framing to inform decision making, drawing upon open source intelligence, in particular Twitter, due to its real-time focus and frequent use as a carrier for links to other media. Our work uses a combination of natural language (NL) and controlled natural language (CNL) processing to support information collection from human sensors, linking and schematising of collected information, and the framing of situational pictures. We illustrate the approach through a series of vignettes, highlighting (1) how relatively lightweight and reusable knowledge models (schemas) can rapidly be developed to add context to collected social media data; (2) how information from open sources can be combined with reports from trusted observers, for corroboration or to identify conflicting information; and (3) how the approach supports users operating at or near the tactical edge, to rapidly task information collection and inform decision-making. The approach is supported by bespoke software tools for social media analytics and knowledge management.

  10. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    NASA Astrophysics Data System (ADS)

    Hogan, P.

    2009-12-01

    elements. World Wind therefore changed its mission from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating a single program, World Wind is a suite of components that can be selectively used in any number of programs. World Wind technology can be a part of any application, or it can be a window in a web page. Or it can be extended with additional functionalities by application and web developers. World Wind makes it possible to include virtual globe visualization and server technology in support of any objective. The world community can continually benefit from advances made in the technology by NASA in concert with the world community. 3. OPEN SOURCE AND OPEN STANDARDS NASA World Wind is NASA Open Source software. This means that the source code is fully accessible for anyone to freely use, even in association with proprietary technology. Imagery and other data provided by the World Wind servers reside in the public domain, including the data server technology itself. This allows others to deliver their own geospatial data and to provide custom solutions based on users specific needs.

  11. Building Energy Management Open Source Software

    SciTech Connect

    This is the repository for Building Energy Management Open Source Software (BEMOSS), which is an open source operating system that is engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. BEMOSS offers the following key features: (1) Open source, open architecture – BEMOSS is an open source operating system that is built upon VOLTTRON – a distributed agent platform developed by Pacific Northwest National Laboratory (PNNL). BEMOSS was designed to make it easy for hardware manufacturers to seamlessly interface their devices with BEMOSS. Software developers can also contribute to adding additional BEMOSS functionalities and applications. (2) Plug & play – BEMOSS was designed to automatically discover supported load controllers (including smart thermostats, VAV/RTUs, lighting load controllers and plug load controllers) in commercial buildings. (3) Interoperability – BEMOSS was designed to work with load control devices from different manufacturers that operate on different communication technologies and data exchange protocols. (4) Cost effectiveness – Implementation of BEMOSS is deemed to be cost-effective as it was built upon a robust open source platform that can operate on a low-cost single-board computer, such as Odroid. This feature could contribute to its rapid deployment in small- or medium-sized commercial buildings. (5) Scalability and ease of deployment – With its multi-node architecture, BEMOSS provides a distributed architecture where load controllers in a multi-floor and high occupancy building could be monitored and controlled by multiple single-board computers hosting BEMOSS. This makes it possible for a building engineer to deploy BEMOSS in one zone of a building, be comfortable with its operation, and later on expand the deployment to the entire building to make it more energy efficient. (6) Ability to provide local and remote monitoring – BEMOSS provides both local and remote

  12. Open-source tools for data mining.

    PubMed

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  13. The Emergence of Open-Source Software in China

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    The open-source software movement is gaining increasing momentum in China. Of the limited number of open-source software packages in China, "Red Flag Linux" stands out most strikingly, commanding a 30 percent share of the Chinese software market. Unlike the spontaneity of the open-source movement in North America, open-source software development in…

  14. New Open-Source Version of FLORIS Released | News | NREL

    Science.gov Websites

    New Open-Source Version of FLORIS Released (January 26, 2018). National Renewable Energy Laboratory (NREL) researchers recently released an updated open-source version of FLORIS that has been simplified and documented. Because of the living, open-source nature of the newly updated utility, NREL…

  15. Datacube Services in Action, Using Open Source and Open Standards

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2016-12-01

    Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source on www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on the server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use on databases individually holding dozens of Terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman serves as the WCS Core Reference Implementation. In our talk we present concepts, architecture, operational services, and standardization impact of open-source rasdaman, as well as experiences gained.
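    For readers unfamiliar with WCPS, the following is a minimal sketch of submitting a datacube query to a rasdaman-style WCPS endpoint over HTTP with the Python requests library. The service URL is a placeholder and the coverage name AvgLandTemp is assumed for illustration; the query itself is written in the WCPS language described above.

```python
# Minimal sketch: submit a WCPS query to an OGC WCPS endpoint (e.g. rasdaman).
# The endpoint URL and the coverage name "AvgLandTemp" are placeholders.
import requests

ENDPOINT = "http://example.org/rasdaman/ows"   # hypothetical service URL

# WCPS: condense a space/time datacube to the average value at one location
# over one year; the server does the work, the client gets back a scalar.
query = """
for c in (AvgLandTemp)
return avg(c[Lat(53.08), Long(8.80), ansi("2014-01":"2014-12")])
"""

resp = requests.get(ENDPOINT, params={
    "service": "WCS", "version": "2.0.1",
    "request": "ProcessCoverages", "query": query})
resp.raise_for_status()
print(resp.text)   # the averaged value computed server-side
```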

  16. Open Knee: Open Source Modeling and Simulation in Knee Biomechanics.

    PubMed

    Erdemir, Ahmet

    2016-02-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical functions of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor-intensive reproduction of model development steps can be avoided. Interested parties can immediately utilize readily available models for scientific discovery and clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes the detailed anatomical representation of the joint's major tissue structures and their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators are also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next-generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age

  17. Geospatial Authentication

    NASA Technical Reports Server (NTRS)

    Lyle, Stacey D.

    2009-01-01

    A software package has been designed to determine whether a rover (or rovers) is within a set of boundaries or a specific area before granting access to critical geospatial information, using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real time. The advantage lies in that the system only admits those within the designated geospatial boundaries or areas to the server.

  18. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
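    The abstract notes that simple experimental protocols can be entered as a CSV scripting file. The actual file format is defined in the project repository linked above, so the following is only a guessed sketch of the idea, assuming a hypothetical three-column layout (minutes, parameter, value) and a stand-in apply_setpoint function in place of the real bioreactor communication layer.

```python
# Minimal sketch of a CSV-driven setpoint protocol, loosely inspired by the
# record above. The column layout (minutes, parameter, value) and the
# apply_setpoint() stub are assumptions, not the BiofloSoftware format.
import csv
import time

def apply_setpoint(parameter, value):
    # Stand-in for the bioreactor communication layer.
    print(f"setting {parameter} to {value}")

def run_protocol(path):
    with open(path, newline="") as f:
        steps = sorted(csv.DictReader(f), key=lambda r: float(r["minutes"]))
    start = time.time()
    for step in steps:
        # Wait until the scheduled offset, then push the setpoint.
        wait = float(step["minutes"]) * 60 - (time.time() - start)
        if wait > 0:
            time.sleep(wait)
        apply_setpoint(step["parameter"], float(step["value"]))

if __name__ == "__main__":
    run_protocol("protocol.csv")   # rows such as: minutes,parameter,value
```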

  19. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828

  20. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storing, computing and analyzing. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and a distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity for the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains with spatial properties. We…
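    GISpark's own APIs are not documented in this record, so the following is only a generic PySpark sketch of the kind of parallel spatial filtering such a platform performs. The HDFS path, column names and bounding box are assumptions made for illustration.

```python
# Minimal PySpark sketch: parallel bounding-box filter over point records.
# The input path, column names and bounding box are assumptions; this shows
# the Spark pattern only, not GISpark's own interfaces.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("bbox-filter").getOrCreate()

points = spark.read.csv("hdfs:///data/points.csv", header=True, inferSchema=True)

# Keep only points inside an arbitrary bounding box (lon/lat in degrees).
in_box = points.filter(
    (col("lon") >= 116.0) & (col("lon") <= 117.0) &
    (col("lat") >= 39.5) & (col("lat") <= 40.5))

print(in_box.count())
spark.stop()
```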

  1. Borderless Geospatial Web (bolegweb)

    NASA Astrophysics Data System (ADS)

    Cetl, V.; Kliment, T.; Kliment, M.

    2016-06-01

    Effective access to and use of geospatial information (GI) resources is of critical importance in the modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate the discovery and use of geospatial data. This data is stored in databases located in a layer called the invisible web and is thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally different from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, in order to enrich its information extent. A public, global, and user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context, bridges the geospatial web from the end-user perspective, and thus opens its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb in Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to be finished by the end of 2016. This paper provides an overview of the project, the research questions and methodology, the results achieved so far, and future steps.
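    The catalogue (CSW) discovery step that the crosswalk builds on can be illustrated with a short Python sketch using OWSLib. The catalogue URL and the search keyword are placeholders, and OWSLib is used here only as a generic CSW client; it is not the BOLEGWEB crosswalk itself.

```python
# Minimal sketch: discover geospatial resources through an OGC CSW catalogue
# using OWSLib. The catalogue URL and keyword are placeholders; this shows the
# baseline discovery step, not the BOLEGWEB crosswalk.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("http://example.org/csw")   # hypothetical endpoint

query = PropertyIsLike("csw:AnyText", "%flood%")      # free-text constraint
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, rec.title)
```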

  2. Open Data, Jupyter Notebooks and Geospatial Data Standards Combined - Opening up large volumes of marine and climate data to other communities

    NASA Astrophysics Data System (ADS)

    Clements, O.; Siemen, S.; Wagemann, J.

    2017-12-01

    The EU-funded EarthServer-2 project aims to offer on-demand access to large volumes of environmental data (Earth Observation, Marine, Climate and Planetary data) via the Web Coverage Service interface standard defined by the Open Geospatial Consortium. Providing access to data via OGC web services (e.g. WCS and WMS) has the potential to open up services to a wider audience, especially to users outside the respective communities. WCS 2.0 in particular, with its processing extension, the Web Coverage Processing Service (WCPS), is highly beneficial for making large volumes accessible to non-expert communities. Users do not have to deal with custom community data formats, such as GRIB for the meteorological community, but can directly access the data in a format they are more familiar with, such as NetCDF, JSON or CSV. Data requests can further be integrated directly into custom processing routines, and users are no longer required to download gigabytes of data. WCS supports trim (reduction of data extent) and slice (reduction of data dimension) operations on multi-dimensional data, providing users with very flexible on-demand access to the data. WCPS allows the user to craft queries to run on the data using a text-based query language similar to SQL. These queries can be very powerful, but the more processing-intensive the task, the more complex the query. As part of the EarthServer-2 project, we developed a Python library that helps users generate complex WCPS queries with Python, a programming language they are more familiar with. The interactive presentation aims to give practical examples of how users can benefit from two specific WCS services from the Marine and Climate communities. Use cases from the two communities will show different approaches to taking advantage of a Web Coverage (Processing) Service. The entire content is available as Jupyter Notebooks, as they prove to be a highly beneficial tool…
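    The project's own Python query-builder library is not reproduced here; the sketch below only illustrates the trim and slice operations mentioned above as a plain WCS 2.0.1 GetCoverage request issued with the requests library. The service URL, coverage identifier and axis names are placeholders.

```python
# Minimal sketch: a WCS 2.0.1 GetCoverage request with trim (extent reduction)
# and slice (dimension reduction) subsets, as described in the record above.
# The service URL, coverage id and axis names are placeholders.
import requests

ENDPOINT = "https://example.org/wcs"   # hypothetical EarthServer-style endpoint

params = [
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("request", "GetCoverage"),
    ("coverageId", "sea_surface_temperature"),      # placeholder coverage
    ("subset", "Lat(30,60)"),                       # trim: reduce extent
    ("subset", "Long(-20,20)"),                     # trim: reduce extent
    ("subset", 'ansi("2017-01-15T00:00:00Z")'),     # slice: drop time dimension
    ("format", "application/netcdf"),
]

resp = requests.get(ENDPOINT, params=params)
resp.raise_for_status()
with open("subset.nc", "wb") as f:
    f.write(resp.content)   # a small NetCDF subset instead of the full datacube
```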

  3. Beyond Open Source: According to Jim Hirsch, Open Technology, Not Open Source, Is the Wave of the Future

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    This article presents an interview with Jim Hirsch, an associate superintendent for technology at Plano Independent School District in Plano, Texas. Hirsch serves as a liaison for the open technologies committee of the Consortium for School Networking. In this interview, he shares his opinion on the significance of open source in K-12.

  4. Open Source Live Distributions for Computer Forensics

    NASA Astrophysics Data System (ADS)

    Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele

    Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment) that contains a collection of tools wrapped up into a user friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that drives digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.

  5. OnEarth: An Open Source Solution for Efficiently Serving High-Resolution Mapped Image Products

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Plesea, L.; Hall, J. R.; Roberts, J. T.; Cechini, M. F.; Schmaltz, J. E.; Alarcon, C.; Huang, T.; McGann, J. M.; Chang, G.; Boller, R. A.; Ilavajhala, S.; Murphy, K. J.; Bingham, A. W.

    2013-12-01

    This presentation introduces OnEarth, a server side software package originally developed at the Jet Propulsion Laboratory (JPL), that facilitates network-based, minimum-latency geolocated image access independent of image size or spatial resolution. The key component in this package is the Meta Raster Format (MRF), a specialized raster file extension to the Geospatial Data Abstraction Library (GDAL) consisting of an internal indexed pyramid of image tiles. Imagery to be served is converted to the MRF format and made accessible online via an expandable set of server modules handling requests in several common protocols, including the Open Geospatial Consortium (OGC) compliant Web Map Tile Service (WMTS) as well as Tiled WMS and Keyhole Markup Language (KML). OnEarth has recently transitioned to open source status and is maintained and actively developed as part of GIBS (Global Imagery Browse Services), a collaborative project between JPL and Goddard Space Flight Center (GSFC). The primary function of GIBS is to enhance and streamline the data discovery process and to support near real-time (NRT) applications via the expeditious ingestion and serving of full-resolution imagery representing science products from across the NASA Earth Science spectrum. Open source software solutions are leveraged where possible in order to utilize existing available technologies, reduce development time, and enlist wider community participation. We will discuss some of the factors and decision points in transitioning OnEarth to a suitable open source paradigm, including repository and licensing agreement decision points, institutional hurdles, and perceived benefits. We will also provide examples illustrating how OnEarth is integrated within GIBS and other applications.
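    Since the Meta Raster Format described above is exposed through a GDAL driver in recent GDAL releases, a small Python sketch can show the basic conversion step. Filenames, creation options and the overview levels are illustrative assumptions; the actual GIBS/OnEarth ingest pipeline uses its own tooling.

```python
# Minimal sketch: convert a GeoTIFF into a tiled Meta Raster Format (MRF) file
# with GDAL's Python bindings (recent GDAL builds ship an MRF driver).
# Filenames and creation options are illustrative, not the OnEarth/GIBS setup.
from osgeo import gdal

gdal.UseExceptions()

gdal.Translate(
    "imagery.mrf",                 # writes the MRF plus its index/data files
    "imagery.tif",                 # placeholder input GeoTIFF
    format="MRF",
    creationOptions=["COMPRESS=JPEG", "BLOCKSIZE=512"],
)

# Optional: build the internal overview pyramid used for fast tiled access.
ds = gdal.Open("imagery.mrf", gdal.GA_Update)
ds.BuildOverviews("AVERAGE", [2, 4, 8, 16])
ds = None
```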

  6. Implementation of an OAIS Repository Using Free, Open Source Software

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Gessler, P. E.; Seamon, E.

    2015-12-01

    The Northwest Knowledge Network (NKN) is a regional data repository located at the University of Idaho that focuses on the collection, curation, and distribution of research data. To support our home institution and others in the region, we offer services to researchers at all stages of the data lifecycle—from grant application and data management planning to data distribution and archive. In this role, we recognize the need to work closely with other data management efforts at partner institutions and agencies, as well as with larger aggregation efforts such as our state geospatial data clearinghouses, data.gov, DataONE, and others. In the past, one of our challenges with monolithic, prepackaged data management solutions is that customization can be difficult to implement and maintain, especially as new versions of the software are released that are incompatible with our local codebase. Our solution is to break the monolith up into its constituent parts, which offers us several advantages. First, any customizations that we make are likely to fall into areas that can be accessed through Application Program Interfaces (API) that are likely to remain stable over time, so our code stays compatible. Second, as components become obsolete or insufficient to meet new demands that arise, we can replace the individual components with minimal effect on the rest of the infrastructure, causing less disruption to operations. Other advantages include increased system reliability, staggered rollout of new features, enhanced compatibility with legacy systems, reduced dependence on a single software company as a point of failure, and the separation of development into manageable tasks. In this presentation, we describe our application of the Service Oriented Architecture (SOA) design paradigm to assemble a data repository that conforms to the Open Archival Information System (OAIS) Reference Model primarily using a collection of free and open-source software. We detail the design

  7. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is important to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: the developer-oriented source code and a QGIS plugin. These tools were produced within the SENSUM project framework (which ended in December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance are guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014
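    The SENSUM workflows themselves live in the released source code and QGIS plugin; the toy sketch below only illustrates the simplest of the proxies mentioned above, the built-up fraction of a scene, computed from a hypothetical classified raster in which the value 1 is assumed to mark built-up pixels and 0 to mark nodata.

```python
# Toy sketch of one vulnerability proxy from the record above: the fraction of
# built-up area in a classified raster. The input file and the conventions
# (1 = built-up, 0 = nodata) are assumptions, not the SENSUM implementation.
import numpy as np
from osgeo import gdal

gdal.UseExceptions()

ds = gdal.Open("classified_scene.tif")          # hypothetical classification result
classes = ds.GetRasterBand(1).ReadAsArray()

built_up = np.count_nonzero(classes == 1)       # assumed built-up class code
valid = np.count_nonzero(classes != 0)          # assumed 0 = nodata/background

print(f"built-up fraction: {built_up / valid:.3f}")
```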

  8. Spatial rainfall data in open source environment

    NASA Astrophysics Data System (ADS)

    Schuurmans, Hanneke; Maarten Verbree, Jan; Leijnse, Hidde; van Heeringen, Klaas-Jan; Uijlenhoet, Remko; Bierkens, Marc; van de Giesen, Nick; Gooijer, Jan; van den Houten, Gert

    2013-04-01

    Since January 2013, the Netherlands has had access to innovative, high-quality rainfall data used by water managers. The product is innovative for the following reasons. (i) The product was developed in a 'golden triangle' construction - a cooperation between government, business and research. (ii) The rainfall products are developed under the open-source GPL license. The initiative comes from a group of water boards in the Netherlands that joined forces to fund the development of a new rainfall product. Not only data from Dutch radar stations is used (as is currently done by the Dutch meteorological organisation KNMI), but also data from radars in Germany and Belgium. After a radar composite is made, it is adjusted using data from rain gauges (ground truth). This results in 9 different rainfall products that give, for each moment, the best rainfall data. Specific knowledge is necessary to develop these kinds of products, so a pool of experts (KNMI, Deltares and 3 universities) participated in the development. The philosophy of the developers (the cooperating organisations) is that products like this should be developed as open source. This way knowledge is shared and the whole community is able to make suggestions for improvement. In our opinion this is the only way to make real progress in product development. Furthermore, the financial resources of government organisations are optimized. More info (in Dutch): www.nationaleregenradar.nl
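    The record states that the radar composite is adjusted with rain gauge data but does not spell out the method. One common, simple form of such an adjustment is a mean field bias correction, sketched below on synthetic numpy arrays purely as an illustration of the general idea; it is not the algorithm used in the Dutch national rainfall product.

```python
# Illustrative sketch of a mean field bias adjustment: scale a radar rainfall
# composite so that, on average, it matches co-located rain gauge totals.
# Synthetic numbers only; not the Dutch national rain radar algorithm.
import numpy as np

radar = np.array([[1.2, 0.8, 2.0],
                  [0.5, 1.6, 0.9],
                  [0.0, 2.4, 1.1]])       # radar rainfall estimate (mm)

# Gauge observations keyed by the radar pixel (row, col) they fall in.
gauges = {(0, 1): 1.0, (1, 1): 2.0, (2, 2): 1.5}

radar_at_gauges = np.array([radar[ij] for ij in gauges])
gauge_values = np.array(list(gauges.values()))

bias = gauge_values.sum() / radar_at_gauges.sum()   # single multiplicative factor
adjusted = radar * bias

print(f"mean field bias factor: {bias:.2f}")
print(adjusted)
```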

  9. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high-quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
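    The libraries described in this record are written in JavaScript and Java; as a language-neutral illustration of one of the statistics they compute, the sketch below estimates the Gutenberg-Richter b-value with Aki's (1965) maximum-likelihood formula on made-up magnitudes. The magnitude of completeness and binning width are assumed values.

```python
# Illustration of one statistic mentioned in the record: the Gutenberg-Richter
# b-value, via Aki's (1965) maximum-likelihood estimator with a binning
# correction: b = log10(e) / (mean(M) - (Mc - dM/2)).
# Magnitudes, Mc and dM are made up; this is not the referenced library code.
import numpy as np

magnitudes = np.array([2.1, 2.4, 2.2, 3.0, 2.6, 2.3, 2.8, 2.2, 2.5, 3.4])
mc = 2.0      # assumed magnitude of completeness
dm = 0.1      # assumed magnitude binning width

above = magnitudes[magnitudes >= mc]
b_value = np.log10(np.e) / (above.mean() - (mc - dm / 2))

print(f"b-value estimate: {b_value:.2f} from {above.size} events above Mc={mc}")
```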

  10. Open Source Hardware for DIY Environmental Sensing

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Hicks, S. D.; Damiano, S. G.; Montgomery, D. S.

    2014-12-01

    The Arduino open source electronics platform has been very popular within the DIY (Do It Yourself) community for several years, and it is now providing environmental science researchers with an inexpensive alternative to commercial data logging and transmission hardware. Here we present the designs for our latest series of custom Arduino-based dataloggers, which include wireless communication options like self-meshing radio networks and cellular phone modules. The main Arduino board uses a custom interface board to connect to various research-grade sensors to take readings of turbidity, dissolved oxygen, water depth and conductivity, soil moisture, solar radiation, and other parameters. Sensors with SDI-12 communications can be directly interfaced to the logger using our open Arduino-SDI-12 software library (https://github.com/StroudCenter/Arduino-SDI-12). Different deployment options are shown, like rugged enclosures to house the loggers and rigs for mounting the sensors in both fresh water and marine environments. After the data has been collected and transmitted by the logger, it is received by a MySQL-PHP stack running on a web server that can be accessed from anywhere in the world. Once there, the data can be visualized on web pages or served through REST requests and WaterOneFlow (WOF) services. Since one of the main benefits of using open source hardware is the easy collaboration between users, we are introducing a new web platform for discussion and sharing of ideas and plans for hardware and software designs used with DIY environmental sensors and data loggers.

  11. Geospatial Authentication

    NASA Technical Reports Server (NTRS)

    Lyle, Stacey D.

    2009-01-01

    A software package has been developed that determines whether a rover (or rovers) is within a set of boundaries or a specific area before granting access to critical geospatial information, using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real time. The advantage lies in that the system only admits those within the designated geospatial boundaries or areas to the server. The Geospatial Authentication software has two parts: server and client. The server software is a virtual private network (VPN) developed on the Linux operating system using the Perl programming language. The server can be a stand-alone VPN server or can be combined with other applications and services. The client software is a GUI Windows CE application, or Mobile Graphical Software, that allows users to authenticate into a network. The purpose of the client software is to pass the needed satellite information to the server for authentication.
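    The actual package validates GPS signal structures and runs a Perl-based VPN server, neither of which is shown here; the sketch below only illustrates the geospatial test at the core of the idea, checking whether a reported position falls inside an authorized boundary polygon, using shapely with made-up coordinates.

```python
# Conceptual sketch of the boundary test at the core of the record above:
# admit a device only if its reported GPS position lies inside an authorized
# polygon. Coordinates are made up; GPS signal validation and the VPN server
# of the real package are not shown.
from shapely.geometry import Point, Polygon

AUTHORIZED_AREA = Polygon([
    (-97.40, 27.70), (-97.30, 27.70),
    (-97.30, 27.80), (-97.40, 27.80),
])  # hypothetical lon/lat boundary

def admit(lon, lat):
    """Return True if the reported position is inside the authorized area."""
    return AUTHORIZED_AREA.contains(Point(lon, lat))

print(admit(-97.35, 27.75))   # True  -> grant access to the server
print(admit(-97.10, 27.75))   # False -> reject the connection
```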

  12. An open source Java web application to build self-contained Web GIS sites

    NASA Astrophysics Data System (ADS)

    Zavala Romero, O.; Ahmed, A.; Chassignet, E.; Zavala-Hidalgo, J.

    2014-12-01

    This work describes OWGIS, an open source Java web application that creates Web GIS sites by automatically writing HTML and JavaScript code. OWGIS is configured by XML files that define which layers (geographic datasets) will be displayed on the websites. The project uses several Open Geospatial Consortium standards to request data from typical map servers, such as GeoServer, and is also able to request data from ncWMS servers. The latter allows for the display of 4D data stored in the NetCDF file format (widely used for storing environmental model datasets). Some of the features available on the sites built with OWGIS are: multiple languages, animations, vertical profiles and vertical transects, color palettes, color ranges, and the ability to download data. OWGIS's main users are scientists, such as oceanographers or climate scientists, who store their data in NetCDF files and want to analyze, visualize, share, or compare their data using a website.

  13. Open Source Service Agent (OSSA) in the intelligence community's Open Source Architecture

    NASA Technical Reports Server (NTRS)

    Fiene, Bruce F.

    1994-01-01

    The Community Open Source Program Office (COSPO) has developed an architecture for the intelligence community's new Open Source Information System (OSIS). The architecture is a multi-phased program featuring connectivity, interoperability, and functionality. OSIS is based on a distributed architecture concept. The system is designed to function as a virtual entity. OSIS will be a restricted (non-public), user configured network employing Internet communications. Privacy and authentication will be provided through firewall protection. Connection to OSIS can be made through any server on the Internet or through dial-up modems provided the appropriate firewall authentication system is installed on the client.

  14. Developing a GIS for CO2 analysis using lightweight, open source components

    NASA Astrophysics Data System (ADS)

    Verma, R.; Goodale, C. E.; Hart, A. F.; Kulawik, S. S.; Law, E.; Osterman, G. B.; Braverman, A.; Nguyen, H. M.; Mattmann, C. A.; Crichton, D. J.; Eldering, A.; Castano, R.; Gunson, M. R.

    2012-12-01

    There are advantages to approaching the realm of geographic information systems (GIS) using lightweight, open source components in place of a more traditional web map service (WMS) solution. Rapid prototyping, schema-less data storage, the flexible interchange of components, and open source community support are just some of the benefits. In our effort to develop an application supporting the geospatial and temporal rendering of remote sensing carbon-dioxide (CO2) data for the CO2 Virtual Science Data Environment project, we have connected heterogeneous open source components together to form a GIS. Utilizing widely popular open source components including the schema-less database MongoDB, Leaflet interactive maps, the HighCharts JavaScript graphing library, and Python Bottle web-services, we have constructed a system for rapidly visualizing CO2 data with reduced up-front development costs. These components can be aggregated together, resulting in a configurable stack capable of replicating features provided by more standard GIS technologies. The approach we have taken is not meant to replace the more established GIS solutions, but to instead offer a rapid way to provide GIS features early in the development of an application and to offer a path towards utilizing more capable GIS technology in the future.
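    A minimal sketch of the lightweight stack described above is given below: a Python Bottle web service that pulls CO2 records from a schema-less MongoDB collection and returns JSON for a Leaflet/Highcharts front end. The database, collection and field names are assumptions for illustration, not the CO2 Virtual Science Data Environment code.

```python
# Minimal sketch of a lightweight GIS backend in the spirit of the record:
# a Bottle web service reading CO2 records from MongoDB and returning JSON.
# Database, collection and field names are assumptions.
import json

from bottle import Bottle, request, response
from pymongo import MongoClient

app = Bottle()
collection = MongoClient("mongodb://localhost:27017")["co2"]["soundings"]

@app.route("/co2")
def co2_in_bbox():
    # e.g. /co2?min_lat=30&max_lat=40&min_lon=-120&max_lon=-110
    q = {k: float(request.query.get(k)) for k in
         ("min_lat", "max_lat", "min_lon", "max_lon")}
    docs = collection.find(
        {"lat": {"$gte": q["min_lat"], "$lte": q["max_lat"]},
         "lon": {"$gte": q["min_lon"], "$lte": q["max_lon"]}},
        {"_id": 0})
    response.content_type = "application/json"
    return json.dumps(list(docs))

if __name__ == "__main__":
    app.run(host="localhost", port=8081)
```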

  15. Delineating Heavy Metal Sources with Environmental Magnetism, X-Ray Fluorescence, and Geospatial Analysis: Baton Rouge, Louisiana

    NASA Astrophysics Data System (ADS)

    Taylor, Delmetria

    The objective of this study is to detect the presence of anthropogenic magnetic particles by measuring the magnetic signature of soil samples in Baton Rouge, LA. Baton Rouge is currently ranked third among the United States' most polluted cities, so it is necessary to locate the sources of the problem and find solutions (U.S. Environmental Protection Agency [EPA], 2014). Magnetic susceptibility is a property that is easily, quickly, and inexpensively determined. It provides a highly sensitive measurement of the compositional changes of minerals in soil. Magnetic susceptibility is influenced both anthropogenically and naturally by lithological and pedological heavy metal content. In this study, roughly 260 km2 (110 mi2) were sampled within the Baton Rouge, Louisiana area, and multiple environmental settings were covered. Geospatial and X-ray fluorescence (XRF) analyses were used to correlate magnetic susceptibility measurements and indicate increased anthropogenic activity near major industrial areas and interstates. Soil samples were analyzed, and isothermal remanent magnetization (IRM) acquisition curves indicate the presence of high- and low-coercivity minerals, presumably magnetite and hematite. Enhanced susceptibility measurements do not appear to be dominated by lithology or soil in this particular area.

  16. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  17. Environmental Remote Sensing Analysis Using Open Source Virtual Earths and Public Domain Imagery

    NASA Astrophysics Data System (ADS)

    Pilant, A. N.; Worthy, L. D.

    2008-12-01

    Human activities increasingly impact natural environments. Globally, many ecosystems are stressed to unhealthy limits, leading to loss of valuable ecosystem services- economic, ecologic and intrinsic. Virtual earths (virtual globes) (e.g., NASA World Wind, ossimPlanet, ArcGIS Explorer, Google Earth, Microsoft Virtual Earth) are geospatial data integration tools that can aid our efforts to understand and protect the environment. Virtual earths provide unprecedented desktop views of our planet, not only to professional scientists, but also to citizen scientists, students, environmental stewards, decision makers, urban developers and planners. Anyone with a broadband internet connection can explore the planet virtually, due in large part to freely available open source software and public domain imagery. This has at least two important potential benefits. One, individuals can study the planet from the visually intuitive perspective of the synoptic aerial view, promoting environmental awareness and stewardship. Two, it opens up the possibility of harnessing the in situ knowledge and observations of citizen scientists familiar with landscape conditions in their locales. Could this collective knowledge be harnessed (crowd sourcing) to validate and quality assure land cover and other maps? In this presentation we present examples using public domain imagery and two open source virtual earths to highlight some of the functionalities currently available. OssimPlanet is used to view aerial data from the USDA Geospatial Data Gateway. NASA World Wind is used to extract georeferenced high resolution USGS urban area orthoimagery. ArcGIS Explorer is used to demonstrate an example of image analysis using web processing services. The research presented here was conducted under the Environmental Feature Finder project of the Environmental Protection Agency's Advanced Monitoring Initiative. Although this work was reviewed by EPA and approved for publication, it may not necessarily

  18. Shipping Science Worldwide with Open Source Containers

    NASA Astrophysics Data System (ADS)

    Molineaux, J. P.; McLaughlin, B. D.; Pilone, D.; Plofchan, P. G.; Murphy, K. J.

    2014-12-01

    Scientific applications often present difficult web-hosting needs. Their compute- and data-intensive nature, as well as an increasing need for high availability and distribution, combine to create a challenging set of hosting requirements. In the past year, advancements in container-based virtualization and related tooling have offered new lightweight and flexible ways to accommodate diverse applications with all the isolation and portability benefits of traditional virtualization. This session will introduce and demonstrate an open-source, single-interface Platform-as-a-Service (PaaS) that empowers application developers to seamlessly leverage geographically distributed, public and private compute resources to achieve highly available, performant hosting for scientific applications.

  19. Building Energy Management Open Source Software

    SciTech Connect

    Rahman, Saifur

    Funded by the U.S. Department of Energy in November 2013, a Building Energy Management Open Source Software (BEMOSS) platform was engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. According to the Energy Information Administration (EIA), small (5,000 square feet or smaller) and medium-sized (between 5,001 and 50,000 square feet) commercial buildings constitute about 95% of all commercial buildings in the U.S. These buildings typically do not have Building Automation Systems (BAS) to monitor and control building operation. While commercial BAS solutions exist, including those from Siemens, Honeywell, Johnson Controls and many more, they are not cost-effective in the context of small- and medium-sized commercial buildings, and typically work only with specific controller products from the same company. BEMOSS targets small- and medium-sized commercial buildings to address this gap.

  20. An open source business model for malaria.

    PubMed

    Årdal, Christine; Røttingen, John-Arne

    2015-01-01

    Greater investment is required in developing new drugs and vaccines against malaria in order to eradicate the disease. These precious funds must be carefully managed to achieve the greatest impact. We evaluate existing efforts to discover and develop new drugs and vaccines for malaria to determine how malaria R&D can best benefit from an enhanced open source approach and how such a business model may operate. We assessed research articles, patents and clinical trials, and conducted a small survey among malaria researchers. Our results demonstrate that the public and philanthropic sectors are financing and performing the majority of malaria drug/vaccine discovery and development, but are then restricting access through patents, 'closed' publications and physical specimens hidden away. This makes little sense since it is also the public and philanthropic sector that purchases the drugs and vaccines. We recommend that a more "open source" approach be taken, making the entire value chain more efficient through greater transparency, which may lead to more extensive collaborations. This can, for example, be achieved by empowering an existing organization like the Medicines for Malaria Venture (MMV) to act as a clearing house for malaria-related data. The malaria researchers that we surveyed indicated that they would use such registry data to increase collaboration. Finally, we question the utility of publicly or philanthropically funded patents for malaria medicines, where little to no profit is available. Malaria R&D benefits from a publicly and philanthropically funded architecture, which starts with academic research institutions, product development partnerships, commercialization assistance through UNITAID and finally procurement through mechanisms like The Global Fund to Fight AIDS, Tuberculosis and Malaria and the U.S. President's Malaria Initiative. We believe that a fresh look should be taken at the cost/benefit of patents, particularly related to new malaria…

  1. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…

  2. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    ERIC Educational Resources Information Center

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  3. The Open Source Teaching Project (OSTP): Research Note.

    ERIC Educational Resources Information Center

    Hirst, Tony

    The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…

  4. An Analysis of Open Source Security Software Products Downloads

    ERIC Educational Resources Information Center

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  5. The Open Source Snowpack modelling ecosystem

    NASA Astrophysics Data System (ADS)

    Bavay, Mathias; Fierz, Charles; Egger, Thomas; Lehning, Michael

    2016-04-01

    As a large number of numerical snow models are available, a few stand out as quite mature and widespread. One such model is SNOWPACK, the Open Source model that is developed at the WSL Institute for Snow and Avalanche Research SLF. Over the years, various tools have been developed around SNOWPACK in order to expand its use or to integrate additional features. Today, the model is part of a whole ecosystem that has evolved to offer both seamless integration and high modularity, so each tool can easily be used outside the ecosystem. Many of these Open Source tools experience their own, autonomous development and are successfully used in their own right in other models and applications. There is Alpine3D, the spatially distributed version of SNOWPACK, that forces it with terrain-corrected radiation fields and optionally with blowing and drifting snow. This model can be used on parallel systems (either with OpenMP or MPI) and has been used for applications ranging from climate change to reindeer herding. There is the MeteoIO pre-processing library that offers fully integrated data access, data filtering, data correction, data resampling and spatial interpolations. This library is now used by several other models and applications. There is the SnopViz snow profile visualization library and application that supports both measured and simulated snow profiles (relying on the CAAML standard) as well as time series. This JavaScript application can be used standalone without any internet connection or served on the web together with simulation results. There is the OSPER data platform effort with a data management service (built on the Global Sensor Network (GSN) platform) as well as a data documenting system (metadata management as a wiki). There are several distributed hydrological models for mountainous areas in ongoing development that require very little information about the soil structure, based on the assumption that in steep terrain the most relevant information is…

  6. Integrating semantic web technologies and geospatial catalog services for geospatial information discovery and processing in cyberinfrastructure

    SciTech Connect

    Yue, Peng; Gong, Jianya; Di, Liping

    A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.

  7. Some legal concerns with the use of crowd-sourced Geospatial Information

    NASA Astrophysics Data System (ADS)

    Cho, George

    2014-06-01

    Volunteered Geographic Information (VGI), citizens as sensors, crowd-sourcing and the 'Wikipedia' of maps have been used to describe activity facilitated by the Internet and the dynamic Web 2.0 environment to collect geographic information (GI). Legal concerns raised by the creation, assembly and dissemination of GI by produsers include quality, ownership and liability. In detail, the accuracy and authoritativeness of crowd-sourced GI, the ownership of and moral rights to the information, and contractual and tort liability are key concerns. A legal framework and governance structure may be necessary whereby technology, networked governance and the provision of legal protections may be combined to mitigate geo-liability as a 'chilling' factor in VGI development.

  8. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    NASA Astrophysics Data System (ADS)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems do not respect administrative or political boundaries, and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems

  9. Finding geospatial pattern of unstructured data by clustering routes

    NASA Astrophysics Data System (ADS)

    Boustani, M.; Mattmann, C. A.; Ramirez, P.; Burke, W.

    2016-12-01

    Today the majority of data generated has a geospatial context to it, either in attribute form as a latitude or longitude, as a location name, or as something cross-referenceable using other means such as an external gazetteer or location service. Our research is interested in exploiting geospatial location and context in unstructured data such as that found on the web in HTML pages, images, videos, documents, and other areas, and in structured information repositories found on intranets, in scientific environments, and otherwise. We are working together on the DARPA MEMEX project to exploit open source software tools such as the Lucene Geo Gazetteer, Apache Tika, Apache Lucene, and Apache OpenNLP, to automatically extract and make meaning out of geospatial information. In particular, we are interested in unstructured descriptors, e.g., a phone number or a named entity, and the ability to automatically learn geospatial paths related to these descriptors. For example, a particular phone number may represent an entity that travels on a monthly basis, according to patterns that are sometimes easily identifiable and sometimes more difficult to track. We will present a set of automatic techniques to extract descriptors, and then to geospatially infer their paths across unstructured data.
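
    The following self-contained Python sketch illustrates the kind of descriptor extraction described above, using a regular expression for phone numbers and a tiny in-memory gazetteer for place names; the actual pipeline relies on Apache Tika, OpenNLP and the Lucene Geo Gazetteer, and the pattern and gazetteer entries here are purely hypothetical.

      import re

      GAZETTEER = {  # hypothetical, minimal gazetteer: name -> (lat, lon)
          "Pasadena": (34.1478, -118.1445),
          "Kathmandu": (27.7172, 85.3240),
      }
      PHONE_RE = re.compile(r"\+?\d[\d\-\s]{7,}\d")

      def extract_descriptors(text):
          """Return phone-number descriptors and resolved place names found in the text."""
          phones = PHONE_RE.findall(text)
          places = [(name, GAZETTEER[name]) for name in GAZETTEER if name in text]
          return phones, places

      text = "Contact +1 626-555-0142; last seen travelling from Pasadena to Kathmandu."
      print(extract_descriptors(text))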

  10. Developing an Open Source Option for NASA Software

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Parks, John W. (Technical Monitor)

    2003-01-01

    We present arguments in favor of developing an Open Source option for NASA software; in particular we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.

  11. Making geospatial data in ASF archive readily accessible

    NASA Astrophysics Data System (ADS)

    Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.

    2015-12-01

    The way geospatial data is searched, managed, processed and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces that were specifically developed for its particular archive and data sets. ASF then moved to using an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data. Yet it was up to the user to develop the tools to get more tailored access to the data they needed. We present two new approaches for serving data to users. In response to the recent Nepal earthquake, we developed a data feed for distributing ESA's Sentinel data. Users can subscribe to the data feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach was an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface to include the geospatial information in the archive directly into the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). Marquez, A., 2015. PostGIS essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide, Packt Publishing, 350 p.
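
    A hedged sketch of consuming such an OGC WFS from Python with the OWSLib package is shown below; the service URL and feature type name are hypothetical placeholders, not ASF's actual endpoint.

      from owslib.wfs import WebFeatureService

      # Hypothetical GeoServer WFS endpoint
      wfs = WebFeatureService("https://example.org/geoserver/wfs", version="1.1.0")
      print(list(wfs.contents))  # feature types advertised by the server

      # Hypothetical feature type holding granule metadata and download links
      resp = wfs.getfeature(typename="asf:granules", maxfeatures=5)
      print(resp.read()[:500])   # first bytes of the returned GML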

  12. An Open Source Business Model for Malaria

    PubMed Central

    Årdal, Christine; Røttingen, John-Arne

    2015-01-01

    Greater investment is required in developing new drugs and vaccines against malaria in order to eradicate the disease. These precious funds must be carefully managed to achieve the greatest impact. We evaluate existing efforts to discover and develop new drugs and vaccines for malaria to determine how best malaria R&D can benefit from an enhanced open source approach and how such a business model may operate. We assessed research articles, patents and clinical trials, and conducted a smaller survey among malaria researchers. Our results demonstrate that the public and philanthropic sectors are financing and performing the majority of malaria drug/vaccine discovery and development, but are then restricting access through patents, ‘closed’ publications and hidden-away physical specimens. This makes little sense since it is also the public and philanthropic sector that purchases the drugs and vaccines. We recommend that a more “open source” approach be taken by making the entire value chain more efficient through greater transparency, which may lead to more extensive collaborations. This can, for example, be achieved by empowering an existing organization like the Medicines for Malaria Venture (MMV) to act as a clearing house for malaria-related data. The malaria researchers that we surveyed indicated that they would utilize such registry data to increase collaboration. Finally, we question the utility of publicly or philanthropically funded patents for malaria medicines, where little to no profits are available. Malaria R&D benefits from a publicly and philanthropically funded architecture, which starts with academic research institutions, product development partnerships, commercialization assistance through UNITAID and finally procurement through mechanisms like The Global Fund to Fight AIDS, Tuberculosis and Malaria and the U.S.’ President’s Malaria Initiative. We believe that a fresh look should be taken at the cost/benefit of patents particularly related to new

  13. Open Source Software Projects Needing Security Investments

    DTIC Science & Technology

    2015-06-19

    modtls, BouncyCastle, gpg, otr, axolotl. 7. Static analyzers: Clang, Frama-C. 8. Nginx. 9. OpenVPN . It was noted that the funding model may be similar...to OpenSSL, where consulting funds the company. It was also noted that OpenVPN needs to correctly use OpenSSL in order to be secure, so focusing on...Dovecot 4. Other high-impact network services: OpenSSH, OpenVPN , BIND, ISC DHCP, University of Delaware NTPD 5. Core infrastructure data parsers

  14. The successes and challenges of open-source biopharmaceutical innovation.

    PubMed

    Allarakhia, Minna

    2014-05-01

    Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human capacity development challenges associated with skill development around big data usage; and the data-material access challenge associated with data and material access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. It is the author's opinion that the assessment of when and how value creation will occur, through open-source biopharmaceutical innovation, is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of now big data-based open-source initiatives. The continued focus on the early-stage value creation is not advisable. Instead, it would be more advisable to adopt an approach where stakeholders transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.

  15. OpenMx: An Open Source Extended Structural Equation Modeling Framework

    ERIC Educational Resources Information Center

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-01-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…

  16. XNAT Central: Open sourcing imaging research data.

    PubMed

    Herrick, Rick; Horton, William; Olsen, Timothy; McKay, Michael; Archie, Kevin A; Marcus, Daniel S

    2016-01-01

    XNAT Central is a publicly accessible medical imaging data repository based on the XNAT open-source imaging informatics platform. It hosts a wide variety of research imaging data sets. The primary motivation for creating XNAT Central was to provide a central repository to host and provide access to a wide variety of neuroimaging data. In this capacity, XNAT Central hosts a number of data sets from research labs and investigative efforts from around the world, including the OASIS Brains imaging studies, the NUSDAST study of schizophrenia, and more. Over time, XNAT Central has expanded to include imaging data from many different fields of research, including oncology, orthopedics, cardiology, and animal studies, but continues to emphasize neuroimaging data. Through the use of XNAT's DICOM metadata extraction capabilities, XNAT Central provides a searchable repository of imaging data that can be referenced by groups, labs, or individuals working in many different areas of research. The future development of XNAT Central will be geared towards greater ease of use as a reference library of heterogeneous neuroimaging data and associated synthetic data. It will also become a tool for making data available supporting published research and academic articles. Copyright © 2015 Elsevier Inc. All rights reserved.
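
    As a hedged illustration of the kind of DICOM metadata extraction such repositories build on, the sketch below uses the open source pydicom library; the file name is hypothetical and the printed tags are only present if the file defines them.

      import pydicom

      ds = pydicom.dcmread("scan.dcm")  # hypothetical DICOM file
      # Standard DICOM attributes commonly used to index imaging sessions
      print(ds.Modality, ds.StudyDate, ds.SeriesDescription)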

  17. An open-source laser electronics suite

    NASA Astrophysics Data System (ADS)

    Pisenti, Neal C.; Reschovsky, Benjamin J.; Barker, Daniel S.; Restelli, Alessandro; Campbell, Gretchen K.

    2016-05-01

    We present an integrated set of open-source electronics for controlling external-cavity diode lasers and other instruments in the laboratory. The complete package includes a low-noise circuit for driving high-voltage piezoelectric actuators, an ultra-stable current controller based on a previously published design, and a high-performance, multi-channel temperature controller capable of driving thermo-electric coolers or resistive heaters. Each circuit (with the exception of the temperature controller) is designed to fit in a Eurocard rack equipped with a low-noise linear power supply capable of driving up to 5 A at +/- 15 V. A custom backplane allows signals to be shared between modules, and a digital communication bus makes the entire rack addressable by external control software over TCP/IP. The modular architecture makes it easy for additional circuits to be designed and integrated with existing electronics, providing a low-cost, customizable alternative to commercial systems without sacrificing performance.

  18. Open source integrated modeling environment Delta Shell

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to address with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remain challenging tasks. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse separately from each other as well as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows the coupling of a rainfall-runoff model, a river flow model and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and is accessible via http://oss.deltares.nl.

  19. HERAFitter: Open source QCD fit project

    DOE PAGES

    Alekhin, S.; Behnke, O.; Belov, P.; ...

    2015-07-01

    HERAFitter is an open-source package that provides a framework for the determination of the parton distribution functions (PDFs) of the proton and for many different kinds of analyses in Quantum Chromodynamics (QCD). It encodes results from a wide range of experimental measurements in lepton-proton deep inelastic scattering and proton-proton (proton-antiproton) collisions at hadron colliders. These are complemented with a variety of theoretical options for calculating PDF-dependent cross section predictions corresponding to the measurements. The framework covers a large number of the existing methods and schemes used for PDF determination. The data and theoretical predictions are brought together through numerous methodological options for carrying out PDF fits and plotting tools to help visualise the results. While primarily based on the approach of collinear factorisation, HERAFitter also provides facilities for fits of dipole models and transverse-momentum dependent PDFs. The package can be used to study the impact of new precise measurements from hadron colliders. This paper describes the general structure of HERAFitter and its wide choice of options.

  20. National Geospatial Program

    USGS Publications Warehouse

    Carswell, William J.

    2011-01-01

    increases the efficiency of the Nation's geospatial community by improving communications about geospatial data, products, services, projects, needs, standards, and best practices. The NGP comprises seven major components (described below) that are managed as a unified set. For example, The National Map establishes data standards and identifies geographic areas where specific types of geospatial data need to be incorporated into The National Map. Partnership Network Liaisons work with Federal, State, local, and tribal partners to help acquire the data. Geospatial technical operations ensure the quality control, integration, and availability to the public of the data acquired. The Emergency Operations Office provides the requirements to The National Map and, during emergencies and natural disasters, provides rapid dissemination of information and data targeted to the needs of emergency responders. The National Atlas uses data from The National Map and other sources to make small-scale maps and multimedia articles about the maps.

  1. OOSTethys - Open Source Software for the Global Earth Observing Systems of Systems

    NASA Astrophysics Data System (ADS)

    Bridger, E.; Bermudez, L. E.; Maskey, M.; Rueda, C.; Babin, B. L.; Blair, R.

    2009-12-01

    An open source software project is much more than just picking the right license, hosting modular code and providing effective documentation. Success in advancing in an open collaborative way requires that the process match the expected code functionality to the developers' personal expertise and organizational needs, as well as having an enthusiastic and responsive core lead group. We will present the lessons learned from OOSTethys, which is a community of software developers and marine scientists who develop open source tools, in multiple languages, to integrate ocean observing systems into an Integrated Ocean Observing System (IOOS). OOSTethys' goal is to dramatically reduce the time it takes to install, adopt and update standards-compliant web services. OOSTethys has developed servers, clients and a registry. Open source PERL, PYTHON, JAVA and ASP tool kits and reference implementations are helping the marine community publish near real-time observation data in interoperable standard formats. In some cases, publishing an Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) from NetCDF files, a database or even CSV text files could take only minutes, depending on the skills of the developer. OOSTethys is also developing an OGC standard registry, Catalog Service for Web (CSW). This open source CSW registry was implemented to easily register and discover SOSs using ISO 19139 service metadata. A web interface layer over the CSW registry simplifies the registration process by harvesting metadata describing the observations and sensors from the “GetCapabilities” response of the SOS. OPENIOOS is the web client, developed in PERL, to visualize the sensors in the SOS services. While the number of OOSTethys software developers is small, currently about 10 around the world, the number of OOSTethys toolkit implementers is larger and growing, and the ease of use has played a large role in spreading the use of interoperable, standards-compliant web services widely
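
    At the protocol level, the standards-compliant access described above boils down to simple OGC requests; the hedged Python sketch below issues an SOS GetCapabilities call against a hypothetical service URL and lists the top-level sections of the response.

      import requests
      import xml.etree.ElementTree as ET

      # Hypothetical SOS endpoint; GetCapabilities is the standard OGC entry point
      params = {"service": "SOS", "request": "GetCapabilities"}
      resp = requests.get("https://example.org/sos", params=params, timeout=30)

      root = ET.fromstring(resp.content)
      print([child.tag for child in root])  # e.g. ServiceIdentification, Contents, ...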

  2. CHRONOS architecture: Experiences with an open-source services-oriented architecture for geoinformatics

    USGS Publications Warehouse

    Fils, D.; Cervato, C.; Reed, J.; Diver, P.; Tang, X.; Bohling, G.; Greer, D.

    2009-01-01

    CHRONOS's purpose is to transform Earth history research by seamlessly integrating stratigraphic databases and tools into a virtual on-line stratigraphic record. In this paper, we describe the various components of CHRONOS's distributed data system, including the encoding of semantic and descriptive data into a service-based architecture. We give examples of how we have integrated well-tested resources available from the open-source and geoinformatic communities, like the GeoSciML schema and the simple knowledge organization system (SKOS), into the services-oriented architecture to encode timescale and phylogenetic synonymy data. We also describe on-going efforts to use geospatially enhanced data syndication and informally including semantic information by embedding it directly into the XHTML Document Object Model (DOM). XHTML DOM allows machine-discoverable descriptive data such as licensing and citation information to be incorporated directly into data sets retrieved by users. © 2008 Elsevier Ltd. All rights reserved.

  3. A comparison of geospatially modeled fire behavior and fire management utility of three data sources in the southeastern United States

    Treesearch

    LaWen T. Hollingsworth; Laurie L. Kurth; Bernard R. Parresol; Roger D. Ottmar; Susan J. Prichard

    2012-01-01

    Landscape-scale fire behavior analyses are important to inform decisions on resource management projects that meet land management objectives and protect values from adverse consequences of fire. Deterministic and probabilistic geospatial fire behavior analyses are conducted with various modeling systems including FARSITE, FlamMap, FSPro, and Large Fire Simulation...

  4. An Open Source Extensible Smart Energy Framework

    SciTech Connect

    Rankin, Linda

    Aggregated distributed energy resources are the subject of much interest in the energy industry and are expected to play an important role in meeting our future energy needs by changing how we use, distribute and generate electricity. This energy future includes an increased amount of energy from renewable resources, load management techniques to improve resiliency and reliability, and distributed energy storage and generation capabilities that can be managed to meet the needs of the grid as well as individual customers. These energy assets are commonly referred to as Distributed Energy Resources (DER). DERs rely on a means to communicate information between an energy provider and multitudes of devices. Today DER control systems are typically vendor-specific, using custom hardware and software solutions. As a result, customers are locked into communication transport protocols, applications, tools, and data formats. Today’s systems are often difficult to extend to meet new application requirements, resulting in stranded assets when business requirements or energy management models evolve. By partnering with industry advisors and researchers, a DER research platform called the Smart Energy Framework (SEF) was developed. The hypothesis of this research was that an open source Internet of Things (IoT) framework could play a role in creating a commodity-based eco-system for DER assets that would reduce costs and provide interoperable products. SEF is based on the AllJoynTM IoT open source framework. The demonstration system incorporated DER assets, specifically batteries and smart water heaters. To verify the behavior of the distributed system, models of water heaters and batteries were also developed. An IoT interface for communicating between the assets and a control server was defined. This interface supports a series of “events” and telemetry reporting, similar to those defined by current smart grid communication standards. The results
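
    As a purely hypothetical sketch (not the AllJoyn-based SEF interface itself), the Python snippet below shows the kind of telemetry report a DER asset such as a smart water heater might serialize for a control server.

      import json
      import time
      from dataclasses import dataclass, asdict

      @dataclass
      class TelemetryReport:
          asset_id: str            # hypothetical identifier of the DER asset
          kind: str                # e.g. "water_heater" or "battery"
          timestamp: float         # seconds since the epoch
          power_kw: float          # instantaneous power draw (+) or injection (-)
          stored_energy_kwh: float # current stored thermal or electrical energy

      report = TelemetryReport("wh-042", "water_heater", time.time(), 4.5, 10.2)
      print(json.dumps(asdict(report)))  # serialized payload a control server could ingest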

  5. Web Mapping Architectures Based on Open Specifications and Free and Open Source Software in the Water Domain

    NASA Astrophysics Data System (ADS)

    Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.
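
    On the client side of such architectures, a map layer is typically fetched through a standard OGC WMS request; the hedged OWSLib sketch below uses a hypothetical server URL and layer name for a water-domain service.

      from owslib.wms import WebMapService

      # Hypothetical GeoServer WMS endpoint and layer name
      wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
      img = wms.getmap(layers=["hydro:watersheds"], srs="EPSG:4326",
                       bbox=(-110.0, 35.0, -100.0, 45.0), size=(600, 400),
                       format="image/png", transparent=True)

      with open("watersheds.png", "wb") as f:
          f.write(img.read())  # save the rendered map tile to disk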

  6. ENKI - An Open Source environmental modelling platfom

    NASA Astrophysics Data System (ADS)

    Kolberg, S.; Bruland, O.

    2012-04-01

    The ENKI software framework for implementing spatio-temporal models is now released under the LGPL license. Originally developed for the evaluation and comparison of distributed hydrological model compositions, ENKI can be used for simulating any time-evolving process over a spatial domain. The core approach is to connect a set of user-specified subroutines into a complete simulation model, and to provide all administrative services needed to calibrate and run that model. This includes functionality for geographical region setup, all file I/O, calibration and uncertainty estimation, etc. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines and various model compositions in a fixed framework. The open-source license and modular design of ENKI will also facilitate the rapid dissemination of new methods to institutions engaged in operational water resource management. ENKI uses a plug-in structure to invoke subroutines that are separately compiled as dynamic-link libraries (DLLs). The source code of an ENKI routine is highly compact, with a narrow framework-routine interface allowing the main program to recognise the number, types, and names of the routine's variables. The framework then exposes these variables to the user within the proper context, ensuring that distributed maps coincide spatially, time series exist for input variables, states are initialised, GIS data sets exist for static map data, manually or automatically calibrated values exist for parameters, etc. By using function calls and in-memory data structures to invoke routines and facilitate information flow, ENKI provides good performance. For a typical distributed hydrological model setup in a spatial domain of 25000 grid cells, 3-4 simulated time steps per second should be expected. Future adaptation to parallel processing may further increase this speed. New modifications to ENKI include a full separation of API and user interface
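
    ENKI's plug-in routines are C++ DLLs, but the wiring idea generalizes; the hypothetical Python sketch below shows a routine declaring its input and output variables and a tiny framework checking and running it, as an illustration of the plug-in pattern rather than ENKI's actual interface.

      class Routine:
          """Base class: each plug-in declares which state variables it reads and writes."""
          inputs: tuple = ()
          outputs: tuple = ()
          def run(self, state: dict) -> None:
              raise NotImplementedError

      class DegreeDayMelt(Routine):
          inputs = ("air_temp", "swe")
          outputs = ("melt", "swe")
          ddf = 3.0  # hypothetical degree-day factor, mm per degree C per step
          def run(self, state):
              melt = min(max(0.0, state["air_temp"]) * self.ddf, state["swe"])
              state["melt"] = melt
              state["swe"] -= melt

      def run_model(routines, state, steps):
          """Check that declared inputs exist, then run all routines for each time step."""
          for _ in range(steps):
              for r in routines:
                  missing = [v for v in r.inputs if v not in state]
                  if missing:
                      raise KeyError(f"unresolved inputs: {missing}")
                  r.run(state)
          return state

      print(run_model([DegreeDayMelt()], {"air_temp": 2.5, "swe": 120.0}, steps=3))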

  7. The 2015 Bioinformatics Open Source Conference (BOSC 2015).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica

    2016-02-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.

  8. The case for open-source software in drug discovery.

    PubMed

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  9. Efficient Open Source Lidar for Desktop Users

    NASA Astrophysics Data System (ADS)

    Flanagan, Jacob P.

    Lidar --- Light Detection and Ranging --- is a remote sensing technology that utilizes a device similar to a rangefinder to determine the distance to a target. A laser pulse is shot at an object and the time it takes for the pulse to return is measured. The distance to the object is easily calculated using the speed of light. For lidar, this laser is moved (primarily in a rotational movement, usually accompanied by a translational movement) and the distances to objects are recorded several thousand times per second. From this, a 3-dimensional structure can be derived in the form of a point cloud. A point cloud is a collection of 3-dimensional points with at least an x, a y and a z attribute. These 3 attributes represent the position of a single point in 3-dimensional space. Other attributes can be associated with the points, such as the intensity of the return pulse, the color of the target or even the time the point was recorded. Another very useful, post-processed attribute is point classification, where a point is associated with the type of object the point represents (e.g., ground). Lidar has gained popularity, and advancements in the technology have made its collection easier and cheaper, creating larger and denser datasets. The need to handle these data more efficiently has become a necessity; processing, visualizing or even simply loading lidar can be computationally intensive due to its very large size. Standard remote sensing and geographical information system (GIS) software (ENVI, ArcGIS, etc.) was not originally built for optimized point cloud processing; its implementation is an afterthought and therefore inefficient. Newer, more optimized software for point cloud processing (QTModeler, TopoDOT, etc.) usually lacks more advanced processing tools, requires higher-end computers and is very costly. Existing open source lidar approaches the loading and processing of lidar in an iterative fashion that requires
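
    For desktop users, a point cloud like the one described above can be read with the open source laspy library; the sketch below assumes laspy 2.x and a hypothetical LAS file, and uses the ASPRS convention that class 2 means ground.

      import numpy as np
      import laspy

      las = laspy.read("survey.las")          # hypothetical LAS point cloud
      print(las.header.point_count, "points")

      # Classification 2 denotes ground in the ASPRS LAS specification
      z = np.asarray(las.z)
      ground = np.asarray(las.classification) == 2
      print("ground points:", int(ground.sum()),
            "mean ground elevation:", float(z[ground].mean()))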

  10. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computing environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.
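
    A hedged sketch of the data storage/management side is given below: a spatial query against a PostGIS-enabled PostgreSQL database from Python with psycopg2. The connection settings and the stations table with its geom column are hypothetical.

      import psycopg2

      conn = psycopg2.connect("dbname=water user=gis password=secret host=localhost")
      with conn, conn.cursor() as cur:
          # Stations within 5 km of a point given as lon/lat, returned as GeoJSON geometries
          cur.execute(
              """
              SELECT name, ST_AsGeoJSON(geom)
              FROM stations
              WHERE ST_DWithin(geom::geography,
                               ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography, %s)
              """,
              (4.35, 45.43, 5000),
          )
          for name, geojson in cur.fetchall():
              print(name, geojson)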

  11. Open-Source Data and the Study of Homicide.

    PubMed

    Parkin, William S; Gruenewald, Jeff

    2015-07-20

    To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to be used as a valid and reliable data source in testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open-sources could recreate the population of homicides and variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data. Also, for every variable measured, the open-sources captured as much, or more, of the information presented in the official data. In addition, variables not available in official data, but potentially useful for testing theory, were identified in open-sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.

  12. State-of-the-practice and lessons learned on implementing open data and open source policies.

    DOT National Transportation Integrated Search

    2012-05-01

    This report describes the current government, academic, and private sector practices associated with open data and open source application development. These practices are identified; and the potential uses with the ITS Programs Data Capture and M...

  13. Soil Monitor: an open source web application for real-time soil sealing monitoring and assessment

    NASA Astrophysics Data System (ADS)

    Langella, Giuliano; Basile, Angelo; Giannecchini, Simone; Iamarino, Michela; Munafò, Michele; Terribile, Fabio

    2016-04-01

    Soil sealing is one of the most important causes of land degradation and desertification. In Europe, the area of soil covered by impermeable materials has increased by about 80% since the Second World War, while the population has grown by only one third. There is increasing concern at high political levels about the need to attenuate imperviousness itself and its effects on soil functions. The European Commission promulgated a roadmap (COM(2011) 571) according to which net land take should be zero by 2050. Furthermore, the European Commission also published a report in 2011 providing best practices and guidelines for limiting soil sealing and imperviousness. In this scenario, we developed an open-source-based Soil Sealing Geospatial Cyber Infrastructure (SS-GCI) named "Soil Monitor". This tool merges a webGIS with parallel geospatial computation in a fast and dynamic fashion in order to provide real-time assessments of soil sealing at high spatial resolution (20 meters and below) over the whole of Italy. Common open source webGIS packages, such as GeoServer and MapStore, are used to implement both the data management and visualization infrastructures. The high-speed geospatial computation is ensured by GPU parallelism using the CUDA (Compute Unified Device Architecture) framework by NVIDIA®. This kind of parallelism required writing, from scratch, all the code needed to carry out the geospatial computation behind the soil sealing toolbox. The combination of GPU computing with webGIS infrastructures is relatively novel and required particular attention at the Java-CUDA programming interface. As a result, Soil Monitor is responsive: it can perform highly time-consuming calculations (querying, for instance, an entire Italian administrative region as the area of interest) in less than one minute. The web application is embedded in a web browser and nothing must be installed before using it. Potentially everybody can use it, but the main targets are the

  14. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web information model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, as well as the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and especially query on-demand data in the virtual community and get it back through the data-related services, which provide functions such as subsetting, reformatting, reprojection etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also allows researchers to focus on science, and not on issues with computing ability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data producing.

  15. An Institutional Community-Driven effort to Curate and Preserve Geospatial Data using GeoBlacklight

    NASA Astrophysics Data System (ADS)

    Petters, J.; Coleman, S.; Andrea, O.

    2016-12-01

    A variety of geospatial data is produced or collected by both academic researchers and non-academic groups in the Virginia Tech community. In an effort to preserve, curate and make this geospatial data discoverable, the University Libraries have been building a local implementation of GeoBlacklight, a multi-institutional open-source collaborative project to improve the discoverability and sharing of geospatial data. We will discuss the local implementation of GeoBlacklight at Virginia Tech, focusing on the efforts necessary to make it a sustainable resource for the institution and local community going forward. This includes technical challenges such as the development of uniform workflows for geospatial data produced within and outside the course of research, but organizational and economic barriers must be overcome as well. In spearheading this GeoBlacklight effort the Libraries have partnered with University Facilities and University IT. The IT group manages the storage and backup of geospatial data, allowing our group to focus on geospatial data collection and curation. Both IT and University Facilities are in possession of localized geospatial data of interest to Virginia Tech researchers that all parties agreed should be made discoverable and accessible. The interest and involvement of these and other university stakeholders is key to establishing the sustainability of the infrastructure and the capabilities it can provide to the Virginia Tech community and beyond.

  16. Crawling The Web for Libre: Selecting, Integrating, Extending and Releasing Open Source Software

    NASA Astrophysics Data System (ADS)

    Truslove, I.; Duerr, R. E.; Wilcox, H.; Savoie, M.; Lopez, L.; Brandt, M.

    2012-12-01

    Libre is a project developed by the National Snow and Ice Data Center (NSIDC). Libre is devoted to liberating science data from its traditional constraints of publication, location, and findability. Libre embraces and builds on the notion of making knowledge freely available, and both Creative Commons licensed content and Open Source Software are crucial building blocks for, as well as required deliverable outcomes of the project. One important aspect of the Libre project is to discover cryospheric data published on the internet without prior knowledge of the location or even existence of that data. Inspired by well-known search engines and their underlying web crawling technologies, Libre has explored tools and technologies required to build a search engine tailored to allow users to easily discover geospatial data related to the polar regions. After careful consideration, the Libre team decided to base its web crawling work on the Apache Nutch project (http://nutch.apache.org). Nutch is "an open source web-search software project" written in Java, with good documentation, a significant user base, and an active development community. Nutch was installed and configured to search for the types of data of interest, and the team created plugins to customize the default Nutch behavior to better find and categorize these data feeds. This presentation recounts the Libre team's experiences selecting, using, and extending Nutch, and working with the Nutch user and developer community. We will outline the technical and organizational challenges faced in order to release the project's software as Open Source, and detail the steps actually taken. We distill these experiences into a set of heuristics and recommendations for using, contributing to, and releasing Open Source Software.

  17. Open Source Library Management Systems: A Multidimensional Evaluation

    ERIC Educational Resources Information Center

    Balnaves, Edmund

    2008-01-01

    Open source library management systems have improved steadily in the last five years. They now present a credible option for small to medium libraries and library networks. An approach to their evaluation is proposed that takes account of three additional dimensions that only open source can offer: the developer and support community, the source…

  18. Open-Source Unionism: New Workers, New Strategies

    ERIC Educational Resources Information Center

    Schmid, Julie M.

    2004-01-01

    In "Open-Source Unionism: Beyond Exclusive Collective Bargaining," published in fall 2002 in the journal Working USA, labor scholars Richard B. Freeman and Joel Rogers use the term "open-source unionism" to describe a form of unionization that uses Web technology to organize in hard-to-unionize workplaces. Rather than depend on the traditional…

  19. Migrations of the Mind: The Emergence of Open Source Education

    ERIC Educational Resources Information Center

    Glassman, Michael; Bartholomew, Mitchell; Jones, Travis

    2011-01-01

    The authors describe an Open Source approach to education. They define Open Source Education (OSE) as a teaching and learning framework where the use and presentation of information is non-hierarchical, malleable, and subject to the needs and contributions of students as they become "co-owners" of the course. The course transforms itself into an…

  20. Integrating an Automatic Judge into an Open Source LMS

    ERIC Educational Resources Information Center

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. Although it was originally designed for programming competitions, Mooshak has also…

  1. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    ERIC Educational Resources Information Center

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  2. Getting Open Source Software into Schools: Strategies and Challenges

    ERIC Educational Resources Information Center

    Hepburn, Gary; Buley, Jan

    2006-01-01

    In this article Gary Hepburn and Jan Buley outline different approaches to implementing open source software (OSS) in schools; they also address the challenges that open source advocates should anticipate as they try to convince educational leaders to adopt OSS. With regard to OSS implementation, they note that schools have a flexible range of…

  3. Open Source as Appropriate Technology for Global Education

    ERIC Educational Resources Information Center

    Carmichael, Patrick; Honour, Leslie

    2002-01-01

    Economic arguments for the adoption of "open source" software in business have been widely discussed. In this paper we draw on personal experience in the UK, South Africa and Southeast Asia to forward compelling reasons why open source software should be considered as an appropriate and affordable alternative to the currently prevailing…

  4. Open Source Communities in Technical Writing: Local Exigence, Global Extensibility

    ERIC Educational Resources Information Center

    Conner, Trey; Gresham, Morgan; McCracken, Jill

    2011-01-01

    By offering open-source software (OSS)-based networks as an affordable technology alternative, we partnered with a nonprofit community organization. In this article, we narrate the client-based experiences of this partnership, highlighting the ways in which OSS and open-source culture (OSC) transformed our students' and our own expectations of…

  5. Open Source Initiative Powers Real-Time Data Streams

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.

  6. The open-source movement: an introduction for forestry professionals

    Treesearch

    Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove

    2005-01-01

    In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....

  7. Open Source, Meet "User-Generated Science"

    ERIC Educational Resources Information Center

    Huwe, Terence K.

    2009-01-01

    This article discusses Research Blogging, a community-run nonprofit organization that is promoting a suite of blogging software to scholars. Research Blogging itself does two things. First, it extends an invitation to a community, and it is open to anyone. Second, it requires its users to follow guidelines. The combination of rigorous guidelines…

  8. Learning from hackers: open-source clinical trials.

    PubMed

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-02

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  9. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of openness and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this analysis, we describe the specific application and deployment of OpenStack in a university computer room. The experimental results show that OpenStack can be used to deploy a cloud for the university computer room efficiently and conveniently, and that its performance is stable and its functional value is good.
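
    For completeness, a hedged sketch of scripting such a deployment from Python with the openstacksdk package is shown below; the cloud name refers to a hypothetical entry in a local clouds.yaml configuration.

      import openstack

      # Connect using a hypothetical named cloud from clouds.yaml
      conn = openstack.connect(cloud="lab-cloud")

      # List compute instances and images available in the lab cloud
      for server in conn.compute.servers():
          print(server.name, server.status)
      for image in conn.image.images():
          print(image.name)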

  10. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as open source and through Amazon AWS Marketplace VM images. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  11. The 2015 Bioinformatics Open Source Conference (BOSC 2015)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J. A.; Lapp, Hilmar

    2016-01-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included “Data Science;” “Standards and Interoperability;” “Open Science and Reproducibility;” “Translational Bioinformatics;” “Visualization;” and “Bioinformatics Open Source Project Updates”. In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled “Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community,” that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653

  12. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    NASA Astrophysics Data System (ADS)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, William; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter, commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2m posting elevation model of the Arctic and an 8m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible due to three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management challenges the team needed to solve in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  13. Implementation of Open-Source Web Mapping Technologies to Support Monitoring of Governmental Schemes

    NASA Astrophysics Data System (ADS)

    Pulsani, B. R.

    2015-10-01

    Several schemes are undertaken by the government to uplift the social and economic condition of people. These schemes are currently monitored through information technology in which the involvement of Geographic Information Systems (GIS) is lacking. To demonstrate the benefits of thematic mapping as a tool for assisting officials in making decisions, a web mapping application was built for three government programs: the Mother and Child Tracking System (MCTS), Telangana State Housing Corporation Limited (TSHCL) and Ground Water Quality Mapping (GWQM). The three applications depict the distribution of various parameters thematically and helped in identifying the areas with higher and weaker distributions. Based on the three applications, the study finds that many government schemes share a nature suited to thematic mapping, and concludes that this kind of approach can be implemented for other schemes as well. These applications have been developed using the SharpMap C# library, a free and open source mapping library for developing geospatial applications. The study highlights the cost benefits of SharpMap, brings out the advantages of this library over proprietary vendors, and further discusses its advantages over other open source libraries as well.

  14. An Open-Source Approach for Catchment's Physiographic Characterization

    NASA Astrophysics Data System (ADS)

    Di Leo, M.; Di Stefano, M.

    2013-12-01

    A water catchment's hydrologic response is intimately linked to its morphological shape, which is a signature on the landscape of the particular climate conditions that generated the hydrographic basin over time. Furthermore, geomorphologic structures influence hydrologic regimes and land cover (vegetation). For these reasons, a basin's characterization is a fundamental element in hydrological studies. Physiographic descriptors have been extracted manually for a long time, but currently Geographic Information System (GIS) tools ease this task by offering hydrologists a powerful instrument to save time and improve the accuracy of results. Here we present a program combining the flexibility of the Python programming language with the reliability of GRASS GIS, which automatically performs the catchment's physiographic characterization. GRASS (Geographic Resource Analysis Support System) is a Free and Open Source GIS that today can look back on 30 years of successful development in geospatial data management and analysis, image processing, graphics and map production, spatial modeling and visualization. The recent development of new hydrologic tools, coupled with the tremendous boost in the existing flow routing algorithms, reduced the computational time and made GRASS a complete toolset for hydrological analysis even for large datasets. The tool presented here is a module called r.basin, following GRASS' traditional nomenclature, where the "r" stands for "raster"; it is available for GRASS version 6.x and more recently for GRASS 7. As input it uses a Digital Elevation Model and the coordinates of the outlet, and, powered by the recently developed r.stream.* hydrological tools, it performs the flow calculation, delimits the basin's boundaries and extracts the drainage network, returning the flow direction and accumulation, the distance to outlet and the hillslope length maps. Based on those maps, it calculates hydrologically meaningful shape factors and
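
    As a hedged illustration of how such a module can be driven from Python, the sketch below calls r.basin through the GRASS scripting interface. It assumes a GRASS session is already running with the r.basin addon installed; the map names, coordinates, and option names are illustrative and should be checked against the addon's manual.

    ```python
    # Sketch only: assumes an initialized GRASS GIS session and the r.basin
    # addon; map names, coordinates, and option names below are assumptions.
    import grass.script as gs

    # Derive flow direction/accumulation and basin descriptors from a DEM
    # and an outlet location (hypothetical map name and coordinates).
    gs.run_command(
        "r.basin",
        map="elevation_dem",           # input Digital Elevation Model
        prefix="basin01",              # prefix for the output maps
        coordinates=(635000, 221000),  # easting/northing of the outlet
    )

    # Inspect one of the derived raster maps (hypothetical output name).
    print(gs.read_command("r.info", map="basin01_network", flags="g"))
    ```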

  15. Open Source Software Compliance within the Government

    DTIC Science & Technology

    2016-12-01

    The exception to this rule is the various General Public Licenses (GPLs), which consider all distributions to contractors as outside distribution...is developed by a contractor at the government’s expense or for the government’s exclusive use. The third condition that must be met is that ERDC...executables and source code can only be offered by an authorized delivering entity to an authorized receiving entity. This means that contractors, with

  16. OpenSesame: an open-source, graphical experiment builder for the social sciences.

    PubMed

    Mathôt, Sebastiaan; Schreij, Daniel; Theeuwes, Jan

    2012-06-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality, such as support for eyetrackers, input devices, and video playback, is available through plug-ins. OpenSesame can be used in combination with existing software for creating experiments.

  17. Embracing Open Source for NASA's Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin

    2017-01-01

    The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when resulting from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate and analyze them. This talk focuses on the challenges of open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies running the gamut of tracking issues, properly documenting build processes, engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions and our contributions back to the community. Finally, we will introduce the most recent OSS contributions from the NASA Earth Science program and promote these projects for wider community review and adoption.

  18. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577

  19. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  20. The Future of Geospatial Standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations, with more than 6,000 people registered at the OGC communication platform, drives the development of standards that are freely available for anyone to use and that improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, adapting to the particular needs of each community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or containers for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP), wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium, including standards and testbeds.

  1. Open Source Radiation Hardened by Design Technology

    NASA Technical Reports Server (NTRS)

    Shuler, Robert

    2016-01-01

    The proposed technology allows use of the latest microcircuit technology with lowest power and fastest speed, with minimal delay and engineering costs, through new Radiation Hardened by Design (RHBD) techniques that do not require extensive process characterization, technique evaluation and re-design at each Moore's Law generation. The separation of critical node groups is explicitly parameterized so it can be increased as microcircuit technologies shrink. The technology will be open access to radiation tolerant circuit vendors. INNOVATION: This technology would enhance computation intensive applications such as autonomy, robotics, advanced sensor and tracking processes, as well as low power applications such as wireless sensor networks. OUTCOME / RESULTS: 1) Simulation analysis indicates feasibility. 2) Compact voting latch 65 nanometer test chip designed and submitted for fabrication - 7/2016. INFUSION FOR SPACE / EARTH: This technology may be used in any digital integrated circuit in which a high level of resistance to Single Event Upsets is desired, and has the greatest benefit outside low earth orbit where cosmic rays are numerous.

  2. Development of an Open Source, Air-Deployable Weather Station

    NASA Astrophysics Data System (ADS)

    Krejci, A.; Lopez Alcala, J. M.; Nelke, M.; Wagner, J.; Udell, C.; Higgins, C. W.; Selker, J. S.

    2017-12-01

    We created a packaged weather station intended to be deployed in the air on tethered systems. The device incorporates lightweight sensors and parts and runs for up to 24 hours off of lithium polymer batteries, allowing the entire package to be supported by a thin fiber. As the fiber does not provide a stable platform, attitude data (pitch and roll) are determined using an embedded inertial motion unit in addition to the typical weather parameters (e.g. temperature, pressure, humidity, wind speed, and wind direction). All designs are open source, including electronics, CAD drawings, and descriptions of assembly, and can be found on the OPEnS lab website at http://www.open-sensing.org/lowcost-weather-station/. The Openly Published Environmental Sensing Lab (OPEnS: Open-Sensing.org) expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. New OPEnS labs are now being established in India, France, Switzerland, the Netherlands, and Ghana.

  3. Building a Snow Data Management System using Open Source Software (and IDL)

    NASA Astrophysics Data System (ADS)

    Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.

    2012-12-01

    At NASA's Jet Propulsion Laboratory free and open source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main points: the design of the Snow Data System (illustrating how the collection of sub-systems is combined to create a complete data processing pipeline); the challenges of moving from a single algorithm on a laptop to running hundreds of parallel algorithms on a cluster of servers (lessons learned), including code changes, software license related challenges, and storage requirements; system evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps); and a road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory Mission). Software in use and their software licenses: IDL - used for pre- and post-processing of data, licensed under a proprietary software license held by Exelis. Apache OODT - used for data management and workflow processing, licensed under the Apache License Version 2. GDAL - geospatial data processing library currently used for data re-projection, licensed under the X/MIT license. GeoServer - WMS server, licensed under the General Public License Version 2.0. Leaflet.js - JavaScript web mapping library, licensed under the Berkeley Software Distribution License. Python - glue code and miscellaneous data processing support, licensed under the Python Software Foundation License. Perl - script wrapper for running the SCAG algorithm, licensed under the General Public License Version 3. PHP - front-end web application programming. Licensed under the PHP License Version
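
    As a small, hedged example of the re-projection step GDAL handles in such a pipeline, the sketch below uses GDAL's Python bindings; the file names, target projection, and resampling choice are assumptions, not the Snow Data System's actual configuration.

    ```python
    # Sketch of a data re-projection step with GDAL's Python bindings.
    # File names and the target CRS are illustrative assumptions.
    from osgeo import gdal

    gdal.UseExceptions()

    # Re-project a snow-cover grid to a polar stereographic CRS (EPSG:3413
    # is an arbitrary choice for illustration) and write a GeoTIFF.
    gdal.Warp(
        "snow_cover_reprojected.tif",   # destination file
        "snow_cover_source.tif",        # source file
        dstSRS="EPSG:3413",
        resampleAlg="bilinear",
    )
    ```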

  4. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth and high latency communications links to experiment how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. Tested open source IPsec software did not meet all the requirements. Software changes were suggested to meet requirements.

  5. Open source Modeling and optimization tools for Planning

    SciTech Connect

    Peles, S.

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  6. Open Source Solutions for Libraries: ABCD vs Koha

    ERIC Educational Resources Information Center

    Macan, Bojan; Fernandez, Gladys Vanesa; Stojanovski, Jadranka

    2013-01-01

    Purpose: The purpose of this study is to present an overview of the two open source (OS) integrated library systems (ILS)--Koha and ABCD (ISIS family), to compare their "next-generation library catalog" functionalities, and to give comparison of other important features available through ILS modules. Design/methodology/approach: Two open source…

  7. Modular Open-Source Software for Item Factor Analysis

    ERIC Educational Resources Information Center

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  8. Using Open Source Software in Visual Simulation Development

    DTIC Science & Technology

    2005-09-01

    increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a

  9. Open-Source 3D-Printable Optics Equipment

    PubMed Central

    Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated for the control of optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience than previous proprietary methods to participate in optical experimentation, both as research and teaching platforms. PMID:23544104

  10. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper will discuss how OSS tools are used and the benefits they offer, and how, after the successful implementation of these tools, the library took the initiative of implementing an institutional repository using the DSpace open source software.

  11. Using R to implement spatial analysis in open source environment

    NASA Astrophysics Data System (ADS)

    Shao, Yixi; Chen, Dong; Zhao, Bo

    2007-06-01

    R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics which provides a wide variety of statistical and graphical techniques, and is highly extensible. In the Open Source environment it plays an important role in spatial analysis. Implementing spatial analysis in the Open Source environment, which we call Open Source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the Open Source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an apt illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS) supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the Open Source GIS environment to evaluate the spatial correlation of land price and estimate it by Kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other Open Source software cooperate with each other in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an Open Source GIS environment. In the end, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the extensive packages, or design normative packages, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.

  12. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated for the control of optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience than previous proprietary methods to participate in optical experimentation, both as research and teaching platforms.

  13. Open source electronic health records and chronic disease management.

    PubMed

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-02-01

    To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status on homeless patients. The ability to modify the open-source EHR to adapt to the CHC environment and leverage the ecosystem of providers and users to assist in this process provided significant advantages in chronic care management. Improvements in diabetes management, controlled hypertension and increases in tuberculosis vaccinations were assisted through the use of these open source systems. The flexibility and adaptability of open source EHR demonstrated its utility and viability in the provision of necessary and needed chronic disease care among populations served by CHC.

  14. Open source electronic health records and chronic disease management

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective: To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). Methods and Materials: The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Results: Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status on homeless patients. Discussion: The ability to modify the open-source EHR to adapt to the CHC environment and leverage the ecosystem of providers and users to assist in this process provided significant advantages in chronic care management. Improvements in diabetes management, controlled hypertension and increases in tuberculosis vaccinations were assisted through the use of these open source systems. Conclusions: The flexibility and adaptability of open source EHR demonstrated its utility and viability in the provision of necessary and needed chronic disease care among populations served by CHC. PMID:23813566

  15. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.
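
    A hedged sketch of the kind of spatial query such a PostGIS-backed stack answers is given below; the connection details, table, and column names are hypothetical and are not the actual OBIS schema.

    ```python
    # Sketch of a bounding-box occurrence query against a PostGIS database.
    # Connection details, table, and column names are hypothetical.
    import psycopg2

    conn = psycopg2.connect(dbname="biogeography", user="reader", host="localhost")
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT species_name, COUNT(*) AS n_records
            FROM occurrences
            WHERE ST_Intersects(
                geom,
                ST_MakeEnvelope(%s, %s, %s, %s, 4326)  -- lon/lat bounding box
            )
            GROUP BY species_name
            ORDER BY n_records DESC;
            """,
            (-75.0, 30.0, -60.0, 45.0),
        )
        for species, n in cur.fetchall():
            print(species, n)
    ```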

  16. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type to quantify and characterize geospatial processes. Visualization of these data, due to overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made interactivity and visualization capabilities that once existed only in desktop software ubiquitously available. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing the reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.

  17. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models that is capable of all these tasks already exists: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach to interact with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough so that models can interact even if the model is coded in a different language, represent
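
    The error-minimizing update at the heart of many of these assimilation algorithms can be sketched in a few lines. The following is a generic, textbook-style ensemble Kalman filter analysis step in numpy; it illustrates the principle only, is not code from OpenDA or OpenMI, and all array shapes and values are invented.

    ```python
    # Generic sketch of a stochastic ensemble Kalman filter analysis step.
    # This is a textbook formulation, not framework code; values are toy data.
    import numpy as np

    def enkf_update(ensemble, obs, obs_operator, obs_error_var):
        """ensemble: (n_state, n_members); obs: (n_obs,); obs_operator: (n_obs, n_state)."""
        n_members = ensemble.shape[1]
        x_mean = ensemble.mean(axis=1, keepdims=True)
        X = ensemble - x_mean                        # state anomalies
        Y = obs_operator @ X                         # observation-space anomalies
        P_yy = Y @ Y.T / (n_members - 1) + np.diag(obs_error_var)
        P_xy = X @ Y.T / (n_members - 1)
        K = P_xy @ np.linalg.inv(P_yy)               # Kalman gain
        # Perturb the observations for each member (stochastic EnKF variant).
        perturbed = obs[:, None] + np.random.normal(
            0.0, np.sqrt(obs_error_var)[:, None], (obs.size, n_members)
        )
        innovations = perturbed - obs_operator @ ensemble
        return ensemble + K @ innovations

    # Toy usage: 3 state variables, 2 observations, 20 ensemble members.
    ensemble = np.random.rand(3, 20)
    H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    updated = enkf_update(ensemble, np.array([0.5, 0.2]), H, np.array([0.01, 0.01]))
    ```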

  18. Streamlining geospatial metadata in the Semantic Web

    NASA Astrophysics Data System (ADS)

    Fugazza, Cristiano; Pepe, Monica; Oggioni, Alessandro; Tagliolato, Paolo; Carrara, Paola

    2016-04-01

    In the geospatial realm, data annotation and discovery rely on a number of ad-hoc formats and protocols. These have been created to enable domain-specific use cases for which generalized search is not feasible. Metadata are at the heart of the discovery process; nevertheless they are often neglected or encoded in formats that either are not aimed at efficient retrieval of resources or are plainly outdated. In particular, the quantum leap represented by the Linked Open Data (LOD) movement has so far not induced a consistent, interlinked baseline in the geospatial domain. In a nutshell, datasets, the scientific literature related to them, and ultimately the researchers behind these products are only loosely connected; the corresponding metadata are intelligible only to humans and duplicated on different systems, seldom consistently. Instead, our workflow for metadata management envisages i) editing via customizable web-based forms, ii) encoding of records in any XML application profile, iii) translation into RDF (involving the semantic lift of metadata records), and finally iv) storage of the metadata as RDF and back-translation into the original XML format with added semantics-aware features. Phase iii) hinges on relating resource metadata to RDF data structures that represent keywords from code lists and controlled vocabularies, toponyms, researchers, institutes, and virtually any description one can retrieve (or directly publish) in the LOD Cloud. In the context of a distributed Spatial Data Infrastructure (SDI) built on free and open-source software, we detail phases iii) and iv) of our workflow for the semantics-aware management of geospatial metadata.
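
    A minimal sketch of the semantic lift in phase iii), using the rdflib package, might look as follows; the record fields, namespaces, and vocabulary URI are illustrative assumptions, not the workflow's actual application profile.

    ```python
    # Sketch of lifting a minimal metadata record into RDF with rdflib.
    # The record fields and the vocabulary URI are illustrative assumptions.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import DCTERMS, RDF

    DCAT = Namespace("http://www.w3.org/ns/dcat#")

    record = {
        "id": "http://example.org/dataset/lake-temperature",
        "title": "Lake surface temperature time series",
        "theme_uri": "http://vocab.example.org/keyword/limnology",  # controlled vocabulary term
    }

    g = Graph()
    dataset = URIRef(record["id"])
    g.add((dataset, RDF.type, DCAT.Dataset))
    g.add((dataset, DCTERMS.title, Literal(record["title"], lang="en")))
    # Linking to a vocabulary URI (rather than free text) is the "semantic lift"
    # that connects the record to resources in the LOD Cloud.
    g.add((dataset, DCAT.theme, URIRef(record["theme_uri"])))

    print(g.serialize(format="turtle"))
    ```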

  19. Open Source and ROI: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…

  20. "I CAMMINI DELLA REGINA" - Open Source based tools for preserving and culturally exploring historical traffic routes.

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Colombo, Massimo; Antonovic, Milan; Cardoso, Mirko; Delucchi, Andrea; Gianocca, Giancarlo; Brovelli, Maria Antonia

    2015-04-01

    "I CAMMINI DELLA REGINA" (The Via Regina Paths) is an Interreg project funded within the transnational cooperation program between Italy and Switzerland 2007-2013. The aim of this project is the preservation and valorization of the cultural heritage linked to the walking historically paths crossing, connecting and serving the local territories. With the approach of leveraging the already existing tools, which generally consist of technical descriptions of the paths, the project uses the open source geospatial technologies to deploy innovative solutions which can fill some of the gaps in historical-cultural tourism offers. The Swiss part, and particularly the IST-SUPSI team, has been focusing its activities in the realization of two innovative solutions: a mobile application for the survey of historical paths and a storytelling system for immersive cultural exploration of the historical paths. The former, based on Android, allows to apply in a revised manner a consolidated and already successfully used methodology of survey focused on the conservation of the historical paths (Inventory of historical traffic routes in Switzerland). Up to now operators could rely only on hand work based on a combination of notes, pictures and GPS devices synthesized in manually drawn maps; this procedure is error prone and shows many problems both in data updating and extracting for elaborations. Thus it has been created an easy to use interface which allows to map, according to a newly developed spatially enabled data model, paths, morphological elements, and multimedia notes. When connected to the internet the application can send the data to a web service which, after applying linear referencing and further elaborating the data, makes them available using open standards. The storytelling system has been designed to provide users with cultural insights embedded in a multimedial and immersive geospatial portal. Whether the tourist is exploring physically or virtually the desired

  1. EPA Geospatial Applications

    EPA Pesticide Factsheets

    EPA has developed many applications that allow users to explore and interact with geospatial data. This page highlights some of the flagship geospatial web applications but these represent only a fraction of the total.

  2. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    NASA Astrophysics Data System (ADS)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional programming and metaprogramming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arcpy scales linearly in a distributed environment. Arc4nix is open-source software.
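
    The decoupled pattern described above (a local client that builds a geoprocessing call and ships it to a server hosting the actual GIS runtime) can be sketched generically as follows; this is an illustration of the pattern only, not the arc4nix API, and the tool and argument names are invented.

    ```python
    # Generic sketch of a decoupled client/server geoprocessing call.
    # This illustrates the pattern only; it is not the arc4nix API.
    import json
    from dataclasses import dataclass, field


    @dataclass
    class RemoteGeoprocessingCall:
        """A geoprocessing request built locally and executed on a remote server."""
        tool: str
        arguments: dict = field(default_factory=dict)

        def to_payload(self) -> str:
            # Serialize the call so it can be shipped to the server that hosts
            # the actual GIS runtime (e.g. over HTTP or a job queue).
            return json.dumps({"tool": self.tool, "arguments": self.arguments})


    def submit(payload: str) -> None:
        # Placeholder for the transport layer (HTTP request, message queue, ...).
        print("submitting:", payload)


    # Client-side code stays close to a familiar scripting style, while the
    # heavy geospatial computation runs wherever the server lives.
    call = RemoteGeoprocessingCall(
        tool="ZonalStatistics",
        arguments={"in_raster": "precip.tif", "zones": "watersheds.shp"},
    )
    submit(call.to_payload())
    ```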

  3. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage capabilities of end-to-end application service and virtualized computing framework in HUBzero. Funded by NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  4. A Framework for the Systematic Collection of Open Source Intelligence

    SciTech Connect

    Pouchard, Line Catherine; Trien, Joseph P; Dobson, Jonathan D

    2009-01-01

    Following legislative directions, the Intelligence Community has been mandated to make greater use of Open Source Intelligence (OSINT). Efforts are underway to increase the use of OSINT but there are many obstacles. One of these obstacles is the lack of tools helping to manage the volume of available data and ascertain its credibility. We propose a unique system for selecting, collecting and storing Open Source data from the Web and the Open Source Center. Some data management tasks are automated, document source is retained, and metadata containing geographical coordinates are added to the documents. Analysts are thus empowered to search, view, store, and analyze Web data within a single tool. We present ORCAT I and ORCAT II, two implementations of the system.

  5. 40 CFR Table 3 to Subpart Wwww of... - Organic HAP Emissions Limits for Existing Open Molding Sources, New Open Molding Sources Emitting...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and... CATEGORIES National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and...

  6. NASA's Geospatial Interoperability Office(GIO)Program

    NASA Technical Reports Server (NTRS)

    Weir, Patricia

    2004-01-01

    NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth, and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group and chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI) and supports development and implementation efforts such as Earth Science Gateway (ESG), Space Time Tool Kit and Web Map Services (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency including

  7. EPA GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    The EPA Geospatial Quality Council (previously known as the EPA GIS-QA Team - EPA/600/R-00/009) was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA Geospatial Q...

  8. Database Organisation in a Web-Enabled Free and Open-Source Software (foss) Environment for Spatio-Temporal Landslide Modelling

    NASA Astrophysics Data System (ADS)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

    Landslides exhibit themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth's surface. Making landslide databases available online via the World Wide Web (WWW) promotes the dissemination of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS reduces the cost of the project drastically, as proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets along with the landslide susceptibility map were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front-end application and PostgreSQL with the PostGIS extension serves as the backend for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an insight into the understanding of the landslides and the resulting damage closer to the affected people and user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS) which can be accessed through any OGC-compliant open source or proprietary GIS software.
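
    A hedged sketch of consuming such a WFS from Python with the OWSLib package is shown below; the service URL, layer name, and bounding box are placeholders, not the actual endpoint described in the abstract.

    ```python
    # Sketch of reading an OGC WFS layer with OWSLib. The service URL,
    # layer (typename), and bounding box are placeholder assumptions.
    from owslib.wfs import WebFeatureService

    wfs = WebFeatureService("https://example.org/geoserver/wfs", version="1.1.0")

    # List the feature types the service advertises.
    print(list(wfs.contents))

    # Request landslide susceptibility features inside a bounding box as GML.
    response = wfs.getfeature(
        typename=["landslides:susceptibility"],
        bbox=(78.0, 30.0, 78.5, 30.5),
    )
    data = response.read()
    with open("susceptibility.gml", "wb") as f:
        f.write(data if isinstance(data, bytes) else data.encode("utf-8"))
    ```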

  9. Open Source Drug Discovery in Practice: A Case Study

    PubMed Central

    Årdal, Christine; Røttingen, John-Arne

    2012-01-01

    Background Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionally affect the poor. The concept borrows two principal aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to generate a better understanding of how a much-discussed theoretical concept works in practice and may be implemented. Methodology/Principal Findings A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases: CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews of participating members (n = 14), a survey of potential members (n = 61), an analysis of the websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions from mostly Indian researchers and students. Its processes are formal, with each task being reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Conclusions/Significance Technically TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project. However, both have enabled high quality

  10. Technology collaboration by means of an open source government

    NASA Astrophysics Data System (ADS)

    Berardi, Steven M.

    2009-05-01

    The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.

  11. Open-Source RTOS Space Qualification: An RTEMS Case Study

    NASA Technical Reports Server (NTRS)

    Zemerick, Scott

    2017-01-01

    NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors, notably (1) the diverse nature of RTOSs utilized across NASA, (2) no single NASA space-qualification criteria, lack of verification and validation (V&V) analysis, or test beds, and (3) different RTOS heritages, specifically open-source RTOSs and closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort of the open source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case study, can be utilized as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for NASA agency-wide usage. NASA's involvement in space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.

  12. Best Practices for Preparing Interoperable Geospatial Data

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.

    2010-12-01

    Geospatial data are critically important for a wide scope of research and applications: carbon cycle and ecosystems, climate change, land use and urban planning, environmental protection, etc. Geospatial data are created by different organizations using different methods, from remote sensing observations, field surveys, model simulations, etc., and stored in various formats. Geospatial data are therefore diverse and heterogeneous, which creates a huge barrier to the sharing and use of geospatial data, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery; OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access, respectively. But the reality is that having standard mechanisms for data discovery and access alone is not enough. The geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input data and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) which provides interoperable
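
    As a hedged example of the standards-based access described above, the sketch below subsets a remote dataset over OPeNDAP with xarray; the endpoint URL, variable name, and coordinate ranges are placeholder assumptions rather than an actual data center endpoint.

    ```python
    # Sketch of subsetting a remote dataset over OPeNDAP with xarray.
    # The endpoint URL, variable name, and coordinate ranges are placeholders.
    import xarray as xr

    url = "https://example.org/thredds/dodsC/modeling/carbon_flux.nc"  # hypothetical
    ds = xr.open_dataset(url)  # lazy open: only metadata is fetched initially

    # Pull a small spatio-temporal subset; data are transferred on demand.
    subset = ds["nee"].sel(
        time=slice("2005-01-01", "2005-12-31"),
        lat=slice(35.0, 40.0),
        lon=slice(-90.0, -85.0),
    )
    print(subset.mean().values)
    ```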

  13. Geospatial Service Platform for Education and Research

    NASA Astrophysics Data System (ADS)

    Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.

    2014-04-01

    We propose to advance scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held at the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries, and train them with online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as examples to explain how to build service chains for different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.
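
    To give a sense of how such geoprocessing services are discovered from a client, the sketch below lists the processes published by an OGC WPS endpoint using OWSLib; the URL is a placeholder, since the platform's actual service addresses are not given here.

        from owslib.wps import WebProcessingService

        # Hypothetical WPS endpoint; replace with the address of a real geoprocessing service.
        wps = WebProcessingService("http://example.org/wps", skip_caps=True)
        wps.getcapabilities()

        print(wps.identification.title)
        for process in wps.processes:
            # Each advertised process is a candidate building block for a service chain.
            print(process.identifier, "-", process.title)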

  14. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences; these tools use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https://github.com/Open
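
    The analysis side of this stack builds on widely used libraries such as GDAL and GeoPandas. A minimal, self-contained sketch of that style of workflow is shown below; it is not Gaia's or Minerva's actual API, and the file name is hypothetical.

        import geopandas as gpd

        # Hypothetical vector dataset; anything readable by GDAL/OGR works here.
        gdf = gpd.read_file("watersheds.shp")
        print(gdf.crs, len(gdf))

        # Reproject to an equal-area CRS and compute areas, a typical lightweight analysis step.
        gdf_ea = gdf.to_crs(epsg=6933)
        gdf_ea["area_km2"] = gdf_ea.geometry.area / 1e6
        print(gdf_ea["area_km2"].describe())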

  15. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to better utilize computing resources, taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting the geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory, and I/O of virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.

  16. Bioclipse: an open source workbench for chemo- and bioinformatics.

    PubMed

    Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S

    2007-02-22

    There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under the Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is fully open to both open source and commercial plugins. Bioclipse is freely available at http://www.bioclipse.net.

  17. Open Source Hbim for Cultural Heritage: a Project Proposal

    NASA Astrophysics Data System (ADS)

    Diara, F.; Rinaudo, F.

    2018-05-01

    Current technologies are changing the way Cultural Heritage research, analysis, conservation and development are carried out, allowing new and innovative approaches. The possibility of integrating Cultural Heritage data, such as archaeological information, into a three-dimensional environment system (like Building Information Modelling) brings huge benefits for its management, monitoring and valorisation. Nowadays there are many commercial BIM solutions. However, these tools are conceived and developed mostly for architectural design or technical installations. A better solution could be a dynamic and open platform that considers Cultural Heritage needs as a priority. Better and more complete data usability and accessibility could be guaranteed by open source protocols. This choice would allow adapting the software to Cultural Heritage needs, and not the opposite, thus avoiding methodological stretches. This work focuses on the analysis and experimentation of the specific characteristics of this kind of open source software (DBMS, CAD, servers) applied to a Cultural Heritage example, in order to verify their flexibility and reliability and then create a dynamic HBIM open source prototype. Indeed, it might be a starting point for the future creation of a complete HBIM open source solution that could be adapted to other Cultural Heritage research and analyses.

  18. A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data

    DOE PAGES

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...

    2016-01-01

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists also rely on a wide and disparate array of ancillary and open source information, including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge-learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people per 1,000 ft2 for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
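
    PDT's full model is considerably richer than can be shown here, but the flavor of a Bayesian occupancy estimate can be illustrated with a simple conjugate Gamma-Poisson update for a people-per-1,000 ft2 rate; all prior parameters and observations below are hypothetical.

        import numpy as np
        from scipy import stats

        # Hypothetical prior from subject-matter expertise: mean of 2 people per 1,000 ft^2.
        prior_shape, prior_rate = 4.0, 2.0            # Gamma(shape, rate); mean = shape / rate

        # Hypothetical survey observations: head counts in buildings of known floor area (1,000 ft^2 units).
        counts = np.array([3, 5, 2, 4])
        areas = np.array([1.5, 2.0, 1.0, 2.5])

        # Conjugate update: Poisson counts with a Gamma prior on the occupancy rate.
        post_shape = prior_shape + counts.sum()
        post_rate = prior_rate + areas.sum()

        posterior = stats.gamma(a=post_shape, scale=1.0 / post_rate)
        print("posterior mean:", posterior.mean())
        print("95% credible interval:", posterior.interval(0.95))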

  19. Human genome and open source: balancing ethics and business.

    PubMed

    Marturano, Antonio

    2011-01-01

    The Human Genome Project has been completed thanks to a massive use of computer techniques, as well as the adoption of the open-source business and research model by the scientists involved. This model won out over the proprietary model and allowed quick propagation of, and feedback on, research results among peers. In this paper, the author will analyse some ethical and legal issues arising from the use of such a computational model in relation to Human Genome property rights. The author will argue that Open Source is the best business model, as it is able to balance business and human rights perspectives.

  20. Open Source Next Generation Visualization Software for Interplanetary Missions

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere: on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  1. Freeing Crop Genetics through the Open Source Seed Initiative

    PubMed Central

    Luby, Claire H.; Goldman, Irwin L.

    2016-01-01

    For millennia, seeds have been freely available to use for farming and plant breeding without restriction. Within the past century, however, intellectual property rights (IPRs) have threatened this tradition. In response, a movement has emerged to counter the trend toward increasing consolidation of control and ownership of plant germplasm. One effort, the Open Source Seed Initiative (OSSI, www.osseeds.org), aims to ensure access to crop genetic resources by embracing an open source mechanism that fosters exchange and innovation among farmers, plant breeders, and seed companies. Plant breeders across many sectors have taken the OSSI Pledge to create a protected commons of plant germplasm for future generations. PMID:27093567

  2. Freeing Crop Genetics through the Open Source Seed Initiative.

    PubMed

    Luby, Claire H; Goldman, Irwin L

    2016-04-01

    For millennia, seeds have been freely available to use for farming and plant breeding without restriction. Within the past century, however, intellectual property rights (IPRs) have threatened this tradition. In response, a movement has emerged to counter the trend toward increasing consolidation of control and ownership of plant germplasm. One effort, the Open Source Seed Initiative (OSSI, www.osseeds.org), aims to ensure access to crop genetic resources by embracing an open source mechanism that fosters exchange and innovation among farmers, plant breeders, and seed companies. Plant breeders across many sectors have taken the OSSI Pledge to create a protected commons of plant germplasm for future generations.

  3. Open source and DIY hardware for DNA nanotechnology labs.

    PubMed

    Damase, Tulsi R; Stephens, Daniel; Spencer, Adam; Allen, Peter B

    A set of instruments and specialized equipment is necessary to equip a laboratory to work with DNA. Reducing the barrier to entry for DNA manipulation should enable and encourage new labs to enter the field. We present three examples of open source/DIY technology with significantly reduced costs relative to commercial equipment. This includes a gel scanner, a horizontal PAGE gel mold, and a homogenizer for generating DNA-coated particles. The overall cost savings obtained by using open source/DIY equipment was between 50 and 90%.

  4. Open source and DIY hardware for DNA nanotechnology labs

    PubMed Central

    Damase, Tulsi R.; Stephens, Daniel; Spencer, Adam; Allen, Peter B.

    2015-01-01

    A set of instruments and specialized equipment is necessary to equip a laboratory to work with DNA. Reducing the barrier to entry for DNA manipulation should enable and encourage new labs to enter the field. We present three examples of open source/DIY technology with significantly reduced costs relative to commercial equipment. This includes a gel scanner, a horizontal PAGE gel mold, and a homogenizer for generating DNA-coated particles. The overall cost savings obtained by using open source/DIY equipment was between 50 and 90%. PMID:26457320

  5. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes, and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines, developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture that allows users to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and the connectivity of landscape objects, and iii) run and explore simulations in many ways: using the OpenFLUID user interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, based on an open source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network

  6. Issues on Building Kazakhstan Geospatial Portal to Implement E-Government

    NASA Astrophysics Data System (ADS)

    Sagadiyev, K.; Kang, H. K.; Li, K. J.

    2016-06-01

    A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework to integrate and organize them. In particular, it is very useful to integrate the land management process of e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We developed a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements. First, we establish a transparent governmental process, which is one of the main goals of e-government: every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort spent in the governmental process. For example, the grant procedure for a building construction has taken more than one year, with more than 50 steps; it is expected that this procedure would be reduced to two weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue in governmental administration processes.
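
    To show how a GeoServer-backed portal like this is typically consumed by clients, the sketch below requests a map image over OGC WMS with OWSLib; the endpoint, layer name, and bounding box are placeholders rather than the project's real configuration.

        from owslib.wms import WebMapService

        # Hypothetical GeoServer WMS endpoint and layer name.
        wms = WebMapService("http://example.org/geoserver/wms", version="1.1.1")
        print(list(wms.contents))

        img = wms.getmap(
            layers=["landmanagement:parcels"],
            styles=[""],                     # empty string requests the server's default style
            srs="EPSG:4326",
            bbox=(66.0, 43.0, 87.0, 55.0),   # rough bounding box over Kazakhstan
            size=(800, 400),
            format="image/png",
            transparent=True,
        )
        with open("parcels.png", "wb") as f:
            f.write(img.read())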

  7. OpenMC In Situ Source Convergence Detection

    SciTech Connect

    Aldrich, Garrett Allen; Dutta, Soumya; Woodring, Jonathan Lee

    2016-05-07

    We designed and implemented an in situ version of particle source convergence detection for the OpenMC particle transport simulator. OpenMC is a Monte Carlo-based particle simulator for neutron criticality calculations. For the transport simulation to be accurate, source particles must converge on a spatial distribution. Typically, the simulation is iterated for a user-settable, fixed number of steps, and convergence is assumed to have been achieved. We instead implement a method to detect convergence, using a stochastic oscillator to identify convergence of source particles based on their accumulated Shannon entropy. Using our in situ convergence detection, we are able to detect and begin tallying results for the full simulation once the proper source distribution has been confirmed. Our method ensures that the simulation is not started too early, by a user setting overly optimistic parameters, or too late, by setting an overly conservative parameter.
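
    The Shannon entropy diagnostic mentioned above can be computed by binning source sites on a mesh. The sketch below shows that calculation only (not the stochastic-oscillator detector itself), with randomly generated sites standing in for OpenMC source banks.

        import numpy as np

        def shannon_entropy(sites, nbins=(8, 8, 8), bounds=((-1, 1), (-1, 1), (-1, 1))):
            """Shannon entropy (in bits) of source sites binned on a regular mesh."""
            hist, _ = np.histogramdd(sites, bins=nbins, range=bounds)
            p = hist.ravel() / hist.sum()
            p = p[p > 0]                      # ignore empty mesh cells
            return -np.sum(p * np.log2(p))

        # Hypothetical batches of source sites; the entropy levels off as the source converges.
        rng = np.random.default_rng(0)
        for batch in range(5):
            sites = rng.normal(scale=0.3, size=(10_000, 3)).clip(-1, 1)
            print(batch, shannon_entropy(sites))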

  8. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    NASA Astrophysics Data System (ADS)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    This paper provides further detail to one facet of the battlespace visualization concept described in last year's paper Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes the Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting, IMAGES could be used to produce foundation data to support mission planning. In the field it could be directly associated with a tactical sensor receiver or ground station (e.g. UAV or UGV) to provide near real-time and mission-specific RSTA to support mission execution. This paper provides the IMAGES functional-level design; describes the technologies, their interactions and interdependencies; and presents a notional operational scenario to illustrate the system flexibility. Using as a system backbone an intelligent software agent technology called the Open Agent Architecture (OAA), IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS applications software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scaleable, and reconfigurable. This paper describes how the OAA achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions. The reasoning component will provide for the best information to be developed in the timeline

  9. Panel: Governance in Open Source Projects and Communities

    NASA Astrophysics Data System (ADS)

    Bolici, Francesco; de Laat, Paul; Ljungberg, Jan; Pontiggia, Andrea; Rossi Lamastra, Cristina

    “Although considerable research has been devoted to the growth and expansion of open source communities and the comparison between the efficiency of corporate structures and community structures in the field of software development, rather less attention has been paid to their governance structures (control, monitoring, supervision)” (Lattemann and Stieglitz 2005).

  10. Digital Preservation in Open-Source Digital Library Software

    ERIC Educational Resources Information Center

    Madalli, Devika P.; Barve, Sunita; Amin, Saiful

    2012-01-01

    Digital archives and digital library projects are being initiated all over the world for materials of different formats and domains. To organize, store, and retrieve digital content, many libraries as well as archiving centers are using either proprietary or open-source software. While it is accepted that print media can survive for centuries with…

  11. Chinese Localisation of Evergreen: An Open Source Integrated Library System

    ERIC Educational Resources Information Center

    Zou, Qing; Liu, Guoying

    2009-01-01

    Purpose: The purpose of this paper is to investigate various issues related to Chinese language localisation in Evergreen, an open source integrated library system (ILS). Design/methodology/approach: A Simplified Chinese version of Evergreen was implemented and tested and various issues such as encoding, indexing, searching, and sorting…

  12. Faculty/Student Surveys Using Open Source Software

    ERIC Educational Resources Information Center

    Kaceli, Sali

    2004-01-01

    This session will highlight an easy survey package which lets non-technical users create surveys, administer surveys, gather results, and view statistics. This is an open source application, all managed online via a web browser. By using phpESP, faculty are given the freedom of creating various surveys at their convenience and linking them to their…

  13. Higher Education Sub-Cultures and Open Source Adoption

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2011-01-01

    Successful adoption of new teaching and learning technologies in higher education requires the consensus of two sub-cultures, namely the technologist sub-culture and the academic sub-culture. This paper examines trends in adoption of open source software (OSS) for teaching and learning by comparing the results of a 2009 survey of 285 Chief…

  14. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  15. Critical Analysis on Open Source LMSs Using FCA

    ERIC Educational Resources Information Center

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…

  16. Open Source Projects in Software Engineering Education: A Mapping Study

    ERIC Educational Resources Information Center

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  17. Is Open Source the ERP Cure-All?

    ERIC Educational Resources Information Center

    Panettieri, Joseph C.

    2008-01-01

    Conventional and hosted applications thrive, but open source ERP (enterprise resource planning) is coming on strong. In many ways, the evolution of the ERP market is littered with ironies. When Oracle began buying up customer relationship management (CRM) and ERP companies, some universities worried that they would be left with fewer choices and…

  18. Analyzing huge pathology images with open source software.

    PubMed

    Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc

    2013-06-06

    Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides pose a technical challenge, since the images often occupy several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail to handle them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here
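
    A block-wise processing pass of the kind these tools enable can be sketched with the tifffile Python library (not NDPITools or LargeTIFFTools themselves); the file name and tile size below are hypothetical.

        import numpy as np
        import tifffile

        # Hypothetical whole-slide TIFF already converted from NDPI.
        with tifffile.TiffFile("slide.tif") as tif:
            page = tif.pages[0]
            print(page.shape, page.dtype)
            # Decode into a temporary memory-mapped file instead of RAM.
            image = page.asarray(out="memmap")

            # Process the slide in fixed-size blocks (a simple "mosaic" pass).
            tile = 2048
            for r in range(0, image.shape[0], tile):
                for c in range(0, image.shape[1], tile):
                    block = np.asarray(image[r:r + tile, c:c + tile])
                    # ... per-tile analysis (e.g. cell counting) would go here ...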

  19. Analyzing huge pathology images with open source software

    PubMed Central

    2013-01-01

    Background Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides pose a technical challenge, since the images often occupy several gigabytes and cannot be fully opened in a computer’s memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail to handle them, and the others require expensive hardware while still being prohibitively slow. Results We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Conclusions Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. Virtual slides The virtual slide(s) for this article can be found here: http

  20. Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology

    NASA Astrophysics Data System (ADS)

    Siegle, Joshua H.; Cuevas López, Aarón; Patel, Yogi A.; Abramov, Kirill; Ohayon, Shay; Voigts, Jakob

    2017-08-01

    Objective. Closed-loop experiments, in which causal interventions are conditioned on the state of the system under investigation, have become increasingly common in neuroscience. Such experiments can have a high degree of explanatory power, but they require a precise implementation that can be difficult to replicate across laboratories. We sought to overcome this limitation by building open-source software that makes it easier to develop and share algorithms for closed-loop control. Approach. We created the Open Ephys GUI, an open-source platform for multichannel electrophysiology experiments. In addition to the standard ‘open-loop’ visualization and recording functionality, the GUI also includes modules for delivering feedback in response to events detected in the incoming data stream. Importantly, these modules can be built and shared as plugins, which makes it possible for users to extend the functionality of the GUI through a simple API, without having to understand the inner workings of the entire application. Main results. In combination with low-cost, open-source hardware for amplifying and digitizing neural signals, the GUI has been used for closed-loop experiments that perturb the hippocampal theta rhythm in a phase-specific manner. Significance. The Open Ephys GUI is the first widely used application for multichannel electrophysiology that leverages a plugin-based workflow. We hope that it will lower the barrier to entry for electrophysiologists who wish to incorporate real-time feedback into their research.

  1. Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology.

    PubMed

    Siegle, Joshua H; López, Aarón Cuevas; Patel, Yogi A; Abramov, Kirill; Ohayon, Shay; Voigts, Jakob

    2017-08-01

    Closed-loop experiments, in which causal interventions are conditioned on the state of the system under investigation, have become increasingly common in neuroscience. Such experiments can have a high degree of explanatory power, but they require a precise implementation that can be difficult to replicate across laboratories. We sought to overcome this limitation by building open-source software that makes it easier to develop and share algorithms for closed-loop control. We created the Open Ephys GUI, an open-source platform for multichannel electrophysiology experiments. In addition to the standard 'open-loop' visualization and recording functionality, the GUI also includes modules for delivering feedback in response to events detected in the incoming data stream. Importantly, these modules can be built and shared as plugins, which makes it possible for users to extend the functionality of the GUI through a simple API, without having to understand the inner workings of the entire application. In combination with low-cost, open-source hardware for amplifying and digitizing neural signals, the GUI has been used for closed-loop experiments that perturb the hippocampal theta rhythm in a phase-specific manner. The Open Ephys GUI is the first widely used application for multichannel electrophysiology that leverages a plugin-based workflow. We hope that it will lower the barrier to entry for electrophysiologists who wish to incorporate real-time feedback into their research.

  2. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    PubMed

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

    There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).
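
    A minimal usage sketch, assuming ODDT is installed together with one of its toolkit backends (RDKit or OpenBabel); the input file names are placeholders.

        import oddt

        # Hypothetical input files; oddt.toolkit delegates to RDKit or OpenBabel, whichever is available.
        protein = next(oddt.toolkit.readfile("pdb", "receptor.pdb"))
        protein.protein = True          # mark the molecule as the receptor for downstream steps

        ligands = list(oddt.toolkit.readfile("sdf", "ligands.sdf"))
        print(len(ligands), "ligands loaded")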

  3. openBIS ELN-LIMS: an open-source database for academic laboratories.

    PubMed

    Barillari, Caterina; Ottoz, Diana S M; Fuentes-Serna, Juan Mariano; Ramakrishnan, Chandrasekhar; Rinn, Bernd; Rudolf, Fabian

    2016-02-15

    The open-source platform openBIS (open Biology Information System) offers an Electronic Laboratory Notebook and a Laboratory Information Management System (ELN-LIMS) solution suitable for the academic life science laboratories. openBIS ELN-LIMS allows researchers to efficiently document their work, to describe materials and methods and to collect raw and analyzed data. The system comes with a user-friendly web interface where data can be added, edited, browsed and searched. The openBIS software, a user guide and a demo instance are available at https://openbis-eln-lims.ethz.ch. The demo instance contains some data from our laboratory as an example to demonstrate the possibilities of the ELN-LIMS (Ottoz et al., 2014). For rapid local testing, a VirtualBox image of the ELN-LIMS is also available. © The Author 2015. Published by Oxford University Press.

  4. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    PubMed

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.
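
    A short sketch of the Python interface mentioned above, assuming the pyopenms bindings are installed; the mzML file name is a placeholder.

        import pyopenms

        # Load a (hypothetical) mzML file into an in-memory experiment.
        exp = pyopenms.MSExperiment()
        pyopenms.MzMLFile().load("sample.mzML", exp)
        print("spectra:", exp.getNrSpectra())

        # Inspect the first MS1 spectrum: retention time and number of peaks.
        for spectrum in exp:
            if spectrum.getMSLevel() == 1:
                mz, intensity = spectrum.get_peaks()
                print(spectrum.getRT(), mz.size)
                break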

  5. Utilizing Free and Open Source Software to access, view and compare in situ observations, EO products and model output data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hamre, Torill; Lygre, Kjetil

    2014-05-01

    The GreenSeas project (Development of global plankton data base and model system for eco-climate early warning) aims to advance the knowledge and predictive capacities of how marine ecosystems will respond to global change. A main task has been to set up a data delivery and monitoring core service following the open and free data access policy implemented in the Global Monitoring for the Environment and Security (GMES) programme. The aim is to ensure open and free access to historical plankton data, new data (EO products and in situ measurements), model data (including estimates of simulation error) and biological, environmental and climatic indicators for a range of stakeholders, such as scientists, policy makers and environmental managers. To this end, we have developed a geo-spatial database of both historical and new in situ physical, biological and chemical parameters for the Southern Ocean, Atlantic, Nordic Seas and the Arctic, and organized related satellite-derived quantities and model forecasts in a joint geo-spatial repository. For easy access to these data, we have implemented a web-based GIS (Geographical Information System) where observed, derived and forecast parameters can be searched, displayed, compared and exported. Model forecasts can also be uploaded dynamically to the system, to allow modelers to quickly compare their results with available in situ and satellite observations. The web-based GIS is built on free and open source technologies: Thredds Data Server, ncWMS, GeoServer, OpenLayers, PostGIS, Liferay, Apache Tomcat, PRTree, NetCDF-Java, json-simple, Geotoolkit, Highcharts, GeoExt, MapFish, FileSaver, jQuery, jstree and qUnit. We also wanted to use open standards to communicate between the different services, so we use WMS, WFS, netCDF, GML, OPeNDAP, JSON, and SLD. The main advantage we got from using FOSS was that we did not have to invent the wheel all over again, but could use
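
    One advantage of the OPeNDAP/THREDDS part of this stack is server-side subsetting: a client can request only the slab it needs. A minimal sketch with xarray is shown below; the endpoint and variable names are hypothetical.

        import xarray as xr

        # Hypothetical OPeNDAP endpoint served by a THREDDS Data Server.
        url = "http://example.org/thredds/dodsC/greenseas/chlorophyll.nc"
        ds = xr.open_dataset(url)      # only metadata is transferred at this point
        print(list(ds.data_vars))

        # Selecting a region and month retrieves just that slab over the network.
        subset = ds["chl"].sel(time="2010-06", lat=slice(60, 80), lon=slice(-10, 10))
        print(float(subset.mean()))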

  6. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  7. Open Source Clinical NLP – More than Any Single System

    PubMed Central

    Masanz, James; Pakhomov, Serguei V.; Xu, Hua; Wu, Stephen T.; Chute, Christopher G.; Liu, Hongfang

    2014-01-01

    The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP’s mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice. PMID:25954581

  8. Open Source Clinical NLP - More than Any Single System.

    PubMed

    Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang

    2014-01-01

    The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.

  9. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    NASA Astrophysics Data System (ADS)

    Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

    In this paper we discuss the development, features and functionalities of an open source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application developed by Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application makes use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and the associated damage to infrastructure. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front-end interface consisting of web maps and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or "Geo-SAFER Mindanao" Program.
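
    On the GeoDjango side, exposure layers such as households or barangay boundaries are typically modelled as spatially enabled tables. The sketch below is a hypothetical model definition, not Flood EViDEns' actual schema.

        # Hypothetical GeoDjango model for a household-level flood exposure layer.
        from django.contrib.gis.db import models

        class Household(models.Model):
            barangay = models.CharField(max_length=100)
            flood_depth_m = models.FloatField(null=True)   # simulated flood depth at this location
            geom = models.PointField(srid=4326)

            def __str__(self):
                return f"{self.barangay}: {self.flood_depth_m} m"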

  10. A Dozen Years after Open Source's 1998 Birth, It's Time for "OpenTechComm"

    ERIC Educational Resources Information Center

    Still, Brian

    2010-01-01

    2008 marked the 10-year Anniversary of the Open Source movement, which has had a substantial impact on not only software production and adoption, but also on the sharing and distribution of information. Technical communication as a discipline has taken some advantage of the movement or its derivative software, but this article argues not as much…

  11. Opening Up to Open Source: Looking at How Moodle Was Adopted in Higher Education

    ERIC Educational Resources Information Center

    Costello, Eamon

    2013-01-01

    The virtual learning environment (VLE) has grown to become a piece of complex infrastructure that is now deemed critical to higher educational provision. This paper looks at Moodle and its adoption in higher education. Moodle's origins, as an open source VLE, are investigated and its growth examined in the context of how higher educational…

  12. OpenSim: open-source software to create and analyze dynamic simulations of movement.

    PubMed

    Delp, Scott L; Anderson, Frank C; Arnold, Allison S; Loan, Peter; Habib, Ayman; John, Chand T; Guendelman, Eran; Thelen, Darryl G

    2007-11-01

    Dynamic simulations of movement allow one to study neuromuscular coordination, analyze athletic performance, and estimate internal loading of the musculoskeletal system. Simulations can also be used to identify the sources of pathological movement and establish a scientific basis for treatment planning. We have developed a freely available, open-source software system (OpenSim) that lets users develop models of musculoskeletal structures and create dynamic simulations of a wide variety of movements. We are using this system to simulate the dynamics of individuals with pathological gait and to explore the biomechanical effects of treatments. OpenSim provides a platform on which the biomechanics community can build a library of simulations that can be exchanged, tested, analyzed, and improved through a multi-institutional collaboration. Developing software that enables a concerted effort from many investigators poses technical and sociological challenges. Meeting those challenges will accelerate the discovery of principles that govern movement control and improve treatments for individuals with movement pathologies.

  13. Scalar collapse in AdS with an OpenCL open source code

    NASA Astrophysics Data System (ADS)

    Liebling, Steven L.; Khanna, Gaurav

    2017-10-01

    We study the spherically symmetric collapse of a scalar field in anti-de Sitter spacetime using a newly constructed, open-source code which parallelizes over heterogeneous architectures using the open standard OpenCL. An open question for this scenario concerns how to tell, a priori, whether some form of initial data will be stable or will instead develop, under the turbulent instability, into a black hole in the limit of vanishing amplitude. Previous work suggested the existence of islands of stability around quasi-periodic solutions, and we use this new code to examine the stability properties of approximately quasi-periodic solutions which balance energy transfer to higher modes with energy transfer to lower modes. The evolutions provide some evidence, though not conclusive, for the stability of initial data sufficiently close to quasi-periodic solutions.
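
    The host/device split that OpenCL imposes on such codes can be illustrated with a tiny element-wise kernel via PyOpenCL; this toy example is unrelated to the paper's actual evolution scheme.

        import numpy as np
        import pyopencl as cl

        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)

        # A trivial kernel: y <- a*x + y, executed once per array element on the device.
        src = """
        __kernel void axpy(const float a, __global const float *x, __global float *y) {
            int i = get_global_id(0);
            y[i] = a * x[i] + y[i];
        }
        """
        prg = cl.Program(ctx, src).build()

        x = np.arange(1024, dtype=np.float32)
        y = np.ones(1024, dtype=np.float32)
        mf = cl.mem_flags
        x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
        y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

        prg.axpy(queue, x.shape, None, np.float32(2.0), x_buf, y_buf)
        cl.enqueue_copy(queue, y, y_buf)
        print(y[:5])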

  14. Development open source microcontroller based temperature data logger

    NASA Astrophysics Data System (ADS)

    Abdullah, M. H.; Che Ghani, S. A.; Zaulkafilai, Z.; Tajuddin, S. N.

    2017-10-01

    This article discusses the development stages in designing, prototyping, testing and deploying a portable open source microcontroller-based temperature data logger for use in rough industrial environments. The 5 V powered data logger prototype is equipped with an open source Arduino microcontroller integrating multiple thermocouple sensors and their modules, secure digital (SD) card storage, a liquid crystal display (LCD), a real-time clock, and an electronic enclosure made of acrylic. The data logger is programmed so that eight thermocouple readings can be acquired within a 3 s interval and displayed on the LCD simultaneously. The recorded temperature readings at four different points on both hydrodistillation units show similar profile patterns, and the highest yield of extracted oil, 0.004%, was achieved on hydrodistillation unit 2. From the obtained results, this study achieved the objective of developing an inexpensive, portable and robust eight-channel temperature measuring module capable of monitoring and storing real-time data.

  15. Long distance education for croatian nurses with open source software.

    PubMed

    Radenovic, Aleksandar; Kalauz, Sonja

    2006-01-01

    The Croatian Nursing Informatics Association (CNIA) was established as a result of continuing work on promoting nursing informatics in Croatia. The main goals of CNIA are to promote nursing informatics and to educate nurses about nursing informatics and the use of information technology in the nursing process. At the start of its work, CNIA developed three nursing informatics courses, all designed with the support of distance education using open source software. The courses are: A - 'From Data to Wisdom', B - 'Introduction to Nursing Informatics' and C - 'Nursing Informatics I'. Courses A and B are prerequisites for course C. The technology used to implement these online courses is based on the open source Learning Management System (LMS) Claroline, a free online collaborative learning platform. Each course is divided into two modules/days: on the first module/day participants receive classical classroom instruction, and on the second day they continue with e-learning from home. These are the first nursing informatics courses and the first distance education offerings for nurses in Croatia.

  16. astroplan: An Open Source Observation Planning Package in Python

    NASA Astrophysics Data System (ADS)

    Morris, Brett M.; Tollerud, Erik; Sipőcz, Brigitta; Deil, Christoph; Douglas, Stephanie T.; Berlanga Medina, Jazmin; Vyhmeister, Karl; Smith, Toby R.; Littlefair, Stuart; Price-Whelan, Adrian M.; Gee, Wilfred T.; Jeschke, Eric

    2018-03-01

    We present astroplan—an open source, open development, Astropy affiliated package for ground-based observation planning and scheduling in Python. astroplan is designed to provide efficient access to common observational quantities such as celestial rise, set, and meridian transit times and simple transformations from sky coordinates to altitude-azimuth coordinates without requiring a detailed understanding of astropy’s implementation of coordinate systems. astroplan provides convenience functions to generate common observational plots such as airmass and parallactic angle as a function of time, along with basic sky (finder) charts. Users can determine whether or not a target is observable given a variety of observing constraints, such as airmass limits, time ranges, Moon illumination/separation ranges, and more. A selection of observation schedulers are included that divide observing time among a list of targets, given observing constraints on those targets. Contributions to the source code from the community are welcome.
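
    A brief usage sketch of the constraint-checking part of the API, with a hypothetical site, target, and time range:

        from astropy.time import Time
        from astroplan import (Observer, FixedTarget, AirmassConstraint,
                               AtNightConstraint, is_observable)

        observer = Observer.at_site("subaru")           # any named site known to astropy works
        target = FixedTarget.from_name("Vega")          # resolved online by name

        time_range = Time(["2024-07-01 06:00", "2024-07-01 16:00"])
        constraints = [AirmassConstraint(max=2), AtNightConstraint.twilight_astronomical()]

        # One boolean per target: is Vega observable at any time in the window under these constraints?
        print(is_observable(constraints, observer, [target], time_range=time_range))
        print(observer.target_rise_time(time_range[0], target, which="next").iso)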

  17. Clawpack: Building an open source ecosystem for solving hyperbolic PDEs

    USGS Publications Warehouse

    Iverson, Richard M.; Mandli, K.T.; Ahmadia, Aron J.; Berger, M.J.; Calhoun, Donna; George, David L.; Hadjimichael, Y.; Ketcheson, David I.; Lemoine, Grady L.; LeVeque, Randall J.

    2016-01-01

    Clawpack is a software package designed to solve nonlinear hyperbolic partial differential equations using high-resolution finite volume methods based on Riemann solvers and limiters. The package includes a number of variants aimed at different applications and user communities. Clawpack has been actively developed as an open source project for over 20 years. The latest major release, Clawpack 5, introduces a number of new features and changes to the code base and a new development model based on GitHub and Git submodules. This article provides a summary of the most significant changes, the rationale behind some of these changes, and a description of our current development model.

  18. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  19. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  20. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  1. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  2. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  3. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, flood risk is expected to continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, improvements in risk assessment and management have been pursued in recent years. This has resulted in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities in facilitating the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels through a spatio-temporal domain and can provide better and more complete

  4. Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis

    PubMed Central

    2015-01-01

    Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276

  5. Open source tools for large-scale neuroscience.

    PubMed

    Freeman, Jeremy

    2015-06-01

    New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  6. An Offline-Online Android Application for Hazard Event Mapping Using WebGIS Open Source Technologies

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya

    2016-04-01

    Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data, with little ground truthing, creating difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-map application for hazard event mapping using Open Source WebGIS technologies such as the Postgres database, PostGIS, Leaflet, Cordova and PhoneGap. The objectives of this prototype are: 1. an offline-online Android mobile application with advanced geospatial visualisation; 2. easy collection and storage of event information; 3. centralized data storage accessible from all services (smartphone, standard web browser); 4. improved data management through active participation in hazard event mapping and storage. This application has been implemented as a low-cost, rapid and participatory method for recording impacts from hazard events and includes geolocation (GPS data and Internet), visualizing maps with overlays of satellite images, viewing uploaded images and events as cluster points, and drawing and adding event information. Data can be recorded offline (on the Android device) or online (in any browser) and subsequently uploaded to the server whenever internet is available. All events and records can be visualized by an administrator and made public after approval. Different user levels can be defined to access the data for communicating the information. This application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazard such as flood, avalanche
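
    A minimal sketch of the kind of server-side query such a stack enables is shown below; the database, table, and column names are assumptions for illustration only, not the authors' schema, and only standard PostGIS functions are used.

        # Minimal sketch (hypothetical schema): fetch hazard events inside a
        # longitude/latitude bounding box from a PostGIS-enabled PostgreSQL database.
        import psycopg2

        conn = psycopg2.connect(dbname="hazards", user="mapper",
                                password="secret", host="localhost")
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT id, event_type, ST_AsGeoJSON(geom)
                FROM hazard_events
                WHERE geom && ST_MakeEnvelope(%s, %s, %s, %s, 4326)
                """,
                (85.0, 27.5, 85.5, 28.0),   # illustrative lon/lat box in Nepal
            )
            for event_id, event_type, geojson in cur.fetchall():
                print(event_id, event_type, geojson)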

  7. Open-Source Automated Mapping Four-Point Probe.

    PubMed

    Chandra, Handy; Allen, Spencer W; Oberloier, Shane W; Bihari, Nupur; Gwamuri, Jephias; Pearce, Joshua M

    2017-01-26

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results of resistors from 10 to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas.
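
    For reference, converting a four-point probe reading into sheet resistance on a thin, laterally large sample uses the standard geometric factor pi/ln 2 for collinear, equally spaced probes; a minimal sketch of that conversion (not the OS4PP firmware) follows.

        # Minimal sketch: thin-sheet four-point probe conversion,
        # Rs = (pi / ln 2) * V / I  (ohms per square).
        import math

        def sheet_resistance(voltage_v, current_a):
            return (math.pi / math.log(2)) * voltage_v / current_a

        # Example: 1 mA sourced, 4.5 mV measured -> about 20.4 ohms/square.
        print(sheet_resistance(4.5e-3, 1.0e-3))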

  8. Open-Source Automated Mapping Four-Point Probe

    PubMed Central

    Chandra, Handy; Allen, Spencer W.; Oberloier, Shane W.; Bihari, Nupur; Gwamuri, Jephias; Pearce, Joshua M.

    2017-01-01

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results of resistors from 10 to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas. PMID:28772471

  9. Free and open-source automated 3-D microscope.

    PubMed

    Wijnen, Bas; Petersen, Emily E; Hunt, Emily J; Pearce, Joshua M

    2016-11-01

    Open-source technology not only has facilitated the expansion of the greater research community, but by lowering costs it has encouraged innovation and customizable design. Automated microscopy has remained a challenge in terms of accessibility due to the expense and to inflexible, non-interchangeable stages. This paper presents a low-cost, open-source microscope 3-D stage. A RepRap 3-D printer was converted to an optical microscope equipped with a customized, 3-D printed holder for a USB microscope. Precision measurements were determined to have an average error of 10 μm at the maximum speed and 27 μm at the minimum recorded speed. Accuracy tests yielded an error of 0.15%. The machine is a true 3-D stage and thus able to operate with USB microscopes or conventional desktop microscopes. It is larger than all commercial alternatives, and is thus capable of high-depth images over unprecedented areas and complex geometries. The repeatability is lower than that of 2-D microscope stages, but testing shows that it is adequate for the majority of scientific applications. The open-source microscope stage costs less than 3-9% of the closest proprietary commercial stages. This extreme affordability vastly improves accessibility for 3-D microscopy throughout the world. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  10. Open-source, community-driven microfluidics with Metafluidics.

    PubMed

    Kong, David S; Thorsen, Todd A; Babb, Jonathan; Wick, Scott T; Gam, Jeremy J; Weiss, Ron; Carr, Peter A

    2017-06-07

    Microfluidic devices have the potential to automate and miniaturize biological experiments, but open-source sharing of device designs has lagged behind sharing of other resources such as software. Synthetic biologists have used microfluidics for DNA assembly, cell-free expression, and cell culture, but a combination of expense, device complexity, and reliance on custom set-ups hampers their widespread adoption. We present Metafluidics, an open-source, community-driven repository that hosts digital design files, assembly specifications, and open-source software to enable users to build, configure, and operate a microfluidic device. We use Metafluidics to share designs and fabrication instructions for both a microfluidic ring-mixer device and a 32-channel tabletop microfluidic controller. This device and controller are applied to build genetic circuits using standard DNA assembly methods including ligation, Gateway, Gibson, and Golden Gate. Metafluidics is intended to enable a broad community of engineers, DIY enthusiasts, and other nontraditional participants with limited fabrication skills to contribute to microfluidic research.

  11. Open Source Platform Application to Groundwater Characterization and Monitoring

    NASA Astrophysics Data System (ADS)

    Ntarlagiannis, D.; Day-Lewis, F. D.; Falzone, S.; Lane, J. W., Jr.; Slater, L. D.; Robinson, J.; Hammett, S.

    2017-12-01

    Groundwater characterization and monitoring commonly rely on the use of multiple point sensors and human labor. Due to the number of sensors, labor, and other resources needed, establishing and maintaining an adequate groundwater monitoring network can be both labor intensive and expensive. To improve and optimize monitoring network design, open source software and hardware components could potentially provide the platform to control robust and efficient sensors, thereby reducing costs and labor. This work presents early attempts to create a groundwater monitoring system incorporating open-source software and hardware that controls the remote operation of multiple sensors along with data management and file transfer functions. The system is built around a Raspberry Pi 3, which controls multiple sensors in order to perform on-demand, continuous or 'smart decision' measurements while providing flexibility to incorporate additional sensors to meet the demands of different projects. The current objective of our technology is to monitor the exchange of ionic tracers between mobile and immobile porosity using a combination of fluid and bulk electrical-conductivity measurements. To meet this objective, our configuration uses four sensors (pH, specific conductance, pressure, temperature) that monitor the fluid electrical properties of interest and guide the bulk electrical measurement. This system highlights the potential of using open source software and hardware components for earth science applications. The versatility of the system makes it ideal for use in a large number of applications, and the low cost allows for high-resolution (spatial and temporal) monitoring.
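
    A minimal sketch of the kind of polling loop such a Raspberry Pi system might run is given below; the read_* functions are hypothetical placeholders for whichever sensor drivers a given deployment uses, and the file name and measurement interval are assumptions.

        # Minimal sketch (hypothetical sensor drivers): log four water-quality
        # readings to CSV once per minute on a Raspberry Pi.
        import csv
        import time
        from datetime import datetime, timezone

        def read_ph():                       # placeholder for a real driver
            return 7.1

        def read_specific_conductance():     # placeholder, uS/cm
            return 540.0

        def read_pressure():                 # placeholder, kPa
            return 101.6

        def read_temperature():              # placeholder, deg C
            return 12.3

        with open("groundwater_log.csv", "a", newline="") as f:
            writer = csv.writer(f)
            while True:
                writer.writerow([datetime.now(timezone.utc).isoformat(),
                                 read_ph(), read_specific_conductance(),
                                 read_pressure(), read_temperature()])
                f.flush()
                time.sleep(60)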

  12. Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data

    NASA Astrophysics Data System (ADS)

    Sibolla, B. H.; Van Zyl, T.; Coetzee, S.

    2016-06-01

    Geospatial data has very specific characteristics that need to be carefully captured in its visualisation, in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade as a response to various visualisation challenges. During the development of an open source based, dynamic two-dimensional visualisation library, that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done in order to inform the design phase of the library development, such that either an existing taxonomy can be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two dimensional visualisations that enable human interaction in order to assist the user to understand the data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.

  13. OpenDrift - an open source framework for ocean trajectory modeling

    NASA Astrophysics Data System (ADS)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

    We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift, and has been developed at the Norwegian Meteorological Institute in cooperation with the Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting of results to file. Modularity is achieved through well defined interfaces between components, and use of a consistent vocabulary (CF conventions) for naming of variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) to a particular file format. Instead, "reader modules" can be written or used to obtain data directly from any original source, including files or web based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant to the application of interest, without needing to consider technical tasks such as reading, reprojecting, and colocating input data, or rotation and scaling of vectors and model output. We will show a few example applications of using OpenDrift to predict the drift of drifters, oil spills, and search-and-rescue objects.
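
    A minimal usage sketch is shown below, written against the OceanDrift model and the generic NetCDF/OPeNDAP reader as described in the OpenDrift documentation; the forcing URL, seeding location, and run length are illustrative assumptions.

        # Minimal OpenDrift sketch (illustrative values): seed 100 passive
        # particles off the Norwegian coast and run for two days.
        from datetime import datetime, timedelta
        from opendrift.models.oceandrift import OceanDrift
        from opendrift.readers import reader_netCDF_CF_generic

        model = OceanDrift()
        reader = reader_netCDF_CF_generic.Reader(
            "https://example.org/thredds/dodsC/currents.nc")   # placeholder URL
        model.add_reader(reader)

        model.seed_elements(lon=4.5, lat=60.0, number=100, radius=1000,
                            time=datetime(2016, 4, 1))
        model.run(duration=timedelta(days=2))
        model.plot()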

  14. UASs for geospatial data

    USDA-ARS?s Scientific Manuscript database

    Increasingly, consumer organizations, businesses, and academic researchers are using UAS to gather geospatial, environmental data on natural and man-made phenomena. These data may be either remotely sensed or measured directly (e. g., sampling of atmospheric constituents). The term geospatial data r...

  15. GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  16. Geospatial Information Best Practices

    DTIC Science & Technology

    2012-01-01

    26 Spring - 2012. By MAJ Christopher Blais, CW2 Joshua Stratton and MSG Moise Danjoint. The fact that geospatial information can be codified and...Operation Iraqi Freedom V (2007-2008), and Operation New Dawn (2011). MSG Moise Danjoint is the noncommissioned officer in charge, Geospatial

  17. A new chapter in environmental sensing: The Open-Source Published Environmental Sensing (OPENS) laboratory

    NASA Astrophysics Data System (ADS)

    Selker, J. S.; Roques, C.; Higgins, C. W.; Good, S. P.; Hut, R.; Selker, A.

    2015-12-01

    The confluence of 3-D printing, low-cost solid-state sensors, low-cost, low-power digital controllers (e.g., Arduinos), and open-source publishing (e.g., GitHub) is poised to transform environmental sensing. The Open-Source Published Environmental Sensing (OPENS) laboratory has launched and is available for all to use. OPENS combines cutting edge technologies and makes them available to the global environmental sensing community. OPENS includes a Maker lab space where people may collaborate in person or virtually via an online forum for the publication and discussion of environmental sensing technology (Corvallis, Oregon, USA; please feel free to request a free reservation for space and equipment use). The physical lab houses a test-bed for sensors, as well as a complete classical machine shop, 3-D printers, electronics development benches, and workstations for code development. OPENS will provide a web-based formal publishing framework wherein students and scientists worldwide can peer-review and publish (with DOI) novel and evolutionary advancements in environmental sensor systems. This curated and peer-reviewed digital collection will include complete sets of "printable" parts and operating computer code for sensing systems. The physical lab will include all of the machines required to produce these sensing systems. These tools can be used in person or virtually, creating a truly global venue for advancement in monitoring earth's environment and agricultural systems. In this talk we will present an example of the process of designing and publishing the design and data from the OPENS-Permeameter. The publication includes 3-D printing code, Arduino (or other control/logging platform) operational code, sample data sets, and a full discussion of the design set in the scientific context of previous related devices. Editors for the peer-review process are currently sought - contact John.Selker@Oregonstate.edu or Clement.Roques@Oregonstate.edu.

  18. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extraction and visualization of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82
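
    The extraction step described above can be illustrated with the Apache Tika Python bindings; the sketch below (the file name and search phrase are illustrative, not the project's actual pipeline) pulls the text out of a PDF and scans it for sentences mentioning spatial resolution.

        # Minimal sketch (illustrative): extract text from a publication PDF with
        # Apache Tika and print sentences that mention spatial resolution.
        from tika import parser   # starts or connects to a local Tika server

        parsed = parser.from_file("hyperspectral_paper.pdf")
        text = parsed.get("content") or ""

        for sentence in text.split("."):
            if "spatial resolution" in sentence.lower():
                print(sentence.strip())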

  19. Impacts and Viability of Open Source Software on Earth Science Metadata Clearing House and Service Registry Applications

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Cechini, M. F.; Mitchell, A.

    2011-12-01

    Earth Science applications typically deal with large amounts of data and high throughput rates, if not also high transaction rates. While Open Source is frequently used for smaller scientific applications, large scale, highly available systems frequently fall back to "enterprise" class solutions like Oracle RAC or commercial grade JEE Application Servers. NASA's Earth Observing System Data and Information System (EOSDIS) provides end-to-end capabilities for managing NASA's Earth science data from multiple sources - satellites, aircraft, field measurements, and various other programs. A core capability of EOSDIS, the Earth Observing System (EOS) Clearinghouse (ECHO), is a highly available search and order clearinghouse of over 100 million pieces of science data that has evolved from its early R&D days to a fully operational system. Over the course of this maturation ECHO has largely transitioned from commercial frameworks, databases, and operating systems to Open Source solutions...and in some cases, back. In this talk we discuss the progression of our technological solutions and our lessons learned in the areas of: high-performance, large-scale searching solutions; geospatial search capabilities and dealing with multiple coordinate systems; search and storage of variable-format source (science) data; highly available deployment solutions; and scalable (elastic) solutions for visual searching and image handling. Throughout the evolution of the ECHO system we have had to evaluate solutions with respect to performance, cost, developer productivity, reliability, and maintainability in the context of supporting global science users. Open Source solutions have played a significant role in our architecture and development, but several critical commercial components remain (or have been reinserted) to meet our operational demands.

  20. MicMac GIS application: free open source

    NASA Astrophysics Data System (ADS)

    Duarte, L.; Moutinho, O.; Teodoro, A.

    2016-10-01

    The use of Remotely Piloted Aerial Systems (RPAS) for remote sensing applications is becoming more frequent, as the technologies of the on-board cameras and of the platform itself are becoming a serious contender to satellite and airplane imagery. MicMac is a photogrammetric tool for image matching that can be used in different contexts. It is open source software and can be used from the command line or through a graphic interface (for each command). The main objective of this work was the integration of MicMac with QGIS, which is also open source software, in order to create a new open source tool applied to photogrammetry/remote sensing. The Python language was used to develop the application. This tool would be very useful for the manipulation and 3D modelling of a set of images. Specifically, the aim was to create a toolbar in QGIS offering the basic functionalities through intuitive graphic interfaces. The toolbar is composed of three buttons: produce the point cloud, create the Digital Elevation Model (DEM) and produce the orthophoto of the study area. The application was tested with 35 photos, a subset of images acquired by an RPAS in the Aguda beach area, Porto, Portugal. These were used to create a 3D terrain model and, from this model, to obtain an orthophoto and the corresponding DEM. The code is open and can be modified according to the user's requirements. This integration, combined with GIS capabilities, would be very useful to the photogrammetry and remote sensing community.
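
    For readers unfamiliar with QGIS plugin structure, the following minimal sketch shows how a Python plugin registers a toolbar button; the class and action names are assumptions for illustration, and the real MicMac plugin will differ.

        # Minimal PyQGIS sketch (hypothetical names): a plugin that adds one
        # toolbar button which would trigger a MicMac processing step.
        from qgis.PyQt.QtWidgets import QAction, QMessageBox

        class MicMacToolbarPlugin:
            def __init__(self, iface):
                self.iface = iface            # QGIS interface passed to every plugin
                self.action = None

            def initGui(self):
                self.action = QAction("Build point cloud", self.iface.mainWindow())
                self.action.triggered.connect(self.run)
                self.iface.addToolBarIcon(self.action)

            def unload(self):
                self.iface.removeToolBarIcon(self.action)

            def run(self):
                # The real plugin would invoke the MicMac command-line tools here.
                QMessageBox.information(self.iface.mainWindow(), "MicMac",
                                        "Point-cloud step would run here.")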

  1. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    NASA Astrophysics Data System (ADS)

    Singh, R.; Bermudez, L. E.

    2013-12-01

    The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years two main trends have been disrupting geospatial applications: mobile devices and context sharing. People now have more and more mobile devices to support their work and personal lives. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. And it builds upon a database container - SQLite - that's self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout is described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'. When a user is looking into an article or a product on the web
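
    Because a GeoPackage is a single SQLite file, its layer inventory can be read with nothing more than the Python standard library; the sketch below queries the gpkg_contents table defined by the GeoPackage specification (the file name is illustrative).

        # Minimal sketch: list the layers a GeoPackage carries by reading the
        # gpkg_contents table with the standard sqlite3 module.
        import sqlite3

        with sqlite3.connect("example.gpkg") as db:
            rows = db.execute(
                "SELECT table_name, data_type, srs_id FROM gpkg_contents").fetchall()

        for table_name, data_type, srs_id in rows:
            print(f"{table_name}: {data_type} (SRS {srs_id})")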

  2. Aerostat-Lofted Instrument Platform and Sampling Method for Determination of Emissions from Open Area Sources

    EPA Science Inventory

    Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...

  3. Nowcasting influenza outbreaks using open-source media report.

    SciTech Connect

    Ray, Jaideep; Brownstein, John S.

    We construct and verify a statistical method to nowcast influenza activity from a time-series of the frequency of reports concerning influenza related topics. Such reports are published electronically by both public health organizations and newspapers/media sources, and thus can be harvested easily via web crawlers. Since media reports are timely, whereas reports from public health organizations are delayed by at least two weeks, using timely, open-source data to compensate for the lag in "official" reports can be useful. We use morbidity data from networks of sentinel physicians (both the Centers for Disease Control's ILINet and France's Sentinelles network) as the gold standard of influenza-like illness (ILI) activity. The time-series of media reports is obtained from HealthMap (http://healthmap.org). We find that the time-series of media reports shows some correlation (~0.5) with ILI activity; further, this can be leveraged into an autoregressive moving average model with exogenous inputs (ARMAX model) to nowcast ILI activity. We find that the ARMAX models have more predictive skill compared to autoregressive (AR) models fitted to ILI data; i.e., it is possible to exploit the information content in the open-source data. We also find that when the open-source data are non-informative, the ARMAX models reproduce the performance of AR models. The statistical models are tested on data from the 2009 swine-flu outbreak as well as the mild 2011-2012 influenza season in the U.S.A.
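
    A minimal sketch of an ARMAX-style nowcast is given below using statsmodels; the synthetic arrays stand in for the ILINet and HealthMap time series, and the model order is an illustrative choice, not the one fitted in the study.

        # Minimal sketch (synthetic data): ARMA model with media-report counts as
        # an exogenous regressor, then a one-step nowcast.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(0)
        media_counts = rng.poisson(20, size=104).astype(float)      # weekly report counts
        ili = 0.05 * media_counts + rng.normal(0.0, 0.2, size=104)  # synthetic ILI signal

        model = ARIMA(ili, exog=media_counts, order=(2, 0, 1)).fit()
        print(model.forecast(steps=1, exog=[[22.0]]))                # nowcast for this week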

  4. A Kernel for Open Source Drug Discovery in Tropical Diseases

    PubMed Central

    Ortí, Leticia; Carbajo, Rodrigo J.; Pieper, Ursula; Eswar, Narayanan; Maurer, Stephen M.; Rai, Arti K.; Taylor, Ginger; Todd, Matthew H.; Pineda-Lucena, Antonio; Sali, Andrej; Marti-Renom, Marc A.

    2009-01-01

    Background Conventional patent-based drug development incentives work badly for the developing world, where commercial markets are usually small to non-existent. For this reason, the past decade has seen extensive experimentation with alternative R&D institutions ranging from private–public partnerships to development prizes. Despite extensive discussion, however, one of the most promising avenues—open source drug discovery—has remained elusive. We argue that the stumbling block has been the absence of a critical mass of preexisting work that volunteers can improve through a series of granular contributions. Historically, open source software collaborations have almost never succeeded without such “kernels”. Methodology/Principal Findings Here, we use a computational pipeline for: (i) comparative structure modeling of target proteins, (ii) predicting the localization of ligand binding sites on their surfaces, and (iii) assessing the similarity of the predicted ligands to known drugs. Our kernel currently contains 143 and 297 protein targets from ten pathogen genomes that are predicted to bind a known drug or a molecule similar to a known drug, respectively. The kernel provides a source of potential drug targets and drug candidates around which an online open source community can nucleate. Using NMR spectroscopy, we have experimentally tested our predictions for two of these targets, confirming one and invalidating the other. Conclusions/Significance The TDI kernel, which is being offered under the Creative Commons attribution share-alike license for free and unrestricted use, can be accessed on the World Wide Web at http://www.tropicaldisease.org. We hope that the kernel will facilitate collaborative efforts towards the discovery of new drugs against parasites that cause tropical diseases. PMID:19381286

  5. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
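
    The WMS layer-combination idea described above can also be exercised programmatically; the sketch below uses OWSLib (the service URL, layer name, and bounding box are placeholders) to request a map image from a remote WMS server.

        # Minimal sketch (placeholder URL and layer): request a PNG map from a
        # remote WMS server with OWSLib.
        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
        print(list(wms.contents))                 # layers advertised by the server

        img = wms.getmap(layers=["health:clinics"], srs="EPSG:4326",
                         bbox=(-10.0, 35.0, 5.0, 45.0), size=(600, 400),
                         format="image/png", transparent=True)
        with open("clinics.png", "wb") as f:
            f.write(img.read())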

  6. Geospatial Data Science Modeling | Geospatial Data Science | NREL

    Science.gov Websites

    Geospatial Data Science Modeling NREL uses geospatial data science modeling to develop innovative models and tools for energy professionals, project developers, and consumers. Geospatial modeling at NREL often produces the

  7. Hypersonic simulations using open-source CFD and DSMC solvers

    NASA Astrophysics Data System (ADS)

    Casseau, V.; Scanlon, T. J.; John, B.; Emerson, D. R.; Brown, R. E.

    2016-11-01

    Hypersonic hybrid hydrodynamic-molecular gas flow solvers are required to satisfy the two essential requirements of any high-speed reacting code, these being physical accuracy and computational efficiency. The James Weir Fluids Laboratory at the University of Strathclyde is currently developing an open-source hybrid code which will eventually reconcile the direct simulation Monte-Carlo method, making use of the OpenFOAM application called dsmcFoam, and the newly coded open-source two-temperature computational fluid dynamics solver named hy2Foam. In conjunction with employing the CVDV chemistry-vibration model in hy2Foam, novel use is made of the QK rates in a CFD solver. In this paper, further testing is performed, in particular with the CFD solver, to ensure its efficacy before considering more advanced test cases. The hy2Foam and dsmcFoam codes have shown to compare reasonably well, thus providing a useful basis for other codes to compare against.

  8. Building an Open Source Framework for Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Jagers, B.; Meijers, E.; Villars, M.

    2015-12-01

    In order to develop effective strategies and associated policies for environmental management, we need to understand the dynamics of the natural system as a whole and the human role therein. This understanding is gained by comparing our mental model of the world with observations from the field. However, to properly understand the system we should look at the dynamics of water, sediments, water quality, and ecology throughout the whole system from catchment to coast, both at the surface and in the subsurface. Numerical models are indispensable in helping us understand the interactions of the overall system, but we need to be able to update and adjust them to improve our understanding and test our hypotheses. To support researchers around the world with this challenging task, we started a few years ago with the development of a new open source modeling environment, DeltaShell, that integrates distributed hydrological models with 1D, 2D, and 3D hydraulic models, including generic components for tracking sediment, water quality, and ecological quantities throughout the hydrological cycle. The open source approach, combined with a modular approach based on open standards that allows for easy adjustment and expansion as demands and knowledge grow, provides an ideal starting point for addressing challenging integrated environmental questions.

  9. Status and future plans for open source QuickPIC

    NASA Astrophysics Data System (ADS)

    An, Weiming; Decyk, Viktor; Mori, Warren

    2017-10-01

    QuickPIC is a three dimensional (3D) quasi-static particle-in-cell (PIC) code developed based on the UPIC framework. It can be used for efficiently modeling plasma based accelerator (PBA) problems. With the quasi-static approximation, QuickPIC can use different time scales for calculating the beam (or laser) evolution and the plasma response, and a 3D plasma wake field can be simulated using a two-dimensional (2D) PIC code where the time variable is ξ = ct - z and z is the beam propagation direction. QuickPIC can be a thousand times faster than a conventional PIC code when simulating a PBA. It uses an MPI/OpenMP hybrid parallel algorithm, which can be run on anything from a laptop to the largest supercomputer. The open source QuickPIC is an object-oriented program with high level classes written in Fortran 2003. It can be found at https://github.com/UCLA-Plasma-Simulation-Group/QuickPIC-OpenSource.git

  10. Understanding How the "Open" of Open Source Software (OSS) Will Improve Global Health Security.

    PubMed

    Hahn, Erin; Blazes, David; Lewis, Sheri

    2016-01-01

    Improving global health security will require bold action in all corners of the world, particularly in developing settings, where poverty often contributes to an increase in emerging infectious diseases. In order to mitigate the impact of emerging pandemic threats, enhanced disease surveillance is needed to improve early detection and rapid response to outbreaks. However, the technology to facilitate this surveillance is often unattainable because of high costs, software and hardware maintenance needs, limited technical competence among public health officials, and internet connectivity challenges experienced in the field. One potential solution is to leverage open source software, a concept that is unfortunately often misunderstood. This article describes the principles and characteristics of open source software and how it may be applied to solve global health security challenges.

  11. [Geospatial models for local health surveillance].

    PubMed

    De Pietri, Diana Elba; García, Susana; Rico, Osvaldo

    2008-06-01

    The objective was to produce a geospatial model to evaluate lead exposure among school children 6-8 years of age in San Antonio Oeste, Rio Negro province, Argentina, an area contaminated by a foundry in the city center whose toxins were released into the open air. The spatial analysis, conducted from October to April 2006, included satellite interpretation and mapping of the data to geographically plot the information. Residences on dirt roads were included, as was the distance from each of the study children's homes and/or schools to the site identified as the source of the exposure. Blood samples taken from children attending schools within the area surrounding the source showed higher lead levels than those of children in other areas. These lead levels were associated with proximity to the source and/or with living on a dirt road. The highest blood lead levels corresponded to the highest environmental lead levels. Spatial analysis was shown to be a useful tool for site analysis and risk management since it indicated a definitive association between elevated lead levels and proximity to the source and/or residing on a dirt road, connections which had not been revealed by traditional epidemiological analyses. The results provided the scientific evidence needed to begin implementing interventions regarding the sources of exposure and education aimed at promoting more hygienic dietary habits among the population.

  12. Open source system OpenVPN in a function of Virtual Private Network

    NASA Astrophysics Data System (ADS)

    Skendzic, A.; Kovacic, B.

    2017-05-01

    The use of Virtual Private Networks (VPNs) can establish a high level of security in network communication. VPN technology enables highly secure networking over distributed or public network infrastructure. VPN deployments use different security and management rules inside networks and can be set up over different communication channels, such as the Internet or a separate ISP communication infrastructure. A VPN creates a secure communication channel over a public network between two endpoints (computers). OpenVPN is an open source software product under the GNU General Public License (GPL) that can be used to establish VPN communication between two computers inside a business local network over public communication infrastructure. It uses dedicated security protocols and 256-bit encryption and is capable of traversing network address translators (NATs) and firewalls. It allows computers to authenticate each other using a pre-shared secret key, certificates, or a username and password. This work gives a review of VPN technology with a special emphasis on OpenVPN. The paper also compares solutions and discusses the financial benefits of using open source VPN software in a business environment.

  13. The Pixhawk Open-Source Computer Vision Framework for Mavs

    NASA Astrophysics Data System (ADS)

    Meier, L.; Tanskanen, P.; Fraundorfer, F.; Pollefeys, M.

    2011-09-01

    Unmanned aerial vehicles (UAVs) and micro air vehicles (MAVs) are already used intensively in geodetic applications. State-of-the-art autonomous systems are, however, geared towards applications at safe, obstacle-free altitudes greater than 30 meters. Applications at lower altitudes still require a human pilot. A new application field will be the reconstruction of structures and buildings, including facades and roofs, with semi-autonomous MAVs. Ongoing research in the MAV robotics field is focusing on enabling this system class to operate at lower altitudes in proximity to nearby obstacles and humans. PIXHAWK is an open source and open hardware toolkit for this purpose. The quadrotor design is optimized for onboard computer vision and can connect up to four cameras to its onboard computer. The validity of the system design is shown with a fully autonomous capture flight along a building.

  14. openPSTD: The open source pseudospectral time-domain method for acoustic propagation

    NASA Astrophysics Data System (ADS)

    Hornikx, Maarten; Krijnen, Thomas; van Harten, Louis

    2016-06-01

    An open source implementation of the Fourier pseudospectral time-domain (PSTD) method for computing the propagation of sound is presented, geared towards applications in the built environment. Being a wave-based method, PSTD captures phenomena like diffraction, but maintains efficiency in processing time and memory usage as it allows spatial sampling close to the Nyquist criterion, thus keeping both the required spatial and temporal resolution coarse. In this implementation, the physical geometry is modelled as a composition of rectangular two-dimensional subdomains, initially restricting the implementation to orthogonal, two-dimensional situations. The strategy of using subdomains divides the problem domain into local subsets, which enables the simulation software to be built according to Object-Oriented Programming best practices and leaves room for further computational parallelization. The software is built using the open source components Blender, NumPy and Python, and has itself been published under an open source license. To accelerate the software, an option has been included to speed up the calculations through a partial implementation of the code on the Graphics Processing Unit (GPU), which increases the throughput by up to fifteen times. The details of the implementation are reported, as well as the accuracy of the code.
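
    The core pseudospectral idea, that a spatial derivative is evaluated by multiplying by i*k in wavenumber space, can be demonstrated in a few lines of NumPy; the sketch below is a generic illustration rather than openPSTD code.

        # Minimal sketch: Fourier pseudospectral derivative of sin(3x) on a
        # periodic grid; the error is at machine precision.
        import numpy as np

        n, L = 64, 2 * np.pi
        x = np.arange(n) * L / n
        u = np.sin(3 * x)

        k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wavenumbers
        dudx = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

        print(np.max(np.abs(dudx - 3 * np.cos(3 * x))))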

  15. GSKY: A scalable distributed geospatial data server on the cloud

    NASA Astrophysics Data System (ADS)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. The ability to combine information coming from different geospatial collections is in increasing demand within the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volumes inherent in these collections, storing multiple copies of them is infeasible, so such data manipulation must be performed on-the-fly using efficient, high performance techniques. Ideally this should be done with a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve large numbers of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process as
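
    One of the on-the-fly transformations mentioned above, map reprojection, can be illustrated with pyproj; this is a generic example, unrelated to GSKY's internal implementation, converting WGS84 longitude/latitude to GDA94 / Australian Albers.

        # Minimal sketch: reproject a lon/lat point (WGS84, EPSG:4326) to
        # GDA94 / Australian Albers (EPSG:3577) with pyproj.
        from pyproj import Transformer

        transformer = Transformer.from_crs("EPSG:4326", "EPSG:3577", always_xy=True)
        x, y = transformer.transform(149.13, -35.28)   # Canberra, approximately
        print(x, y)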

  16. Open-source Software for Exoplanet Atmospheric Modeling

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph

    2018-01-01

    I will present a suite of self-standing open-source tools to model and retrieve exoplanet spectra implemented for Python. These include: (1) a Bayesian-statistical package to run Levenberg-Marquardt optimization and Markov-chain Monte Carlo posterior sampling, (2) a package to compress line-transition data from HITRAN or Exomol without loss of information, (3) a package to compute partition functions for HITRAN molecules, (4) a package to compute collision-induced absorption, and (5) a package to produce radiative-transfer spectra of transit and eclipse exoplanet observations and atmospheric retrievals.

  17. A Stigmergy Approach for Open Source Software Developer Community Simulation

    SciTech Connect

    Cui, Xiaohui; Beaver, Justin M; Potok, Thomas E

    2009-01-01

    The stigmergy collaboration approach provides a hypothesized explanation of how online groups work together. In this research, we present a stigmergy approach for building an agent-based open source software (OSS) developer community collaboration simulation. We used a group of actors who collaborate on OSS projects as our frame of reference and investigated how the choices actors make in contributing their work to the projects determine the global status of the whole OSS project. In our simulation, the forum posts and project code serve as the digital pheromone, and the modified Pierre-Paul Grasse pheromone model is used for computing developer agent behavior selection probability.

  18. Diagnosing turbulence for research aircraft safety using open source toolkits

    NASA Astrophysics Data System (ADS)

    Lang, T. J.; Guy, N.

    Open source software toolkits have been developed and applied to diagnose in-cloud turbulence in the vicinity of Earth science research aircraft, via analysis of ground-based Doppler radar data. Based on multiple retrospective analyses, these toolkits show promise for detecting significant turbulence well prior to cloud penetrations by research aircraft. A pilot study demonstrated the ability to provide mission scientists turbulence estimates in near real time during an actual field campaign, and thus these toolkits are recommended for usage in future cloud-penetrating aircraft field campaigns.

  19. Open source approaches to health information systems in Kenya.

    PubMed

    Drury, Peter; Dahlman, Bruce

    2005-01-01

    This paper focuses on the experience to date of an installation of a Free Open Source Software (FOSS) product, Care2X, at a church hospital in Kenya. The FOSS movement has been maturing rapidly. In developed countries, its benefits relative to proprietary software have been extensively discussed and ways of quantifying the total costs of the development have been developed. Nevertheless, empirical data on the impact of FOSS, particularly in the developing world, concerning its use and development is still quite limited, although the possibilities of FOSS are becoming increasingly attractive.

  20. pyLIMA : an open source microlensing software

    NASA Astrophysics Data System (ADS)

    Bachelet, Etienne

    2017-01-01

    Planetary microlensing is a unique tool to detect cold planets around low-mass stars, and is approaching a watershed in discoveries as near-future missions incorporate dedicated surveys. NASA and ESA have decided to complement WFIRST-AFTA and Euclid with microlensing programs to enrich our statistics about this planetary population. Of the many challenges inherent in these missions, the data analysis is of primary importance, yet it is often perceived as a time-consuming, complex and daunting barrier to participation in the field. We present the first open source modeling software for conducting a microlensing analysis. This software is written in Python and uses existing packages as much as possible.

  1. Review on open source operating systems for internet of things

    NASA Astrophysics Data System (ADS)

    Wang, Zhengmin; Li, Wei; Dong, Huiliang

    2017-08-01

    The Internet of Things (IoT) is an environment in which devices everywhere become smart, forming a smart world. The Internet of Things is growing rapidly; it is an integrated system of uniquely identifiable communicating devices that exchange information over a connected network to provide extensive services. IoT devices have very limited memory, computational power, and power supply. Traditional operating systems (OSs) cannot meet the needs of IoT systems. In this paper, we therefore analyze the challenges of IoT operating systems and survey applicable open source OSs.

  2. An open-source java platform for automated reaction mapping.

    PubMed

    Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M

    2010-09-27

    This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.

  3. Integrating HCI Specialists into Open Source Software Development Projects

    NASA Astrophysics Data System (ADS)

    Hedberg, Henrik; Iivari, Netta

    Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too often neglected. In addition, human-computer interaction (HCI) specialists have encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority and responsibility for them. In this paper, based on HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending to the human aspects of software. The proposed model makes the presence of HCI specialists visible in the projects, and promotes interaction between developers and the HCI specialists in the course of a project.

  4. Inexpensive Open-Source Data Logging in the Field

    NASA Astrophysics Data System (ADS)

    Wickert, A. D.

    2013-12-01

    I present a general-purpose open-source field-capable data logger, which provides a mechanism to develop dense networks of inexpensive environmental sensors. This data logger was developed as a low-power variant of the Arduino open-source development system, and is named the ALog ("Arduino Logger") BottleLogger (it is slim enough to fit inside a Nalgene water bottle) version 1.0. It features an integrated high-precision real-time clock, an SD card slot for high-volume data storage, and integrated power switching. The ALog can interface with sensors via six analog/digital pins, two digital pins, and one digital interrupt pin that can read event-based inputs, such as those from a tipping-bucket rain gauge. We have successfully tested the ALog BottleLogger with ultrasonic rangefinders (for water stage and snow accumulation and melt), temperature sensors, tipping-bucket rain gauges, soil moisture and water potential sensors, resistance-based tools to measure frost heave, and cameras that it triggers based on events. The source code for the ALog, including functions to interface with a range of commercially-available sensors, is provided as an Arduino C++ library with example implementations. All schematics, circuit board layouts, and source code files are open-source and freely available under GNU GPL v3.0 and Creative Commons Attribution-ShareAlike 3.0 Unported licenses. Through this work, we hope to foster a community-driven movement to collect field environmental data on a budget that permits citizen-scientists and researchers from low-income countries to collect the same high-quality data as researchers in wealthy countries. These data can provide information about global change to managers, governments, scientists, and interested citizens worldwide. (Figure: watertight box with the ALog BottleLogger data logger on the left and a battery pack with three D cells on the right; data can be collected for 3-5 years on one set of batteries.)

  5. Nurturing reliable and robust open-source scientific software

    NASA Astrophysics Data System (ADS)

    Uieda, L.; Wessel, P.

    2017-12-01

    Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow, it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article. A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo.
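
    The automated testing mentioned above is straightforward to illustrate. Below is a minimal sketch of how a small scientific Python function could be covered by a pytest-style test; the function, its constant, and the tolerance are hypothetical examples and are not taken from any particular project discussed here.

        # test_gravity.py -- a minimal automated test in the pytest style.
        # The function under test is hypothetical; a real project would import it
        # from its own package rather than defining it inline.
        import math

        def free_air_correction(height_m):
            """Free-air gravity correction in mGal for a height in metres."""
            return 0.3086 * height_m

        def test_free_air_correction_at_100_m():
            # 0.3086 mGal/m * 100 m = 30.86 mGal
            assert math.isclose(free_air_correction(100.0), 30.86, rel_tol=1e-9)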

  6. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    NASA Astrophysics Data System (ADS)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper addresses easily accessible, integrated, web-based analysis of satellite images with plug-in-based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox for accessing satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS and remote sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping tools, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-) services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software under the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the

  7. Digital time stamping system based on open source technologies.

    PubMed

    Miskinis, Rimantas; Smirnov, Dmitrij; Urba, Emilis; Burokas, Andrius; Malysko, Bogdan; Laud, Peeter; Zuliani, Francesco

    2010-03-01

    A digital time stamping system based on open source technologies (LINUX-UBUNTU, OpenTSA, OpenSSL, MySQL) is described in detail, including all important testing results. The system, called BALTICTIME, was developed under a project sponsored by the European Commission under Framework Programme 6 (FP6). It was designed to meet the requirements posed on systems for legal and accountable time stamping and to be applicable to the hardware commonly used by national time metrology laboratories. The BALTICTIME system is intended for use by governmental and other institutions as well as private persons. Testing results demonstrate that the time stamps issued to the user by BALTICTIME and saved in BALTICTIME's archives (which implies that the time stamps are accountable) meet all the regulatory requirements. Moreover, the BALTICTIME in its present implementation is able to issue more than 10 digital time stamps per second, and the system can be enhanced if needed. The test version of the BALTICTIME service is free and available at http://baltictime.pfi.lt:8080/btws/ and http://baltictime.lnmc.lv:8080/btws/.

  8. A national assessment of the effect of intensive agro-land use practices on nonpoint source pollution using emission scenarios and geo-spatial data.

    PubMed

    Zhuo, Dong; Liu, Liming; Yu, Huirong; Yuan, Chengcheng

    2018-01-01

    China's intensive agriculture has led to a broad range of adverse impacts upon ecosystems and thereby caused environmental quality degradation. One of the fundamental problems facing land managers when dealing with agricultural nonpoint source (NPS) pollution is to quantitatively assess the NPS pollution loads from different sources at a national scale. In this study, export scenarios and geo-spatial data were used to calculate the agricultural NPS pollution loads of nutrients, pesticides, plastic film residue, and crop straw burning in China. The results provide comprehensive baseline knowledge of agricultural NPS pollution from China's arable farming system in 2014. First, the nitrogen (N) and phosphorus (P) emission loads to the water environment were estimated to be 1.44 Tg N and 0.06 Tg P, respectively. East and south China showed the highest load intensities of nutrient release to the aquatic system. Second, the loss to water of seven pesticides that are widely used in China was estimated to be 30.04 tons (active ingredient (ai)). Acetochlor was the major source of pesticide loss to water, contributing 77.65% of the total loss. The environmental impacts of pesticide usage in east and south China were higher than in other parts of the country. Third, 19.75% of the plastic film applied remained in arable soils, contributing substantially to soil phthalate ester (PAE) contamination. Fourth, 14.11% of the straw produced was burnt in situ, mostly from May to July (post-winter wheat harvest) in the North China Plain and from October to November (post-rice harvest) in southeast China. All of the above agricultural NPS pollution loadings were unevenly distributed across China. The spatial correlations between pollution loads at the land unit scale were also estimated. Rising labor costs in rural China might be a possible explanation for the generally positive correlations of the NPS pollution loads. It also indicated co-occurring higher NPS pollution loads and a higher
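
    The load estimates described above follow the general logic of an export-coefficient calculation. The sketch below shows that arithmetic in Python with invented land-use areas and coefficients; it is a generic illustration of the approach, not the authors' actual scenarios, coefficients, or data.

        # Generic export-coefficient load estimate: load = sum(area_i * coefficient_i).
        # Areas (ha) and nitrogen export coefficients (kg N / ha / yr) are illustrative only.
        land_use_areas_ha = {"paddy": 1200.0, "upland_crops": 800.0, "orchard": 300.0}
        n_export_coeff_kg_per_ha = {"paddy": 9.5, "upland_crops": 14.0, "orchard": 6.0}

        total_n_load_kg = sum(
            land_use_areas_ha[lu] * n_export_coeff_kg_per_ha[lu] for lu in land_use_areas_ha
        )
        print(f"Estimated nitrogen load: {total_n_load_kg / 1000:.1f} t/yr")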

  9. Open Source, Crowd Source: Harnessing the Power of the People behind Our Libraries

    ERIC Educational Resources Information Center

    Trainor, Cindi

    2009-01-01

    Purpose: The purpose of this paper is to provide an insight into the use of Web 2.0 and Library 2.0 technologies so that librarians can combine open source software with user-generated content to create a richer discovery experience for their users. Design/methodology/approach: Following a description of the current state of integrated library…

  10. Instrumentino: An Open-Source Software for Scientific Instruments.

    PubMed

    Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C

    2015-01-01

    Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.

  11. Robust, open-source removal of systematics in Kepler data

    NASA Astrophysics Data System (ADS)

    Aigrain, S.; Parviainen, H.; Roberts, S.; Reece, S.; Evans, T.

    2017-10-01

    We present ARC2 (Astrophysically Robust Correction 2), an open-source, Python-based systematics-correction pipeline for the Kepler prime mission long-cadence light curves. The ARC2 pipeline identifies and corrects isolated discontinuities in the light curves and then removes trends common to many light curves. These trends are modelled using the publicly available co-trending basis vectors, within an (approximate) Bayesian framework with 'shrinkage' priors to minimize the risk of overfitting and the injection of additional noise into the corrected light curves, while keeping astrophysical signals intact. We show that the ARC2 pipeline's performance matches that of the standard Kepler PDC-MAP data products using standard noise metrics, and demonstrate its ability to preserve astrophysical signals using injection tests with simulated stellar rotation and planetary transit signals. Although it is not identical, the ARC2 pipeline can thus be used as an open-source alternative to PDC-MAP whenever the ability to model the impact of the systematics removal process on other kinds of signal is important.
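
    The idea of fitting co-trending basis vectors under a shrinkage prior can be conveyed with a much simplified sketch: a ridge-like regression of a light curve on a few basis vectors, where the penalty term plays the role of the prior. This is only an illustration of the technique with synthetic data, not the ARC2 pipeline or its actual basis vectors.

        import numpy as np

        # Simplified co-trending with a shrinkage (ridge-like) prior; synthetic data only.
        rng = np.random.default_rng(0)
        n_points, n_basis = 500, 4
        cbv = rng.standard_normal((n_points, n_basis))        # stand-in basis vectors
        true_weights = np.array([0.8, -0.3, 0.0, 0.0])
        flux = cbv @ true_weights + 0.05 * rng.standard_normal(n_points)

        lam = 1.0  # shrinkage strength; larger values pull the weights toward zero
        weights = np.linalg.solve(cbv.T @ cbv + lam * np.eye(n_basis), cbv.T @ flux)
        corrected_flux = flux - cbv @ weights
        print("fitted weights:", np.round(weights, 3))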

  12. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  13. Building integrated business environments: analysing open-source ESB

    NASA Astrophysics Data System (ADS)

    Martínez-Carreras, M. A.; García Jimenez, F. J.; Gómez Skarmeta, A. F.

    2015-05-01

    Integration and interoperability are two concepts that have gained significant prominence in the business field, providing tools which enable enterprise application integration (EAI). In this sense, the enterprise service bus (ESB) has played a crucial role as the underpinning technology for creating integrated environments in which companies may connect all their legacy applications. However, the full potential of these technologies remains unexplored, and some important features are not used when developing suitable business environments. The aim of this paper is to describe and detail the elements for building the next generation of integrated business environments (IBE) and to analyse the features of ESBs as the core of this infrastructure. For this purpose, we evaluate how well-known open-source ESB products fulfil these needs. Moreover, we introduce a scenario in which the collaborative system 'Alfresco' is integrated in the business infrastructure. Finally, we provide a comparison of the different open-source ESBs available for IBE requirements. According to this study, Fuse ESB provides the best results, considering features such as support for a wide variety of standards and specifications, documentation and implementation, security, advanced business trends, ease of integration and performance.

  14. OPEN-SOURCE SOFTWARE IN DENTISTRY: A SYSTEMATIC REVIEW.

    PubMed

    Chruściel-Nogalska, Małgorzata; Smektała, Tomasz; Tutak, Marcin; Sporniak-Tutak, Katarzyna; Olszewski, Raphael

    2017-01-01

    Technological development and the need for electronic health records management have resulted in the need for a computer with dedicated, commercial software in daily dental practice. The alternative to commercial software may be open-source solutions. Therefore, this study reviewed the current literature on the availability and use of open-source software (OSS) in dentistry. A comprehensive database search was performed on February 1, 2017. Only articles published in peer-reviewed journals with a focus on the use or description of OSS were retrieved. The level of evidence, according to the Oxford EBM Centre Levels of Evidence Scale, was classified for all studies. Experimental studies underwent additional quality reporting assessment. The screening and evaluation process resulted in twenty-one studies from 1,940 articles found, with 10 of them being experimental studies. None of the articles provided level 1 evidence, and only one study was considered high quality following quality assessment. Twenty-six different OSS programs were described in the included studies, of which ten were used for image visualization, five for healthcare records management, four for education processes, one for remote consultation and simulation, and six for general purposes. Our analysis revealed that the dental literature on OSS consists of scarce, incomplete, and methodologically low quality information.

  15. Clarity: An Open Source Manager for Laboratory Automation

    PubMed Central

    Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.

    2013-01-01

    Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169

  16. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data assimilation techniques are essential elements in the state-of-the-art development of models and their optimization with data in the fields of groundwater, surface water and soil systems. They are essential tools in the calibration of complex modelling systems and the improvement of model forecasts. OpenDA is a new, generic open source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of COSTA, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan; 2007], and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows the combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood
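
    OpenDA itself is a Java framework; purely to illustrate the kind of filtering technique such an environment implements, here is a minimal stochastic ensemble Kalman filter analysis step in Python. All dimensions, the observation operator, and the values are invented for the sketch and have no connection to any OpenDA application.

        import numpy as np

        # Minimal stochastic ensemble Kalman filter (EnKF) analysis step, illustrative only.
        rng = np.random.default_rng(42)
        n_state, n_obs, n_ens = 10, 3, 50

        ensemble = rng.standard_normal((n_state, n_ens))     # forecast ensemble (one member per column)
        H = np.zeros((n_obs, n_state)); H[[0, 1, 2], [0, 4, 9]] = 1.0   # observation operator
        R = 0.1 * np.eye(n_obs)                              # observation error covariance
        y = np.array([0.5, -0.2, 1.0])                       # observations

        X_mean = ensemble.mean(axis=1, keepdims=True)
        A = ensemble - X_mean                                # ensemble anomalies
        HA = H @ A
        P_yy = HA @ HA.T / (n_ens - 1) + R
        P_xy = A @ HA.T / (n_ens - 1)
        K = P_xy @ np.linalg.inv(P_yy)                       # Kalman gain

        # Perturb the observations for each member and apply the update.
        y_perturbed = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
        analysis = ensemble + K @ (y_perturbed - H @ ensemble)
        print("analysis mean:", np.round(analysis.mean(axis=1), 3))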

  17. Free and Open Source Software underpinning the European Forest Data Centre

    NASA Astrophysics Data System (ADS)

    Rodriguez Aseretto, Dario; Di Leo, Margherita; de Rigo, Daniele; Corti, Paolo; McInerney, Daniel; Camia, Andrea; San-Miguel-Ayanz, Jesús

    2013-04-01

    forest data and information (see also [18]). A set of web-based tools allows accessing the information located in EFDAC. The following applications, running on GNU/Linux platforms, are the core elements of EFDAC: (a.1) a metadata client allows users to search for EFDAC-related spatial datasets, while (a.2) is a customized web map service that allows the user to visualize, navigate and query available maps and derived geo-datasets on several forest-related topics. The database system currently relies on ORACLE and PostgreSQL [24] with PostGIS [25]. EFFIS (a.3) [26-33] is a comprehensive system covering the full cycle of forest-fire management. The system supports forest-fire prevention and fighting in Europe, North Africa and Middle East countries through the provision of timely and reliable information on forest-fires [29,30,32]. Within EFFIS, UMN Mapserver [34] is used for the management and publication of the fire behavior forecast and the other fire-related layers in a wide range of formats, including INSPIRE and Open Geospatial Consortium (OGC) standards. Transdisciplinary modelling research: the EFDAC portal [39] provides data and information which rely on coordinated research [40-50] on wide-scale transdisciplinary modelling for environment (WSTMe) [51]. This contributed to advanced computational modelling approaches such as morphological spatial pattern analysis (MSPA) [52-54] and geospatial semantic array programming (GeoSemAP) [51,55]. FOSS is here essential. For example, GeoSemAP is based on a semantically-enhanced [56,57] joint use of geospatial and array programming [58] tools (c) where semantic transparency also implies FOSS use. References: Hahn, R. W., Bessen, J., Evans, D. S., Lessig, L., Smith, B. L., 2009. Government Policy Toward Open Source Software. Hahn, R. W. (Ed.). ISBN: 0-8157-3393-3 http://dx.doi.org/10.2139/ssrn. Free Software Foundation, 2012. What is free software? http://www.gnu.org/philosophy/free-sw.html (revision 1.118 archived at

  18. Comparative Analysis Study of Open Source GIS in Malaysia

    NASA Astrophysics Data System (ADS)

    Rasid, Muhammad Zamir Abdul; Kamis, Naddia; Khuizham Abd Halim, Mohd

    2014-06-01

    Open source might represent a major prospective change, capable of delivering value in various industries and providing a competitive means for developing countries. The leading purpose of this research is to discover the degree of adoption of Open Source Software (OSS) connected with Geographic Information System (GIS) applications within Malaysia, which may be limited by inadequate awareness of open source concepts or by technical deficiencies in the open source tools. This research was carried out in two significant stages. The first stage involved a survey questionnaire to evaluate awareness and acceptance levels based on comparative feedback regarding OSS and commercial GIS. This survey was conducted among three groups of respondents: government servants, university students and lecturers, and individuals. Awareness was measured using a comprehension indicator and a perception indicator for each survey question; these indicators were designed during the analysis to provide a measurable and descriptive basis for the final result. The second stage involved an interview session with a major organisation that operates open source web GIS, the Federal Department of Town and Country Planning Peninsular Malaysia (JPBD). The aim of this preliminary study was to understand the viewpoints of different groups of people on open source, and whether their insufficient awareness of open source concepts and possibilities may be a significant factor in the level of adoption of open source options.

  19. GEOSPATIAL DATA ACCURACY ASSESSMENT

    EPA Science Inventory

    The development of robust accuracy assessment methods for the validation of spatial data represents a difficult scientific challenge for the geospatial science community. The importance and timeliness of this issue is related directly to the dramatic escalation in the developmen...

  20. Combining Open-Source Packages for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    Schmidt, Albrecht; Grieger, Björn; Völk, Stefan

    2015-04-01

    The science planning of the ESA Rosetta mission has presented challenges which were addressed by combining various open-source software packages, such as the SPICE toolkit, the Python language and the Web graphics library three.js. The challenge was to compute certain parameters from a pool of trajectories and (possible) attitudes to describe the behaviour of the spacecraft. To be able to do this declaratively and efficiently, a C library was implemented that allows the SPICE toolkit to be interfaced for geometrical computations from the Python language and processes as much data as possible during one subroutine call. To minimise the lines of code one has to write, special care was taken to ensure that the bindings were idiomatic and thus integrate well into the Python language and ecosystem. When done well, this greatly simplifies the structure of the code and facilitates testing for correctness by automatic test suites and visual inspection. For rapid visualisation and confirmation of the correctness of results, the geometries were visualised with the three.js library, a popular JavaScript library for displaying three-dimensional graphics in a Web browser. Programmatically, this was achieved by generating data files from SPICE sources that were included in templated HTML and displayed by a browser, thus made easily accessible to interested parties at large. As feedback came in and new ideas were to be explored, the authors benefited greatly from the design of the Python-to-SPICE library, which allowed the expression of algorithms to be concise and easier to communicate. In summary, by combining several well-established open-source tools, we were able to put together a flexible computation and visualisation environment that helped communicate and build confidence in planning ideas.
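
    The abstract describes the authors' own Python-to-SPICE C library; as a sketch of the same kind of geometry query, the snippet below uses the community spiceypy bindings instead. The meta-kernel filename and the target body name are placeholders that would have to be replaced with a real Rosetta kernel set.

        import spiceypy as spice

        # Illustrative SPICE geometry query via the community spiceypy bindings
        # (not the authors' custom C library). 'rosetta.tm' is a placeholder meta-kernel.
        spice.furnsh("rosetta.tm")

        et = spice.str2et("2015-04-01T00:00:00")
        # Spacecraft position relative to the comet in the J2000 frame, no aberration correction.
        pos, light_time = spice.spkpos("ROSETTA", et, "J2000", "NONE", "CHURYUMOV-GERASIMENKO")
        print("distance to comet [km]:", sum(c * c for c in pos) ** 0.5)

        spice.kclear()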

  1. GRASS GIS: The first Open Source Temporal GIS

    NASA Astrophysics Data System (ADS)

    Gebbert, Sören; Leppelt, Thomas

    2015-04-01

    GRASS GIS is a full-featured, general-purpose Open Source geographic information system (GIS) with raster, 3D raster and vector processing support [1]. Recently, time was introduced as a new dimension that transformed GRASS GIS into the first Open Source temporal GIS with comprehensive spatio-temporal analysis, processing and visualization capabilities [2]. New spatio-temporal data types were introduced in GRASS GIS version 7 to manage raster, 3D raster and vector time series. These new data types are called space time datasets. They are designed to efficiently handle hundreds of thousands of time-stamped raster, 3D raster and vector map layers of any size. Time stamps can be defined as time intervals or time instances in Gregorian calendar time or relative time. Space time datasets simplify the processing and analysis of large time series in GRASS GIS, since these new data types are used as input and output parameters in temporal modules. The handling of space time datasets is therefore analogous to the handling of raster, 3D raster and vector map layers in GRASS GIS. A new dedicated Python library, the GRASS GIS Temporal Framework, was designed to implement the spatio-temporal data types and their management. The framework provides the functionality to efficiently handle hundreds of thousands of time-stamped map layers and their spatio-temporal topological relations. The framework supports reasoning based on the temporal granularity of space time datasets as well as their temporal topology. It was designed in conjunction with the PyGRASS [3] library to support parallel processing of large datasets, which has a long tradition in GRASS GIS [4,5]. We will present a subset of more than 40 temporal modules that were implemented based on the GRASS GIS Temporal Framework, PyGRASS and the GRASS GIS Python scripting library. These modules provide a comprehensive temporal GIS tool set. The functionality ranges from space time dataset and time-stamped map layer management
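
    To give a flavour of how the temporal modules mentioned above are typically driven from Python, here is a sketch using grass.script. The module names (t.create, t.register, t.rast.aggregate) are real GRASS 7 temporal modules, but the dataset and map names are hypothetical, the call must run inside a GRASS session, and the parameters are quoted from memory of the manuals and should be checked against the current documentation.

        # Sketch of using the GRASS GIS temporal framework from Python via grass.script.
        # Must be run inside a GRASS session; dataset/map names are hypothetical and
        # parameter names should be verified against the GRASS 7 manuals.
        import grass.script as gs

        # Create an empty space time raster dataset (STRDS).
        gs.run_command("t.create", type="strds", temporaltype="absolute",
                       output="precip_monthly", title="Monthly precipitation",
                       description="Hypothetical monthly precipitation series")

        # Register existing raster maps with a start time and a monthly increment.
        gs.run_command("t.register", input="precip_monthly",
                       maps="precip_2020_01,precip_2020_02,precip_2020_03",
                       start="2020-01-01", increment="1 month", flags="i")

        # Aggregate the series to yearly sums.
        gs.run_command("t.rast.aggregate", input="precip_monthly",
                       output="precip_yearly", basename="precip_year",
                       granularity="1 year", method="sum")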

  2. Owgis 2.0: Open Source Java Application that Builds Web GIS Interfaces for Desktop and Mobile Devices

    NASA Astrophysics Data System (ADS)

    Zavala Romero, O.; Chassignet, E.; Zavala-Hidalgo, J.; Pandav, H.; Velissariou, P.; Meyer-Baese, A.

    2016-12-01

    OWGIS is an open source Java and JavaScript application that builds easily configurable Web GIS sites for desktop and mobile devices. The current version of OWGIS generates mobile interfaces based on HTML5 technology and can be used to create mobile applications. The style of the generated websites can be modified using COMPASS, a well-known CSS authoring framework. In addition, OWGIS uses several Open Geospatial Consortium standards to request data from the most common map servers, such as GeoServer. It is also able to request data from ncWMS servers, allowing the websites to display 4D data from NetCDF files. This application is configured by XML files that define which layers (geographic datasets) are displayed on the Web GIS sites. Among other features, OWGIS allows for animations; streamlines from vector data; virtual globe display; vertical profiles and vertical transects; different color palettes; the ability to download data; and display of text in multiple languages. OWGIS users are mainly scientists in the oceanography, meteorology and climate fields.

  3. Sharing Lessons-Learned on Effective Open Data, Open-Source Practices from OpenAQ, a Global Open Air Quality Community.

    NASA Astrophysics Data System (ADS)

    Hasenkopf, C. A.

    2017-12-01

    Increasingly, open data, open-source projects are unearthing rich datasets and tools, previously impossible for more traditional avenues to generate. These projects are possible, in part, because of the emergence of online collaborative and code-sharing tools, decreasing costs of cloud-based services to fetch, store, and serve data, and increasing interest of individuals to contribute their time and skills to 'open projects.' While such projects have generated palpable enthusiasm from many sectors, many of these projects face uncharted paths for sustainability, visibility, and acceptance. Our project, OpenAQ, is an example of an open-source, open data community that is currently forging its own uncharted path. OpenAQ is an open air quality data platform that aggregates and universally formats government and research-grade air quality data from 50 countries across the world. To date, we make available more than 76 million air quality (PM2.5, PM10, SO2, NO2, O3, CO and black carbon) data points through an open Application Programming Interface (API) and a user-customizable download interface at https://openaq.org. The goal of the platform is to enable an ecosystem of users to advance air pollution efforts from science to policy to the private sector. The platform is also an open-source project (https://github.com/openaq) and has only been made possible through the coding and data contributions of individuals around the world. In our first two years of existence, we have seen requests for data to our API skyrocket to more than 6 million datapoints per month, and use-cases as varied as ingesting data aggregated from our system into real-time models of wildfires to building open-source statistical packages (e.g. ropenaq and py-openaq) on top of the platform to creating public-friendly apps and chatbots. We will share a whirl-wind trip through our evolution and the many lessons learned so far related to platform structure, community engagement, organizational model type
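
    As a sketch of how an open API like the one described above might be queried from Python, the snippet below issues a simple HTTP request with the requests library. The endpoint path and parameter names are assumptions modelled on the OpenAQ v1 API as described around the time of this abstract and may have changed since; treat them as placeholders rather than a documented interface.

        import requests

        # Hedged sketch: query an OpenAQ-style measurements endpoint.
        # Endpoint path and parameter names are assumptions based on the v1-era API.
        resp = requests.get(
            "https://api.openaq.org/v1/measurements",
            params={"city": "Delhi", "parameter": "pm25", "limit": 5},
            timeout=30,
        )
        resp.raise_for_status()
        for m in resp.json().get("results", []):
            print(m.get("date", {}).get("utc"), m.get("value"), m.get("unit"))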

  4. An open source device for operant licking in rats

    PubMed Central

    Longley, Matthew; Willis, Ethan L.; Tay, Cindy X.

    2017-01-01

    We created an easy-to-use device for operant licking experiments and another device that records environmental variables. Both devices use the Raspberry Pi computer to obtain data from multiple input devices (e.g., radio frequency identification tag readers, touch and motion sensors, environmental sensors) and activate output devices (e.g., LED lights, syringe pumps) as needed. Data gathered from these devices are stored locally on the computer but can be automatically transferred to a remote server via a wireless network. We tested the operant device by training rats to obtain either sucrose or water under the control of a fixed ratio, a variable ratio, or a progressive ratio reinforcement schedule. The lick data demonstrated that the device has sufficient precision and time resolution to record the fast licking behavior of rats. Data from the environment monitoring device also showed reliable measurements. By providing the source code and 3D design under an open source license, we believe these examples will stimulate innovation in behavioral studies. The source code can be found at http://github.com/chen42/openbehavior. PMID:28229020
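
    The reinforcement schedules mentioned above are simple to express in code. The following is a generic sketch of a progressive-ratio schedule loop with the hardware access stubbed out; it is not the authors' published source (which is at the GitHub link above), and the sensor and pump functions are hypothetical placeholders.

        import time

        # Generic progressive-ratio schedule: the number of licks required for each
        # successive reward grows by a fixed step. Hardware access is stubbed out.
        def detect_lick():
            # Placeholder for a touch-sensor read (e.g., via GPIO on a Raspberry Pi).
            time.sleep(0.05)
            return True

        def deliver_reward():
            # Placeholder for activating a syringe pump.
            print("reward delivered")

        def run_progressive_ratio(step=2, max_rewards=5):
            requirement, licks, rewards = 1, 0, 0
            while rewards < max_rewards:
                if detect_lick():
                    licks += 1
                if licks >= requirement:
                    deliver_reward()
                    rewards += 1
                    requirement += step   # ratio requirement increases after each reward
                    licks = 0

        if __name__ == "__main__":
            run_progressive_ratio()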

  5. ACToR Chemical Structure processing using Open Source ...

    EPA Pesticide Factsheets

    ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from over 1,950 public sources. ACToR contains chemical structure information and toxicological data for over 558,000 unique chemicals. The database primarily includes data from NCCT research programs, in vivo toxicity data from ToxRef, human exposure data from ExpoCast, high-throughput screening data from ToxCast and high quality chemical structure information from the EPA DSSTox program. The DSSTox database is a chemical structure inventory for the NCCT programs and currently has about 16,000 unique structures. Included are also data from PubChem, ChemSpider, USDA, FDA, NIH and several other public data sources. ACToR has been a resource to various international and national research groups. Most of our recent efforts on ACToR are focused on improving the structural identifiers and Physico-Chemical properties of the chemicals in the database. Organizing this huge collection of data and improving the chemical structure quality of the database has posed some major challenges. Workflows have been developed to process structures, calculate chemical properties and identify relationships between CAS numbers. The Structure processing workflow integrates web services (PubChem and NIH NCI Cactus) to d
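
    The NIH NCI Cactus structure resolver mentioned above exposes a simple URL pattern for converting chemical identifiers. The sketch below shows one way to resolve an identifier to a SMILES string with Python requests; the example identifier is arbitrary and this is only an illustration of the service, not the ACToR workflow code itself.

        import requests

        # Resolve a chemical identifier (here a CAS number for caffeine) to SMILES via
        # the NIH NCI Cactus chemical identifier resolver. Illustrative only; the real
        # ACToR workflows combine this kind of service with PubChem and internal tools.
        identifier = "58-08-2"  # caffeine, used purely as an example
        url = f"https://cactus.nci.nih.gov/chemical/structure/{identifier}/smiles"
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        print(identifier, "->", resp.text.strip())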

  6. An open source device for operant licking in rats.

    PubMed

    Longley, Matthew; Willis, Ethan L; Tay, Cindy X; Chen, Hao

    2017-01-01

    We created an easy-to-use device for operant licking experiments and another device that records environmental variables. Both devices use the Raspberry Pi computer to obtain data from multiple input devices (e.g., radio frequency identification tag readers, touch and motion sensors, environmental sensors) and activate output devices (e.g., LED lights, syringe pumps) as needed. Data gathered from these devices are stored locally on the computer but can be automatically transferred to a remote server via a wireless network. We tested the operant device by training rats to obtain either sucrose or water under the control of a fixed ratio, a variable ratio, or a progressive ratio reinforcement schedule. The lick data demonstrated that the device has sufficient precision and time resolution to record the fast licking behavior of rats. Data from the environment monitoring device also showed reliable measurements. By providing the source code and 3D design under an open source license, we believe these examples will stimulate innovation in behavioral studies. The source code can be found at http://github.com/chen42/openbehavior.

  7. Open source GIS for HIV/AIDS management

    PubMed Central

    Vanmeulebrouk, Bas; Rivett, Ulrike; Ricketts, Adam; Loudon, Melissa

    2008-01-01

    Background Reliable access to basic services can improve a community's resilience to HIV/AIDS. Accordingly, work is being done to upgrade the physical infrastructure in affected areas, often employing a strategy of decentralised service provision. Spatial characteristics are one of the major determinants in implementing services, even in the smaller municipal areas, and good quality spatial information is needed to inform decision making processes. However, limited funds, technical infrastructure and human resource capacity result in little or no access to spatial information for crucial infrastructure development decisions at local level. This research investigated whether it would be possible to develop a GIS for basic infrastructure planning and management at local level. Given the resource constraints of the local government context, particularly in small municipalities, it was decided that open source software should be used for the prototype system. Results The design and development of a prototype system illustrated that it is possible to develop an open source GIS system that can be used within the context of local information management. Usability tests show a high degree of usability for the system, which is important considering the heavy workload and high staff turnover that characterises local government in South Africa. Local infrastructure management stakeholders interviewed in a case study of a South African municipality see the potential for the use of GIS as a communication tool and are generally positive about the use of GIS for these purposes. They note security issues that may arise through the sharing of information, lack of skills and resource constraints as the major barriers to adoption. Conclusion The case study shows that spatial information is an identified need at local level. Open source GIS software can be used to develop a system to provide local-level stakeholders with spatial information. However, the suitability of the technology

  8. Improving Data Catalogs with Free and Open Source Software

    NASA Astrophysics Data System (ADS)

    Schweitzer, R.; Hankin, S.; O'Brien, K.

    2013-12-01

    The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will be discussing the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to both crawl, analyze and build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM). We'll also demonstrate how we are
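
    The catalog-crawling step can be illustrated in Python with Unidata's siphon library, even though the tool chain described above is largely Java-based (THREDDS, ncISO, the netCDF Java CDM). The catalog URL below is a placeholder, and this is a stand-in sketch for the idea of walking a THREDDS catalog tree, not the UAF crawler itself.

        from siphon.catalog import TDSCatalog

        # Illustrative THREDDS catalog walk using Unidata's siphon.
        def walk(cat, depth=1):
            for name in cat.datasets:
                print("dataset:", name)
            if depth > 0:
                for ref_name, ref in cat.catalog_refs.items():
                    print("sub-catalog:", ref_name)
                    walk(ref.follow(), depth - 1)

        # Placeholder top-level catalog URL; a real crawl would start from a data
        # provider's THREDDS server.
        walk(TDSCatalog("https://example.org/thredds/catalog.xml"))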

  9. Challenges of the Open Source Component Marketplace in the Industry

    NASA Astrophysics Data System (ADS)

    Ayala, Claudia; Hauge, Øyvind; Conradi, Reidar; Franch, Xavier; Li, Jingyue; Velle, Ketil Sandanger

    The reuse of Open Source Software components available on the Internet is playing a major role in the development of Component Based Software Systems. Nevertheless, the special nature of the OSS marketplace has taken the “classical” concept of software reuse based on centralized repositories to a completely different arena based on massive reuse over Internet. In this paper we provide an overview of the actual state of the OSS marketplace, and report preliminary findings about how companies interact with this marketplace to reuse OSS components. Such data was gathered from interviews in software companies in Spain and Norway. Based on these results we identify some challenges aimed to improve the industrial reuse of OSS components.

  10. Frapbot: An open-source application for FRAP data.

    PubMed

    Kohze, Robin; Dieteren, Cindy E J; Koopman, Werner J H; Brock, Roland; Schmidt, Samuel

    2017-08-01

    We introduce Frapbot, a free-of-charge open source software web application written in R, which provides manual and automated analyses of fluorescence recovery after photobleaching (FRAP) datasets. For automated operation, starting from data tables containing columns of time-dependent intensity values for various regions of interest within the images, a pattern recognition algorithm recognizes the relevant columns and identifies the presence or absence of prebleach values and the time point of photobleaching. Raw data, residuals, normalization, and boxplots indicating the distribution of half-times of recovery (t1/2) of all uploaded files are visualized instantly in a batch-wise manner using a variety of user-definable fitting options. The fitted results are provided as a .zip file, which contains .csv formatted output tables. Alternatively, the user can manually control any of the options described earlier. © 2017 International Society for Advancement of Cytometry.
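
    Purely to illustrate what a half-time of recovery means, here is a generic single-exponential FRAP fit with SciPy on synthetic data. Frapbot's own fitting options are configurable and implemented in R; the model form, parameter values, and noise level below are assumptions for the sketch.

        import numpy as np
        from scipy.optimize import curve_fit

        # Generic single-exponential FRAP recovery model: I(t) = A * (1 - exp(-k t)).
        # Synthetic data only; not Frapbot's internal fitting code.
        def recovery(t, amplitude, rate):
            return amplitude * (1.0 - np.exp(-rate * t))

        t = np.linspace(0, 60, 120)                       # seconds after the bleach
        rng = np.random.default_rng(1)
        intensity = recovery(t, 0.8, 0.12) + 0.02 * rng.standard_normal(t.size)

        (amplitude, rate), _ = curve_fit(recovery, t, intensity, p0=(1.0, 0.1))
        t_half = np.log(2.0) / rate                       # half-time of recovery
        print(f"mobile fraction ~ {amplitude:.2f}, t1/2 ~ {t_half:.1f} s")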

  11. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools allow management of HRECG signal analysis, searching for patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing access time to patient information. The CEHR system was completely developed using open source software. Preliminary results of process validation showed the system's efficiency.

  12. Do Open Source LMSs Support Personalization? A Comparative Evaluation

    NASA Astrophysics Data System (ADS)

    Kerkiri, Tania; Paleologou, Angela-Maria

    A number of parameters that support LMS capabilities for content personalization are presented and substantiated. These parameters constitute critical criteria for an exhaustive investigation of the personalization capabilities of the most popular open source LMSs. Results are comparatively shown and commented upon, thus highlighting a course of conduct for the implementation of new personalization methodologies for these LMSs, aligned with their existing infrastructure, so as to maintain support for the numerous educational institutions entrusting a major part of their curricula to them. Meanwhile, new capabilities arise from a more efficient description of the existing resources, especially when organized into widely available repositories, leading to qualitatively advanced learner-oriented courses which would ideally meet the challenge of combining personification of demand and personalization of thematic content at once.

  13. pyLIMA : The first open source microlensing modeling software

    NASA Astrophysics Data System (ADS)

    Bachelet, Etienne; Street, Rachel; Bozza, Valerio

    2018-01-01

    Microlensing is highly sensitive to planets beyond the snowline and distributed along the line of sight towards the Galactic Bulge. The WFIRST-AFTA mission should detect about 3000 of these planets and significantly improve our knowledge of planet formation and statistics, complementing results found by transit and radial velocity methods. However, the modeling of microlensing events is challenging in several respects, leading to highly time-consuming analysis. After a brief summary of these challenges, I will present pyLIMA, the first open source microlensing modeling software. The goals of this software are to be flexible, powerful and user-friendly. This presentation will focus on various cases and early results.
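
    pyLIMA's own interfaces are not reproduced here; the sketch below simply evaluates the standard point-source point-lens (Paczynski) magnification that any microlensing model builds on, with invented event parameters, to show the kind of quantity such modeling software fits to data.

        import numpy as np

        # Standard point-source point-lens (Paczynski) magnification:
        #   u(t) = sqrt(u0^2 + ((t - t0)/tE)^2),  A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)).
        # Event parameters are invented for illustration; this is not pyLIMA code.
        t0, u0, tE = 0.0, 0.1, 30.0      # days: time of peak, impact parameter, Einstein time
        t = np.linspace(-60, 60, 7)

        u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
        A = (u**2 + 2) / (u * np.sqrt(u**2 + 4))
        for ti, Ai in zip(t, A):
            print(f"t = {ti:+6.1f} d  magnification = {Ai:6.2f}")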

  14. Open Source GIS Connectors to NASA GES DISC Satellite Data

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Pham, Long; Yang, Wenli

    2014-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) houses a suite of high spatiotemporal resolution GIS data including satellite-derived and modeled precipitation, air quality, and land surface parameter data. The data are valuable to various GIS research and applications at regional, continental, and global scales. On the other hand, many GIS users, especially those from the ArcGIS community, have difficulties in obtaining, importing, and using our data due to factors such as the variety of data products, the complexity of satellite remote sensing data, and the data encoding formats. We introduce a simple open source ArcGIS data connector that significantly simplifies the access and use of GES DISC data in ArcGIS.

  15. Open-source products for a lighting experiment device.

    PubMed

    Gildea, Kevin M; Milburn, Nelda

    2014-12-01

    The capabilities of open-source software and microcontrollers were used to construct a device for controlled lighting experiments. The device was designed to ascertain whether individuals with certain color vision deficiencies were able to discriminate between the red and white lights in fielded systems on the basis of luminous intensity. The device provided the ability to control the timing and duration of light-emitting diode (LED) and incandescent light stimulus presentations, to present the experimental sequence and verbal instructions automatically, to adjust LED and incandescent luminous intensity, and to display LED and incandescent lights with various spectral emissions. The lighting device could easily be adapted for experiments involving flashing or timed presentations of colored lights, or the components could be expanded to study areas such as threshold light perception and visual alerting systems.

  16. Development of Thread-compatible Open Source Stack

    NASA Astrophysics Data System (ADS)

    Zimmermann, Lukas; Mars, Nidhal; Schappacher, Manuel; Sikora, Axel

    2017-07-01

    The Thread protocol is a recent development based on 6LoWPAN (IPv6 over IEEE 802.15.4), but with extensions regarding a more media-independent approach, which additionally promises true interoperability. To evaluate and analyse the operation of a Thread network, a given open source 6LoWPAN stack for embedded devices (emb::6) has been extended in order to comply with the Thread specification. The implementation covers Mesh Link Establishment (MLE) and network layer functionality as well as the 6LoWPAN mesh-under routing mechanism based on MAC short addresses. The development has been verified on a virtualization platform and allows dynamic establishment of network topologies based on Thread's partitioning algorithm.

  17. Open-Source Software in Computational Research: A Case Study

    DOE PAGES

    Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; ...

    2008-01-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  18. Spatial DBMS Architecture for a Free and Open Source BIM

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Valari, E.; Karachaliou, E.; Stylianidis, E.

    2017-08-01

    Recent research in the field of Building Information Modelling (BIM) technology revealed that, except for a few accessible and free BIM viewers, there is a lack of Free and Open Source Software (FOSS) covering the complete BIM process. With this in mind, and considering BIM as the technological advancement of Computer-Aided Design (CAD) systems, the current work proposes the use of a FOSS CAD software package in order to extend its capabilities and transform it gradually into a FOSS BIM platform. Towards this undertaking, a first approach to developing a spatial Database Management System (DBMS) able to store, organize and manage the overall amount of information within a single application is presented.

  19. Open Source Dataturbine (OSDT) Android Sensorpod in Environmental Observing Systems

    NASA Astrophysics Data System (ADS)

    Fountain, T. R.; Shin, P.; Tilak, S.; Trinh, T.; Smith, J.; Kram, S.

    2014-12-01

    The OSDT Android SensorPod is a custom-designed mobile computing platform for assembling wireless sensor networks for environmental monitoring applications. Funded by an award from the Gordon and Betty Moore Foundation, the OSDT SensorPod represents a significant technological advance in the application of mobile and cloud computing technologies to near-real-time applications in environmental science, natural resources management, and disaster response and recovery. It provides a modular architecture based on open standards and open-source software that allows system developers to align their projects with industry best practices and technology trends, while avoiding commercial vendor lock-in to expensive proprietary software and hardware systems. The integration of mobile and cloud-computing infrastructure represents a disruptive technology in the field of environmental science, since basic assumptions about technology requirements are now open to revision, e.g., the roles of special purpose data loggers and dedicated site infrastructure. The OSDT Android SensorPod was designed with these considerations in mind, and the resulting system exhibits the following characteristics - it is flexible, efficient and robust. The system was developed and tested in the three science applications: 1) a fresh water limnology deployment in Wisconsin, 2) a near coastal marine science deployment at the UCSD Scripps Pier, and 3) a terrestrial ecological deployment in the mountains of Taiwan. As part of a public education and outreach effort, a Facebook page with daily ocean pH measurements from the UCSD Scripps pier was developed. Wireless sensor networks and the virtualization of data and network services is the future of environmental science infrastructure. The OSDT Android SensorPod was designed and developed to harness these new technology developments for environmental monitoring applications.

  20. MOSFiT: Modular Open Source Fitter for Transients

    NASA Astrophysics Data System (ADS)

    Guillochon, James; Nicholl, Matt; Villar, V. Ashley; Mockler, Brenna; Narayan, Gautham; Mandel, Kaisey S.; Berger, Edo; Williams, Peter K. G.

    2018-05-01

    Much of the progress made in time-domain astronomy is accomplished by relating observational multiwavelength time-series data to models derived from our understanding of physical laws. This goal is typically accomplished by dividing the task in two: collecting data (observing) and constructing models to represent those data (theorizing). Owing to the natural tendency for specialization, a disconnect can develop between the best available theories and the best available data, potentially delaying advances in our understanding of new classes of transients. We introduce MOSFiT: the Modular Open Source Fitter for Transients, a Python-based package that downloads transient data sets from open online catalogs (e.g., the Open Supernova Catalog), generates Monte Carlo ensembles of semi-analytical light-curve fits to those data sets and their associated Bayesian parameter posteriors, and optionally delivers the fitting results back to those same catalogs to make them available to the rest of the community. MOSFiT is designed to help bridge the gap between observations and theory in time-domain astronomy; in addition to making the application of existing models and the creation of new models as simple as possible, MOSFiT yields statistically robust predictions for transient characteristics, with a standard output format that includes all the setup information necessary to reproduce a given result. As large-scale surveys, such as that conducted with the Large Synoptic Survey Telescope (LSST), discover entirely new classes of transients, tools such as MOSFiT will be critical for enabling rapid comparison of models against data in statistically consistent, reproducible, and scientifically beneficial ways.

  1. Mousetrap: An integrated, open-source mouse-tracking package.

    PubMed

    Kieslich, Pascal J; Henninger, Felix

    2017-10-01

    Mouse-tracking - the analysis of mouse movements in computerized experiments - is becoming increasingly popular in the cognitive sciences. Mouse movements are taken as an indicator of commitment to or conflict between choice options during the decision process. Using mouse-tracking, researchers have gained insight into the temporal development of cognitive processes across a growing number of psychological domains. In the current article, we present software that offers easy and convenient means of recording and analyzing mouse movements in computerized laboratory experiments. In particular, we introduce and demonstrate the mousetrap plugin that adds mouse-tracking to OpenSesame, a popular general-purpose graphical experiment builder. By integrating with this existing experimental software, mousetrap allows for the creation of mouse-tracking studies through a graphical interface, without requiring programming skills. Thus, researchers can benefit from the core features of a validated software package and the many extensions available for it (e.g., the integration with auxiliary hardware such as eye-tracking, or the support of interactive experiments). In addition, the recorded data can be imported directly into the statistical programming language R using the mousetrap package, which greatly facilitates analysis. Mousetrap is cross-platform, open-source and available free of charge from https://github.com/pascalkieslich/mousetrap-os .

  2. Inexpensive, Low Power, Open-Source Data Logging hardware development

    NASA Astrophysics Data System (ADS)

    Sandell, C. T.; Schulz, B.; Wickert, A. D.

    2017-12-01

    Over the past six years, we have developed a suite of open-source, low-cost, and lightweight data loggers for scientific research. These loggers employ the popular and easy-to-use Arduino programming environment, but consist of custom hardware optimized for field research. They may be connected to a broad and expanding range of off-the-shelf sensors, with software support built directly into the "ALog" library. Three main models exist: the ALog (for Autonomous or Arduino Logger) is the extreme low-power model for years-long deployments with only primary AA or D batteries; the ALog shield is a stripped-down ALog that nests with a standard Arduino board for prototyping or education; and the TLog (for Telemetering Logger) contains an embedded radio with 500 m range and a GPS for communications and precision timekeeping. This enables meshed networks of loggers that can send their data back to an internet-connected "home base" logger for near-real-time field data retrieval. All boards feature a high-precision clock, a full-size SD card slot for high-volume data storage, large screw terminals to connect sensors, interrupts, SPI and I2C communication capability, and 3.3V/5V power outputs. The ALog and TLog have fourteen 16-bit analog inputs with a precision voltage reference for precise analog measurements. Their components are rated from -40 to +85 degrees C, and they have been tested in harsh field conditions. These low-cost and open-source data loggers have enabled our research group to collect field data across North and South America on a limited budget, support student projects, and build toward better future scientific data systems.

  3. Gimli: open source and high-performance biomedical name recognition

    PubMed Central

    2013-01-01

    Background Automatic recognition of biomedical names is an essential task in biomedical information extraction, presenting several complex and unsolved challenges. In recent years, various solutions have been implemented to tackle this problem. However, limitations regarding system characteristics, customization and usability still hinder their wider application outside text mining research. Results We present Gimli, an open-source, state-of-the-art tool for automatic recognition of biomedical names. Gimli includes an extended set of implemented and user-selectable features, such as orthographic, morphological, linguistic-based, conjunction-based, and dictionary-based features. A simple and fast method to combine different trained models is also provided. Gimli achieves an F-measure of 87.17% on GENETAG and 72.23% on the JNLPBA corpus, significantly outperforming existing open-source solutions. Conclusions Gimli is an off-the-shelf, ready-to-use tool for named-entity recognition, providing trained and optimized models for recognition of biomedical entities from scientific text. It can be used as a command line tool, offering full functionality, including training of new models and customization of the feature set and model parameters through a configuration file. Advanced users can integrate Gimli in their text mining workflows through the provided library, and extend or adapt its functionalities. Based on the underlying system characteristics and functionality, both for final users and developers, and on the reported performance results, we believe that Gimli is a state-of-the-art solution for biomedical NER, contributing to faster and better research in the field. Gimli is freely available at http://bioinformatics.ua.pt/gimli. PMID:23413997

  4. The Future of ECHO: Evaluating Open Source Possibilities

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Gilman, J.; Baynes, K.; Mitchell, A. E.

    2012-12-01

    NASA's Earth Observing System ClearingHOuse (ECHO) is a format agnostic metadata repository supporting over 3000 collections and 100M science granules. ECHO exposes FTP and RESTful Data Ingest APIs in addition to both SOAP and RESTful search and order capabilities. Built on top of ECHO is a human facing search and order web application named Reverb. ECHO processes hundreds of orders, tens of thousands of searches, and 1-2M ingest actions each week. As ECHO's holdings, metadata format support, and visibility have increased, the ECHO team has received requests from non-NASA entities for copies of ECHO that can be run locally against their data holdings. ESDIS and the ECHO Team have begun investigations into various deployment and Open Sourcing models that can balance the real constraints faced by the ECHO project with the benefits of providing ECHO capabilities to a broader set of users and providers. This talk will discuss several release and Open Source models being investigated by the ECHO team along with the impacts those models are expected to have on the project. We discuss: - Addressing complex deployment or setup issues for potential users - Models of vetting code contributions - Balancing external (public) user requests versus our primary partners - Preparing project code for public release, including navigating licensing issues related to leveraged libraries - Dealing with non-free project dependencies such as commercial databases - Dealing with sensitive aspects of project code such as database passwords, authentication approaches, security through obscurity, etc. - Ongoing support for the released code including increased testing demands, bug fixes, security fixes, and new features.

  5. OpenStereo: Open Source, Cross-Platform Software for Structural Geology Analysis

    NASA Astrophysics Data System (ADS)

    Grohmann, C. H.; Campanha, G. A.

    2010-12-01

    Free and open source software (FOSS) is increasingly seen as synonymous with innovation and progress. Freedom to run, copy, distribute, study, change and improve the software (through access to the source code) assures a high level of positive feedback between users and developers, which results in stable, secure and constantly updated systems. Several software packages for structural geology analysis are available to the user, with commercial licenses or that can be downloaded at no cost from the Internet. Some provide basic tools of stereographic projections such as plotting poles, great circles, density contouring, eigenvector analysis, data rotation etc, while others perform more specific tasks, such as paleostress or geotechnical/rock stability analysis. This variety also means a wide range of data formatting for input, Graphical User Interface (GUI) design and graphic export formats. The majority of packages are built for MS-Windows and even though there are packages for the UNIX-based MacOS, there aren't native packages for *nix (UNIX, Linux, BSD etc) Operating Systems (OS), forcing users to run these programs with emulators or virtual machines. Those limitations led us to develop OpenStereo, an open source, cross-platform software for stereographic projections and structural geology. The software is written in Python, a high-level, cross-platform programming language, and the GUI is designed with wxPython, which provides a consistent look regardless of the OS. Numeric operations (like matrix and linear algebra) are performed with the Numpy module and all graphic capabilities are provided by the Matplotlib library, including on-screen plotting and graphic exporting to common desktop formats (emf, eps, ps, pdf, png, svg). Data input is done with simple ASCII text files, with values of dip direction and dip/plunge separated by spaces, tabs or commas. The user can open multiple files at the same time (or the same file more than once), and overlay different elements of
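
    The sketch below is a hedged illustration (not OpenStereo's own code) of the kind of processing the abstract describes: reading dip-direction/dip pairs from an ASCII file with numpy, converting them to unit pole vectors, and running an eigenvector analysis of the orientation matrix. The file path and the whitespace-only parsing are assumptions for the example.

    ```python
    # Sketch (not OpenStereo source): parse a dip-direction/dip file and compute
    # unit pole vectors plus the eigenvectors of the orientation matrix.
    import numpy as np

    def poles_from_file(path):
        # Each line: dip_direction dip (degrees); this sketch assumes
        # whitespace-separated values for simplicity.
        data = np.genfromtxt(path, delimiter=None)
        ddir, dip = np.radians(data[:, 0]), np.radians(data[:, 1])
        # Pole to a plane: trend = dip direction + 180 deg, plunge = 90 deg - dip.
        trend, plunge = ddir + np.pi, np.pi / 2 - dip
        x = np.cos(plunge) * np.sin(trend)      # east
        y = np.cos(plunge) * np.cos(trend)      # north
        z = -np.sin(plunge)                     # downward-pointing poles (z up)
        return np.column_stack([x, y, z])

    def orientation_eigenvectors(poles):
        # Eigen-analysis of the orientation matrix, largest eigenvalue first.
        T = poles.T @ poles / len(poles)
        vals, vecs = np.linalg.eigh(T)
        return vals[::-1], vecs[:, ::-1]

    poles = poles_from_file("bedding_planes.txt")   # hypothetical input file
    values, vectors = orientation_eigenvectors(poles)
    print("eigenvalues:", values)
    ```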

  6. Implementing Open Source Platform for Education Quality Enhancement in Primary Education: Indonesia Experience

    ERIC Educational Resources Information Center

    Kisworo, Marsudi Wahyu

    2016-01-01

    Information and Communication Technology (ICT)-supported learning using a free and open source platform draws little attention, as open source initiatives have focused on secondary or tertiary education. This study investigates possibilities of ICT-supported learning using an open source platform for primary education. The data of this study is taken…

  7. The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software

    PubMed Central

    Ackerman, Michael J.; Yoo, Terry S.

    2003-01-01

    From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278

  8. Open Knee: Open Source Modeling & Simulation to Enable Scientific Discovery and Clinical Care in Knee Biomechanics

    PubMed Central

    Erdemir, Ahmet

    2016-01-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical function of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor intensive reproduction of model development steps can be avoided. The interested parties can immediately utilize readily available models for scientific discovery and for clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes detailed anatomical representation of the joint's major tissue structures, their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators is also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next-generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age

  9. Open-Source Selective Laser Sintering (OpenSLS) of Nylon and Biocompatible Polycaprolactone.

    PubMed

    Kinstlinger, Ian S; Bastian, Andreas; Paulsen, Samantha J; Hwang, Daniel H; Ta, Anderson H; Yalacki, David R; Schmidt, Tim; Miller, Jordan S

    2016-01-01

    Selective Laser Sintering (SLS) is an additive manufacturing process that uses a laser to fuse powdered starting materials into solid 3D structures. Despite the potential for fabrication of complex, high-resolution structures with SLS using diverse starting materials (including biomaterials), prohibitive costs of commercial SLS systems have hindered the wide adoption of this technology in the scientific community. Here, we developed a low-cost, open-source SLS system (OpenSLS) and demonstrated its capacity to fabricate structures in nylon with sub-millimeter features and overhanging regions. Subsequently, we demonstrated fabrication of polycaprolactone (PCL) into macroporous structures such as a diamond lattice. Widespread interest in using PCL for bone tissue engineering suggests that PCL lattices are relevant model scaffold geometries for engineering bone. SLS of materials with large powder grain size (~500 μm) leads to part surfaces with high roughness, so we further introduced a simple vapor-smoothing technique to reduce the surface roughness of sintered PCL structures which further improves their elastic modulus and yield stress. Vapor-smoothed PCL can also be used for sacrificial templating of perfusable fluidic networks within orthogonal materials such as poly(dimethylsiloxane) silicone. Finally, we demonstrated that human mesenchymal stem cells were able to adhere, survive, and differentiate down an osteogenic lineage on sintered and smoothed PCL surfaces, suggesting that OpenSLS has the potential to produce PCL scaffolds useful for cell studies. OpenSLS provides the scientific community with an accessible platform for the study of laser sintering and the fabrication of complex geometries in diverse materials.

  10. Open-Source Selective Laser Sintering (OpenSLS) of Nylon and Biocompatible Polycaprolactone

    PubMed Central

    Paulsen, Samantha J.; Hwang, Daniel H.; Ta, Anderson H.; Yalacki, David R.; Schmidt, Tim; Miller, Jordan S.

    2016-01-01

    Selective Laser Sintering (SLS) is an additive manufacturing process that uses a laser to fuse powdered starting materials into solid 3D structures. Despite the potential for fabrication of complex, high-resolution structures with SLS using diverse starting materials (including biomaterials), prohibitive costs of commercial SLS systems have hindered the wide adoption of this technology in the scientific community. Here, we developed a low-cost, open-source SLS system (OpenSLS) and demonstrated its capacity to fabricate structures in nylon with sub-millimeter features and overhanging regions. Subsequently, we demonstrated fabrication of polycaprolactone (PCL) into macroporous structures such as a diamond lattice. Widespread interest in using PCL for bone tissue engineering suggests that PCL lattices are relevant model scaffold geometries for engineering bone. SLS of materials with large powder grain size (~500 μm) leads to part surfaces with high roughness, so we further introduced a simple vapor-smoothing technique to reduce the surface roughness of sintered PCL structures which further improves their elastic modulus and yield stress. Vapor-smoothed PCL can also be used for sacrificial templating of perfusable fluidic networks within orthogonal materials such as poly(dimethylsiloxane) silicone. Finally, we demonstrated that human mesenchymal stem cells were able to adhere, survive, and differentiate down an osteogenic lineage on sintered and smoothed PCL surfaces, suggesting that OpenSLS has the potential to produce PCL scaffolds useful for cell studies. OpenSLS provides the scientific community with an accessible platform for the study of laser sintering and the fabrication of complex geometries in diverse materials. PMID:26841023

  11. ExpertEyes: open-source, high-definition eyetracking.

    PubMed

    Parada, Francisco J; Wyatte, Dean; Yu, Chen; Akavipat, Ruj; Emerick, Brandi; Busey, Thomas

    2015-03-01

    ExpertEyes is a low-cost, open-source package of hardware and software that is designed to provide portable high-definition eyetracking. The project involves several technological innovations, including portability, high-definition video recording, and multiplatform software support. It was designed for challenging recording environments, and all processing is done offline to allow for optimization of parameter estimation. The pupil and corneal reflection are estimated using a novel forward eye model that simultaneously fits both the pupil and the corneal reflection with full ellipses, addressing a common situation in which the corneal reflection sits at the edge of the pupil and therefore breaks the contour of the ellipse. The accuracy and precision of the system are comparable to or better than what is available in commercial eyetracking systems, with a typical accuracy of less than 0.4° and best accuracy below 0.3°, and with a typical precision (SD method) around 0.3° and best precision below 0.2°. Part of the success of the system comes from a high-resolution eye image. The high image quality results from uncasing common digital camcorders and recording directly to SD cards, which avoids the limitations of the analog NTSC format. The software is freely downloadable, and complete hardware plans are available, along with sources for custom parts.

  12. An open, interoperable, transdisciplinary approach to a point cloud data service using OGC standards and open source software.

    NASA Astrophysics Data System (ADS)

    Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley

    2017-04-01

    High resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and useable in ways which allow seamless integration with earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use-case of providing a web data service and supporting a derived product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. Present work focusses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for
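
    As a hedged sketch of the server-side tooling mentioned above, the example below drives a PDAL pipeline from Python to read a LAZ point cloud and grid it to a GeoTIFF. The file names and cell size are hypothetical; the stage options follow PDAL's documented readers.las and writers.gdal stages, and the pdal Python bindings are assumed to be installed.

    ```python
    # Hedged sketch: a PDAL pipeline, driven from Python, that reads a LAZ
    # point cloud and grids it to a GeoTIFF elevation surface.
    import json
    import pdal

    pipeline_def = {
        "pipeline": [
            {"type": "readers.las", "filename": "survey_tile.laz"},
            {
                "type": "writers.gdal",
                "filename": "survey_tile_dsm.tif",
                "resolution": 1.0,          # output cell size in CRS units
                "output_type": "mean",      # statistic written per grid cell
                "gdaldriver": "GTiff",
            },
        ]
    }

    pipeline = pdal.Pipeline(json.dumps(pipeline_def))
    n_points = pipeline.execute()           # runs the stages; returns point count
    print(f"Gridded {n_points} points")
    ```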

  13. The Case for Open Source: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…

  14. OpenSQUID: A Flexible Open-Source Software Framework for the Control of SQUID Electronics

    DOE PAGES

    Jaeckel, Felix T.; Lafler, Randy J.; Boyd, S. T. P.

    2013-02-06

    Commercially available computer-controlled SQUID electronics are usually delivered with software providing a basic user interface for adjustment of SQUID tuning parameters, such as bias current, flux offset, and feedback loop settings. However, in a research context it would often be useful to be able to modify this code and/or to have full control over all these parameters from researcher-written software. In the case of the STAR Cryoelectronics PCI/PFL family of SQUID control electronics, the supplied software contains modules for automatic tuning and noise characterization, but does not provide an interface for user code. On the other hand, the Magnicon SQUIDViewer software package includes a public application programming interface (API), but lacks auto-tuning and noise characterization features. To overcome these and other limitations, we are developing an "open-source" framework for controlling SQUID electronics which should provide maximal interoperability with user software, a unified user interface for electronics from different manufacturers, and a flexible platform for the rapid development of customized SQUID auto-tuning and other advanced features. Finally, we have completed a first implementation for the STAR Cryoelectronics hardware and have made the source code for this ongoing project available to the research community on SourceForge (http://opensquid.sourceforge.net) under the GNU public license.

  15. Choosing Open Source ERP Systems: What Reasons Are There For Doing So?

    NASA Astrophysics Data System (ADS)

    Johansson, Björn; Sudzina, Frantisek

    Enterprise resource planning (ERP) systems attract considerable attention, and so does open source software. The question is whether, and if so when, open source ERP systems will take off. The paper describes the status of open source ERP systems. Based on a literature review of ERP system selection criteria drawn from Web of Science articles, it discusses reported reasons for choosing open source or proprietary ERP systems. Last but not least, the article presents some conclusions that could act as input for future research. The paper aims at building up a foundation for the basic question: What are the reasons for an organization to adopt open source ERP systems?

  16. Citing geospatial feature inventories with XML manifests

    NASA Astrophysics Data System (ADS)

    Bose, R.; McGarva, G.

    2006-12-01

    Today published scientific papers include a growing number of citations for online information sources that either complement or replace printed journals and books. We anticipate this same trend for cartographic citations used in the geosciences, following advances in web mapping and geographic feature-based services. Instead of using traditional libraries to resolve citations for print material, the geospatial citation life cycle will include requesting inventories of objects or geographic features from distributed geospatial data repositories. Using a case study from the UK Ordnance Survey MasterMap database, which is illustrative of geographic object-based products in general, we propose citing inventories of geographic objects using XML feature manifests. These manifests: (1) serve as a portable listing of sets of versioned features; (2) could be used as citations within the identification portion of an international geospatial metadata standard; (3) could be incorporated into geospatial data transfer formats such as GML; but (4) can be resolved only with comprehensive, curated repositories of current and historic data. This work has implications for any researcher who foresees the need to make or resolve references to online geospatial databases.

  17. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to and architecture of workflow code for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
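
    The sketch below is a hedged illustration of the asynchronous client pattern the paper discusses: submit an OGC WPS Execute request with stored, asynchronous status enabled, read the statusLocation from the response, and poll it until the process finishes instead of blocking. The endpoint, process identifier, and input are hypothetical placeholders, and real deployments may require POST-encoded Execute requests rather than the KVP form shown here.

    ```python
    # Hedged sketch of an asynchronous OGC WPS client: submit, then poll status.
    import time
    import xml.etree.ElementTree as ET
    import requests

    WPS = "https://example.org/wps"          # hypothetical WPS endpoint
    NS = {"wps": "http://www.opengis.net/wps/1.0.0"}

    params = {
        "service": "WPS", "version": "1.0.0", "request": "Execute",
        "identifier": "gs:BufferFeatureCollection",   # hypothetical process
        "DataInputs": "distance=100",
        "storeExecuteResponse": "true", "status": "true",
    }
    resp = ET.fromstring(requests.get(WPS, params=params).content)
    status_url = resp.attrib["statusLocation"]        # where progress is reported

    while True:
        doc = ET.fromstring(requests.get(status_url).content)
        status = doc.find("wps:Status", NS)
        if status is not None and status.find("wps:ProcessSucceeded", NS) is not None:
            print("workflow finished")
            break
        if status is not None and status.find("wps:ProcessFailed", NS) is not None:
            raise RuntimeError("workflow failed")
        time.sleep(10)   # the client is free to do other work between polls
    ```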

  18. Learning from open source software projects to improve scientific review.

    PubMed

    Ghosh, Satrajit S; Klein, Arno; Avants, Brian; Millman, K Jarrod

    2012-01-01

    Peer-reviewed publications are the primary mechanism for sharing scientific results. The current peer-review process is, however, fraught with many problems that undermine the pace, validity, and credibility of science. We highlight five salient problems: (1) reviewers are expected to have comprehensive expertise; (2) reviewers do not have sufficient access to methods and materials to evaluate a study; (3) reviewers are neither identified nor acknowledged; (4) there is no measure of the quality of a review; and (5) reviews take a lot of time, and once submitted cannot evolve. We propose that these problems can be resolved by making the following changes to the review process. Distributing reviews to many reviewers would allow each reviewer to focus on portions of the article that reflect the reviewer's specialty or area of interest and place less of a burden on any one reviewer. Providing reviewers materials and methods to perform comprehensive evaluation would facilitate transparency, greater scrutiny, and replication of results. Acknowledging reviewers makes it possible to quantitatively assess reviewer contributions, which could be used to establish the impact of the reviewer in the scientific community. Quantifying review quality could help establish the importance of individual reviews and reviewers as well as the submitted article. Finally, we recommend expediting post-publication reviews and allowing for the dialog to continue and flourish in a dynamic and interactive manner. We argue that these solutions can be implemented by adapting existing features from open-source software management and social networking technologies. We propose a model of an open, interactive review system that quantifies the significance of articles, the quality of reviews, and the reputation of reviewers.

  19. Learning from open source software projects to improve scientific review

    PubMed Central

    Ghosh, Satrajit S.; Klein, Arno; Avants, Brian; Millman, K. Jarrod

    2012-01-01

    Peer-reviewed publications are the primary mechanism for sharing scientific results. The current peer-review process is, however, fraught with many problems that undermine the pace, validity, and credibility of science. We highlight five salient problems: (1) reviewers are expected to have comprehensive expertise; (2) reviewers do not have sufficient access to methods and materials to evaluate a study; (3) reviewers are neither identified nor acknowledged; (4) there is no measure of the quality of a review; and (5) reviews take a lot of time, and once submitted cannot evolve. We propose that these problems can be resolved by making the following changes to the review process. Distributing reviews to many reviewers would allow each reviewer to focus on portions of the article that reflect the reviewer's specialty or area of interest and place less of a burden on any one reviewer. Providing reviewers materials and methods to perform comprehensive evaluation would facilitate transparency, greater scrutiny, and replication of results. Acknowledging reviewers makes it possible to quantitatively assess reviewer contributions, which could be used to establish the impact of the reviewer in the scientific community. Quantifying review quality could help establish the importance of individual reviews and reviewers as well as the submitted article. Finally, we recommend expediting post-publication reviews and allowing for the dialog to continue and flourish in a dynamic and interactive manner. We argue that these solutions can be implemented by adapting existing features from open-source software management and social networking technologies. We propose a model of an open, interactive review system that quantifies the significance of articles, the quality of reviews, and the reputation of reviewers. PMID:22529798

  20. Free and Open Source Software for land degradation vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Imbrenda, Vito; Calamita, Giuseppe; Coluzzi, Rosa; D'Emilio, Mariagrazia; Lanfredi, Maria Teresa; Perrone, Angela; Ragosta, Maria; Simoniello, Tiziana

    2013-04-01

    Nowadays the role of FOSS software in scientific research is becoming increasingly important. Besides the important issues of reduced costs for licences, legality and security, there are many other reasons that make FOSS software attractive. Firstly, making the code open is a guarantee of quality, permitting thousands of developers around the world to check the code and fix bugs rather than relying on vendors' claims. FOSS communities are usually enthusiastic about helping other users solve problems and about expanding or customizing software (flexibility). Most importantly for this study, interoperability allows the user-friendly QGIS to be combined with the powerful GRASS-GIS and the rich statistical methods of R in order to process remote sensing data and perform geo-statistical analysis in a single environment. This study is focused on land degradation (i.e. the reduction in the capacity of the land to provide ecosystem goods and services and assure its functions) and in particular on the estimation of vulnerability levels in order to suggest appropriate policy actions to reduce/halt land degradation impacts, using the above-mentioned software. The area investigated is the Basilicata Region (Southern Italy), where large natural areas are mixed with anthropized areas. To identify different levels of vulnerability we adopted the Environmentally Sensitive Areas (ESAs) model, based on the combination of indicators related to soil, climate, vegetation and anthropic stress. Such indicators were estimated by using the following data-sources: - Basilicata Region Geoportal to assess soil vulnerability; - DESERTNET2 project to evaluate potential vegetation vulnerability and climate vulnerability; - NDVI-MODIS satellite time series (2000-2010) with 250m resolution, available as 16-day composite from the NASA LP DAAC to characterize the dynamic component of vegetation; - Agricultural Census data 2010, Corine Land Cover 2006 and morphological information to assess

  1. EPA National Geospatial Data Policy

    EPA Pesticide Factsheets

    National Geospatial Data Policy (NGDP) establishes principles, responsibilities, and requirements for collecting and managing geospatial data used by Federal environmental programs and projects within the jurisdiction of the U.S. EPA

  2. Situational Awareness Geospatial Application (iSAGA)

    SciTech Connect

    Sher, Benjamin

    Situational Awareness Geospatial Application (iSAGA) is a geospatial situational awareness software tool that uses an algorithm to extract location data from nearly any internet-based, or custom data source and display it geospatially; allows user-friendly conduct of spatial analysis using custom-developed tools; searches complex Geographic Information System (GIS) databases and accesses high resolution imagery. iSAGA has application at the federal, state and local levels of emergency response, consequence management, law enforcement, emergency operations and other decision makers as a tool to provide complete, visual, situational awareness using data feeds and tools selected by the individual agency or organization. Feeds may be layered and custom tools developed to uniquely suit each subscribing agency or organization. iSAGA may similarly be applied to international agencies and organizations.

  3. Geospatial analysis of residential proximity to open-pit coal mining areas in relation to micronuclei frequency, particulate matter concentration, and elemental enrichment factors.

    PubMed

    Espitia-Pérez, Lyda; Arteaga-Pertuz, Marcia; Soto, José Salvador; Espitia-Pérez, Pedro; Salcedo-Arteaga, Shirley; Pastor-Sierra, Karina; Galeano-Páez, Claudia; Brango, Hugo; da Silva, Juliana; Henriques, João A P

    2018-09-01

    During coal surface mining, several activities such as drilling, blasting, loading, and transport produce large quantities of particulate matter (PM) that is directly emitted into the atmosphere. Occupational exposure to this PM has been associated with an increase of DNA damage, but there is a scarcity of data examining the impact of these industrial operations on cytogenetic endpoint frequency and cancer risk of potentially exposed surrounding populations. In this study, we used a Geographic Information Systems (GIS) approach and Inverse Distance Weighting (IDW) methods to perform a spatial and statistical analysis to explore whether exposure to PM2.5 and PM10 pollution, and additional factors, including the enrichment of the PM with inorganic elements, contribute to cytogenetic damage in residents living in proximity to an open-pit coal mining area. Results showed a spatial relationship between exposure to elevated concentrations of PM2.5 and PM10 and micronuclei frequency in binucleated (MNBN) and mononucleated (MNMONO) cells. Active pits, disposal, and storage areas could be identified as the possible emission sources of combustion elements. Mining activities were also correlated with increased concentrations of highly enriched elements like S, Cu and Cr in the atmosphere, corroborating their role in inorganic element pollution around coal mines. Elements enriched in the PM2.5 fraction contributed to increased MNBN but seem to be more related to increased MNMONO frequencies and DNA damage accumulated in vivo. The combined use of GIS and IDW methods could represent an important tool for monitoring potential cancer risk associated with dynamically distributed variables like PM.
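
    As a minimal sketch of the Inverse Distance Weighting interpolation named above, the example below implements generic IDW with numpy and applies it to synthetic station values; the coordinates and PM concentrations are invented and do not correspond to the study's data.

    ```python
    # Minimal Inverse Distance Weighting (IDW) sketch with numpy.
    import numpy as np

    def idw(xy_known, values, xy_query, power=2.0):
        """Interpolate values at xy_query from scattered observations."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
        d = np.where(d == 0, 1e-12, d)          # avoid division by zero at stations
        w = 1.0 / d ** power
        return (w * values).sum(axis=1) / w.sum(axis=1)

    # Example: synthetic PM measurements at four hypothetical stations.
    stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    pm25 = np.array([12.0, 30.0, 18.0, 45.0])
    grid = np.array([[0.5, 0.5], [0.1, 0.9]])
    print(idw(stations, pm25, grid))
    ```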

  4. HELIOS: A new open-source radiative transfer code

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Lukas Grimm, Simon; Mendonça, João; Kitzmann, Daniel; Heng, Kevin

    2015-12-01

    I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS uses the GPU's potential of massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results of the following: (i) Various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers. (ii) Comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra. (iii) Benchmarks of TP-profiles & spectra for various elemental abundances. (iv) Benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b. (v) Comparison with secondary eclipse data for HD189733b, XO-1b & Corot-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], which is an open-source project aimed to provide community tools to model exoplanetary atmospheres. References: [1] Grimm & Heng 2015, arXiv:1503.03806; [2] Heng, Lyons & Tsai, arXiv:1506.05501; Heng & Lyons, arXiv:1507.01944; [3] e.g. Heng, Mendonca & Lee, 2014, ApJS, 215, 4H; [4] exoclime.net

  5. An Open Source modular platform for hydrological model implementation

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur; Bruland, Oddbjørn

    2010-05-01

    An implementation framework for setup and evaluation of spatio-temporal models is developed, forming a highly modularized distributed model system. The ENKI framework allows building space-time models for hydrological or other environmental purposes, from a suite of separately compiled subroutine modules. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines in a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational hydropower forecasting or other water resource management. Written in C++, ENKI uses a plug-in structure to build a complete model from separately compiled subroutine implementations. These modules contain very little code apart from the core process simulation, and are compiled as dynamic-link libraries (dll). A narrow interface allows the main executable to recognise the number and type of the different variables in each routine. The framework then exposes these variables to the user within the proper context, ensuring that time series exist for input variables, initialisation for states, GIS data sets for static map data, manually or automatically calibrated values for parameters etc. ENKI is designed to meet three different levels of involvement in model construction:
    • Model application: Running and evaluating a given model. Regional calibration against arbitrary data using a rich suite of objective functions, including likelihood and Bayesian estimation. Uncertainty analysis directed towards input or parameter uncertainty.
      o Need not: Know the model's composition of subroutines, or the internal variables in the model, or the creation of method modules.
    • Model analysis: Link together different process methods, including parallel setup of alternative methods for solving the same task. Investigate the effect of different spatial discretization schemes.
      o Need not

  6. Collaboration using open standards and open source software (examples of DIAS/CEOS Water Portal)

    NASA Astrophysics Data System (ADS)

    Miura, S.; Sekioka, S.; Kuroiwa, K.; Kudo, Y.

    2015-12-01

    The DIAS/CEOS Water Portal is a part of the DIAS (Data Integration and Analysis System, http://www.editoria.u-tokyo.ac.jp/projects/dias/?locale=en_US) systems for data distribution for users including, but not limited to, scientists, decision makers and officers like river administrators. One of the functions of this portal is to enable one-stop search of, and access to, various water-related data archived at multiple data centers located all over the world. This portal itself does not store data. Instead, according to requests made by users on the web page, it retrieves data from distributed data centers on-the-fly and lets users download and view rendered images/plots. Our system mainly relies on the open source software GI-cat (http://essi-lab.eu/do/view/GIcat) and open standards such as OGC-CSW, OpenSearch and the OPeNDAP protocol to enable the above functions. Details on how it works will be introduced during the presentation. Although some data centers have unique metadata formats and/or data search protocols, our portal's brokering function enables users to search across various data centers at one time. This portal is also connected to other data brokering systems, including the GEOSS DAB (Discovery and Access Broker). As a result, users can search over thousands of datasets and millions of files at one time. Users can access the DIAS/CEOS Water Portal system at http://waterportal.ceos.org/.
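
    As a hedged sketch of the standards-based access such a portal brokers, the example below opens a remote dataset over OPeNDAP with xarray and subsets it server-side. The URL, variable, and coordinate names are hypothetical placeholders, not a real DIAS endpoint; xarray with a netCDF/DAP backend is assumed to be installed.

    ```python
    # Hedged sketch: read a remote hydrological dataset over OPeNDAP with xarray.
    import xarray as xr

    url = "https://example.org/opendap/precipitation_daily.nc"  # hypothetical
    ds = xr.open_dataset(url)                 # lazy access via the DAP protocol
    subset = ds["precip"].sel(
        lat=slice(30, 45), lon=slice(130, 145),
        time=slice("2015-01-01", "2015-01-31"),
    )
    print(subset.mean().values)               # only the subset is transferred
    ```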

  7. IGT-Open: An open-source, computerized version of the Iowa Gambling Task.

    PubMed

    Dancy, Christopher L; Ritter, Frank E

    2017-06-01

    The Iowa Gambling Task (IGT) is commonly used to understand the processes involved in decision-making. Though the task was originally run without a computer, using a computerized version of the task has become typical. These computerized versions of the IGT are useful, because they can make the task more standardized across studies and allow for the task to be used in environments where a physical version of the task may be difficult or impossible to use (e.g., while collecting brain imaging data). Though these computerized versions of the IGT have been useful for experimentation, having multiple software implementations of the task could present reliability issues. We present an open-source software version of the Iowa Gambling Task (called IGT-Open) that allows for millisecond visual presentation accuracy and is freely available to be used and modified. This software has been used to collect data from human subjects and also has been used to run model-based simulations with computational process models developed to run in the ACT-R architecture.

  8. Geospatial Information Response Team

    USGS Publications Warehouse

    Witt, Emitt C.

    2010-01-01

    Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, Federal Government, privatesector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-sciences, discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data information and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders, land and resource managers, and for scientific analysis. In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of

  9. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    NASA Astrophysics Data System (ADS)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

    The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new and there has been a significant effort within the geospatial community to develop nested gridding standards to tackle these issues over many years. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard which has been developed under the auspices of Open Geospatial Consortium (OGC). DGGS provide a fixed areal based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principle aspect of a DGGS. Data integration, decomposition, and aggregation is optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. During
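
    To make the hierarchical-indexing idea concrete, the toy sketch below (not an implementation of the OGC DGGS standard) partitions latitude/longitude space as a simple quadtree, so that each point receives a unique cell index whose prefixes identify its coarser parent cells; the coordinates in the example are arbitrary.

    ```python
    # Toy illustration of hierarchical cell indexing: a lat/lon quadtree where
    # coarser cells are prefixes of finer ones, so aggregation is a prefix match.
    def cell_index(lat, lon, resolution):
        """Return a quadtree-style cell id for a point at the given resolution."""
        lat_min, lat_max, lon_min, lon_max = -90.0, 90.0, -180.0, 180.0
        digits = []
        for _ in range(resolution):
            lat_mid = (lat_min + lat_max) / 2
            lon_mid = (lon_min + lon_max) / 2
            quad = (2 if lat >= lat_mid else 0) + (1 if lon >= lon_mid else 0)
            digits.append(str(quad))
            lat_min, lat_max = (lat_mid, lat_max) if lat >= lat_mid else (lat_min, lat_mid)
            lon_min, lon_max = (lon_mid, lon_max) if lon >= lon_mid else (lon_min, lon_mid)
        return "".join(digits)

    print(cell_index(35.68, 139.69, 6))   # a point at resolution 6
    print(cell_index(35.68, 139.69, 3))   # its resolution-3 parent (a prefix)
    ```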

  10. A New Global Open Source Marine Hydrocarbon Emission Site Database

    NASA Astrophysics Data System (ADS)

    Onyia, E., Jr.; Wood, W. T.; Barnard, A.; Dada, T.; Qazzaz, M.; Lee, T. R.; Herrera, E.; Sager, W.

    2017-12-01

    Hydrocarbon emission sites (e.g. seeps) discharge large volumes of fluids and gases into the oceans that are not only important for biogeochemical budgets, but also support abundant chemosynthetic communities. Documenting the locations of modern emissions is a first step towards understanding and monitoring how they affect the global state of the seafloor and oceans. Currently, no global open source (i.e. non-proprietary) detailed maps of emission sites are available. As a solution, we have created a database that is housed within an Excel spreadsheet and use the latest versions of Earthpoint and Google Earth for position coordinate conversions and data mapping, respectively. To date, approximately 1,000 data points have been collected from referenceable sources across the globe, and we are continually expanding the dataset. Due to the variety of spatial extents encountered, to identify each site we used two different methods: 1) point (x, y, z) locations for individual sites; and 2) delineation of areas where sites are clustered. Certain well-known areas, such as the Gulf of Mexico and the Mediterranean Sea, have a greater abundance of information; whereas significantly less information is available in other regions due to the absence of emission sites, lack of data, or because the existing data is proprietary. Although the geographical extent of the data is currently restricted to regions where the most data is publicly available, as the database matures, we expect to have more complete coverage of the world's oceans. This database is an information resource that consolidates and organizes the existing literature on hydrocarbons released into the marine environment, thereby providing a comprehensive reference for future work. We expect that the availability of seafloor hydrocarbon emission maps will benefit scientific understanding of hydrocarbon-rich areas as well as potentially aiding hydrocarbon exploration and environmental impact assessments.
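
    The sketch below is a hedged illustration of the spreadsheet-to-Google-Earth step described above: reading exported point records from a CSV file and writing a minimal KML file of placemarks. The file name and the column names (name, lat, lon) are hypothetical, not the database's actual schema.

    ```python
    # Hedged sketch: convert exported point records (CSV) into a minimal KML file.
    import csv

    def csv_to_kml(csv_path, kml_path):
        placemarks = []
        with open(csv_path, newline="") as fh:
            for row in csv.DictReader(fh):
                placemark = (
                    "  <Placemark><name>{name}</name>"
                    "<Point><coordinates>{lon},{lat},0</coordinates></Point>"
                    "</Placemark>"
                ).format(**row)               # assumes name, lat, lon columns
                placemarks.append(placemark)
        with open(kml_path, "w") as out:
            out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            out.write('<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n')
            out.write("\n".join(placemarks))
            out.write("\n</Document></kml>\n")

    csv_to_kml("emission_sites.csv", "emission_sites.kml")  # hypothetical files
    ```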

  11. Proteus - A Free and Open Source Sensor Observation Service (SOS) Client

    NASA Astrophysics Data System (ADS)

    Henriksson, J.; Satapathy, G.; Bermudez, L. E.

    2013-12-01

    The Earth's 'electronic skin' is becoming ever more sophisticated with a growing number of sensors measuring everything from seawater salinity levels to atmospheric pressure. To further the scientific application of this data collection effort, it is important to make the data easily available to anyone who wants to use it. Making Earth Science data readily available will allow the data to be used in new and potentially groundbreaking ways. The US National Science and Technology Council made this clear in its most recent National Strategy for Civil Earth Observations report, when it remarked that Earth observations 'are often found to be useful for additional purposes not foreseen during the development of the observation system'. On the road to this goal the Open Geospatial Consortium (OGC) is defining uniform data formats and service interfaces to facilitate the discovery and access of sensor data. This is being done through the Sensor Web Enablement (SWE) stack of standards, which include the Sensor Observation Service (SOS), Sensor Model Language (SensorML), Observations & Measurements (O&M) and Catalog Service for the Web (CSW). End-users do not have to use these standards directly, but can use smart tools that leverage and implement them. We have developed such a tool named Proteus. Proteus is an open-source sensor data discovery client. The goal of Proteus is to be a general-purpose client that can be used by anyone for discovering and accessing sensor data via OGC-based services. Proteus is a desktop client and supports a straightforward workflow for finding sensor data. The workflow takes the user through the process of selecting appropriate services, bounding boxes, observed properties, time periods and other search facets. NASA World Wind is used to display the matching sensor offerings on a map. Data from any sensor offering can be previewed in a time series. The user can download data from a single sensor offering, or download data in bulk from all
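
    As a hedged sketch of the kind of request an SOS client such as this one issues under the hood, the example below sends a KVP GetCapabilities call to a hypothetical Sensor Observation Service and lists the advertised observation offerings; exact element names can vary between SOS versions, so the parsing is deliberately loose.

    ```python
    # Hedged sketch: query an OGC SOS endpoint for its observation offerings.
    import xml.etree.ElementTree as ET
    import requests

    endpoint = "https://example.org/sos"      # hypothetical SOS endpoint
    params = {"service": "SOS", "request": "GetCapabilities",
              "AcceptVersions": "2.0.0"}
    caps = ET.fromstring(requests.get(endpoint, params=params).content)

    # List offering identifiers regardless of exact namespace prefixes.
    for elem in caps.iter():
        if elem.tag.endswith("ObservationOffering"):
            ident = next((c.text for c in elem if c.tag.endswith("identifier")), None)
            print(ident)
    ```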

  12. Geospatial Thinking of Information Professionals

    ERIC Educational Resources Information Center

    Bishop, Bradley Wade; Johnston, Melissa P.

    2013-01-01

    Geospatial thinking skills inform a host of library decisions including planning and managing facilities, analyzing service area populations, facility site location, library outlet and service point closures, as well as assisting users with their own geospatial needs. Geospatial thinking includes spatial cognition, spatial reasoning, and knowledge…

  13. An open source workflow for 3D printouts of scientific data volumes

    NASA Astrophysics Data System (ADS)

    Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.

    2013-12-01

    As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data-visualisations are helpful, yet fail to provide tactile feedback and sensory feedback on spatial orientation, as provided by tangible objects. The gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: The process starts with a physical observation, or from a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set. This data is turned into a volume representation which is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last, but crucial step, this new object has to be documented and linked to the associated metadata, and curated in long term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data-prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the Free and Open Source Geoinformatics tools GRASS GIS and Paraview. The workflow was successfully validated in various application scenarios at GFZ using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice penetrating radar soundings for planetology, and space time stacks for Tsunami model quality assessment. While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SAAS), thematic generalisation of information content and

  14. Open-Source Telemedicine Platform for Wireless Medical Video Communication

    PubMed Central

    Panayides, A.; Eleftheriou, I.; Pantziaris, M.

    2013-01-01

    An m-health system for real-time wireless communication of medical video based on open-source software is presented. The objective is to deliver a low-cost telemedicine platform which will allow for reliable remote diagnosis m-health applications such as emergency incidents, mass population screening, and medical education purposes. The performance of the proposed system is demonstrated using five atherosclerotic plaque ultrasound videos. The videos are encoded at the clinically acquired resolution, in addition to lower, QCIF, and CIF resolutions, at different bitrates, and four different encoding structures. Commercially available wireless local area network (WLAN) and 3.5G high-speed packet access (HSPA) wireless channels are used to validate the developed platform. Objective video quality assessment is based on PSNR ratings, following calibration using the variable frame delay (VFD) algorithm that removes temporal mismatch between original and received videos. Clinical evaluation is based on atherosclerotic plaque ultrasound video assessment protocol. Experimental results show that adequate diagnostic quality wireless medical video communications are realized using the designed telemedicine platform. HSPA cellular networks provide for ultrasound video transmission at the acquired resolution, while VFD algorithm utilization bridges objective and subjective ratings. PMID:23573082
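
    As a minimal sketch of the PSNR metric used above for objective video quality assessment, the example below computes PSNR per frame with numpy; the frames are synthetic 8-bit arrays standing in for aligned original and received video frames, not data from the study.

    ```python
    # Sketch of per-frame PSNR computation for objective video quality assessment.
    import numpy as np

    def psnr(original, received, peak=255.0):
        mse = np.mean((original.astype(np.float64) - received.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")                 # identical frames
        return 10.0 * np.log10(peak ** 2 / mse)

    rng = np.random.default_rng(0)
    frame = rng.integers(0, 256, (288, 352), dtype=np.uint8)      # CIF-sized frame
    noisy = np.clip(frame + rng.normal(0, 5, frame.shape), 0, 255).astype(np.uint8)
    print(f"PSNR: {psnr(frame, noisy):.2f} dB")
    ```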

  15. ERDDAP: Reducing Data Friction with an Open Source Data Platform

    NASA Astrophysics Data System (ADS)

    O'Brien, K.

    2017-12-01

    Data friction is not just an issue facing interdisciplinary research. Oftentimes, even within disciplines, significant data friction can exist. Issues of differing formats, limited metadata and non-existent machine-to-machine data access are all issues that exist within disciplines and make it that much harder for successful interdisciplinary cooperation. Therefore, reducing data friction within disciplines is a crucial first step in providing better overall collaboration. ERDDAP, an open source data platform developed at NOAA's Southwest Fisheries Center, is well poised to improve data usability and understanding and reduce data friction, both in single and multi-disciplinary research. By virtue of its ability to integrate data of varying formats and provide RESTful-based user access to data and metadata, use of ERDDAP has grown substantially throughout the ocean data community. ERDDAP also supports standards such as the DAP data protocol, the Climate and Forecast (CF) metadata conventions and the BagIt document standard for data archival. In this presentation, we will discuss the advantages of using ERDDAP as a data platform. We will also show specific use cases where utilizing ERDDAP has reduced friction within a single discipline (physical oceanography) and improved interdisciplinary collaboration as well.
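
    As a hedged sketch of ERDDAP's RESTful access pattern, the example below builds a tabledap CSV URL with variable and constraint clauses and loads it with pandas. The server, dataset identifier, and variable names are hypothetical placeholders; the units line skipped below reflects ERDDAP's usual CSV layout but should be checked against the actual response.

    ```python
    # Hedged sketch: fetch a subset of an ERDDAP tabledap dataset as CSV.
    import pandas as pd

    server = "https://example.org/erddap"            # hypothetical ERDDAP instance
    dataset = "sst_buoys"                            # hypothetical tabledap dataset
    query = (
        "time,latitude,longitude,sea_surface_temperature"
        "&time>=2020-01-01&time<=2020-01-31"
    )
    url = f"{server}/tabledap/{dataset}.csv?{query}"

    # ERDDAP's CSV output typically carries a units line after the header; skip it.
    df = pd.read_csv(url, skiprows=[1], parse_dates=["time"])
    print(df.describe())
    ```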

  16. Real-time control using open source RTOS

    NASA Astrophysics Data System (ADS)

    Irwin, Philip C.; Johnson, Richard L., Jr.

    2002-12-01

    Complex telescope systems such as interferometers tend to rely heavily on hard real-time operating systems (RTOS). It has been standard practice at NASA's Jet Propulsion Laboratory (JPL) and many other institutions to use costly commercial RTOSs and hardware. After developing a real-time toolkit for VxWorks on the PowerPC platform (dubbed RTC), the interferometry group at JPL is porting this code to the Real-Time Application Interface (RTAI), an open source RTOS that is essentially an extension to the Linux kernel. This port has the potential to reduce software and hardware costs for future projects, while increasing the level of performance. The goals of this paper are to briefly describe the RTC toolkit, highlight the successes and pitfalls of porting the toolkit from VxWorks to Linux-RTAI, and to discuss future enhancements that will be implemented as a direct result of this port. The first port of any body of code is always the most difficult since it uncovers the OS-specific calls and forces "red flags" into those portions of the code. For this reason, it has also been a huge benefit that the project chose a generic, platform-independent OS extension, ACE, and its CORBA counterpart, TAO. This port of RTC will pave the way for conversions to other environments, the most interesting of which is a non-real-time simulation environment, currently being considered by the Space Interferometry Mission (SIM) and the Terrestrial Planet Finder (TPF) Projects.

  17. An open source, wireless capable miniature microscope system

    NASA Astrophysics Data System (ADS)

    Liberti, William A., III; Perkins, L. Nathan; Leman, Daniel P.; Gardner, Timothy J.

    2017-08-01

    Objective. Fluorescence imaging through head-mounted microscopes in freely behaving animals is becoming a standard method to study neural circuit function. Flexible, open-source designs are needed to spur evolution of the method. Approach. We describe a miniature microscope for single-photon fluorescence imaging in freely behaving animals. The device is made from 3D printed parts and off-the-shelf components. These microscopes weigh less than 1.8 g, can be configured to image a variety of fluorophores, and can be used wirelessly or in conjunction with active commutators. Microscope control software, based in Swift for macOS, provides low-latency image processing capabilities for closed-loop, or BMI, experiments. Main results. Miniature microscopes were deployed in the songbird premotor region HVC (used as a proper name), in singing zebra finches. Individual neurons yield temporally precise patterns of calcium activity that are consistent over repeated renditions of song. Several cells were tracked over timescales of weeks and months, providing an opportunity to study learning related changes in HVC. Significance. 3D printed miniature microscopes, composed completely of consumer grade components, are a cost-effective, modular option for head-mounting imaging. These easily constructed and customizable tools provide access to cell-type specific neural ensembles over timescales of weeks.

  18. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags can be misplaced through mischief or lost to severe windstorms and thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built with open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The application has also been made usable on devices of different sizes: smartphones, tablets, desktops, and laptops.
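
    As a minimal illustration of the kind of record such an application might store for a single field, the sketch below builds a GeoJSON feature whose properties carry the trait and flag color; all names, coordinates, and property keys are hypothetical and do not describe FTTCloud's actual data model.

      import json

      # Hypothetical GeoJSON feature for one farm field; property keys,
      # coordinates, and values are illustrative only.
      field = {
          "type": "Feature",
          "geometry": {
              "type": "Polygon",
              "coordinates": [[
                  [-91.50, 34.70], [-91.49, 34.70],
                  [-91.49, 34.71], [-91.50, 34.71],
                  [-91.50, 34.70],
              ]],
          },
          "properties": {
              "field_name": "North 40",                    # hypothetical
              "technology": "herbicide-tolerant trait A",  # hypothetical trait
              "flag_color": "green",                       # color shown on the map
          },
      }

      print(json.dumps(field, indent=2))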

  19. Open-source telemedicine platform for wireless medical video communication.

    PubMed

    Panayides, A; Eleftheriou, I; Pantziaris, M

    2013-01-01

    An m-health system for real-time wireless communication of medical video based on open-source software is presented. The objective is to deliver a low-cost telemedicine platform which will allow for reliable remote diagnosis in m-health applications such as emergency incidents, mass population screening, and medical education. The performance of the proposed system is demonstrated using five atherosclerotic plaque ultrasound videos. The videos are encoded at the clinically acquired resolution, in addition to lower, QCIF, and CIF resolutions, at different bitrates, and four different encoding structures. Commercially available wireless local area network (WLAN) and 3.5G high-speed packet access (HSPA) wireless channels are used to validate the developed platform. Objective video quality assessment is based on PSNR ratings, following calibration using the variable frame delay (VFD) algorithm that removes temporal mismatch between original and received videos. Clinical evaluation is based on an atherosclerotic plaque ultrasound video assessment protocol. Experimental results show that wireless medical video communication of adequate diagnostic quality is achieved using the designed telemedicine platform. HSPA cellular networks provide for ultrasound video transmission at the acquired resolution, while use of the VFD algorithm bridges objective and subjective ratings.

  20. Open-Source Photometric System for Enzymatic Nitrate Quantification

    PubMed Central

    Wittbrodt, B. T.; Squires, D. A.; Walbeck, J.; Campbell, E.; Campbell, W. H.; Pearce, J. M.

    2015-01-01

    Nitrate, the most oxidized form of nitrogen, is regulated to protect people and animals from harmful levels, as there is a large overabundance due to anthropogenic factors. Widespread field testing for nitrate could begin to address the nitrate pollution problem; however, the Cadmium Reduction Method, the leading certified method to detect and quantify nitrate, demands the use of a toxic heavy metal. An alternative, the recently proposed Environmental Protection Agency Nitrate Reductase Nitrate-Nitrogen Analysis Method, eliminates this problem but requires an expensive proprietary spectrophotometer. The development of an inexpensive, portable, handheld photometer will greatly expedite field nitrate analysis to combat pollution. To accomplish this goal, a methodology is presented for the design, development, and technical validation of an improved open-source water testing platform capable of performing the Nitrate Reductase Nitrate-Nitrogen Analysis Method. This approach is evaluated for its potential to i) eliminate the need for toxic chemicals in water testing for nitrate and nitrite, ii) reduce the cost of the equipment needed to perform this measurement of water quality, and iii) make the method easier to carry out in the field. The device is able to perform as well as commercial proprietary systems for less than 15% of the cost for materials. This allows for greater access to the technology and the new, safer nitrate testing technique. PMID:26244342
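
    The core computation such a photometer performs is small. As a generic illustration (not code from the paper), the sketch below converts detector intensity readings to a nitrate-nitrogen concentration using the Beer-Lambert law; the intensity values, molar absorptivity, and path length are hypothetical.

      import math

      def absorbance(i_sample: float, i_blank: float) -> float:
          """Beer-Lambert absorbance A = log10(I_blank / I_sample)."""
          return math.log10(i_blank / i_sample)

      def concentration(a: float, epsilon: float, path_cm: float) -> float:
          """Concentration c = A / (epsilon * l), in the units implied by epsilon."""
          return a / (epsilon * path_cm)

      # Hypothetical detector readings (arbitrary units) for the colored
      # reaction product, and a hypothetical absorptivity in L mg^-1 cm^-1.
      A = absorbance(i_sample=412.0, i_blank=890.0)
      c = concentration(A, epsilon=0.25, path_cm=1.0)
      print(f"absorbance = {A:.3f}, nitrate-N = {c:.2f} mg/L (illustrative)")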

  1. Open-Source Software for Modeling of Nanoelectronic Devices

    NASA Technical Reports Server (NTRS)

    Oyafuso, Fabiano; Hua, Hook; Tisdale, Edwin; Hart, Don

    2004-01-01

    The Nanoelectronic Modeling 3-D (NEMO 3-D) computer program has been upgraded to open-source status through elimination of license-restricted components. The present version functions equivalently to the version reported in "Software for Numerical Modeling of Nanoelectronic Devices" (NPO-30520), NASA Tech Briefs, Vol. 27, No. 11 (November 2003), page 37. To recapitulate: NEMO 3-D performs numerical modeling of the electronic transport and structural properties of a semiconductor device that has overall dimensions of the order of tens of nanometers. The underlying mathematical model represents the quantum-mechanical behavior of the device resolved to the atomistic level of granularity. NEMO 3-D solves the applicable quantum matrix equation on a Beowulf-class cluster computer by use of a parallel-processing matrix vector multiplication algorithm coupled to a Lanczos and/or Rayleigh-Ritz algorithm that solves for eigenvalues. A prior upgrade of NEMO 3-D incorporated a capability for a strain treatment, parameterized for bulk material properties of GaAs and InAs, for two tight-binding submodels. NEMO 3-D has been demonstrated in atomistic analyses of effects of disorder in alloys and, in particular, in bulk In(x)Ga(1-x)As and in In(0.6)Ga(0.4)As quantum dots.

  2. Open source tools for standardized privacy protection of medical images

    NASA Astrophysics Data System (ADS)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHIs should be completely removed from the images according to the respective privacy regulations, but some basic, non-identifying data are usually required for accurate image interpretation. Our objective is to utilize and enhance the standardized DICOM de- and re-identification specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values that remain reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit), utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets the privacy requirements of offline and online sharing environments and relies fully on standards-based methods.
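
    The paper's tools are built on DCMTK; purely as a language-neutral illustration of the same idea (replacing identifying attributes with neutral but still-usable values), the sketch below uses the pydicom library instead. The tag subset, replacement values, and file paths are assumptions, not the standardized profile the paper implements.

      import pydicom

      def deidentify(in_path: str, out_path: str, pseudonym: str) -> None:
          """Replace a small, illustrative subset of identifying DICOM attributes.

          A conformant de-identification profile covers many more attributes;
          this sketch only shows the replace-rather-than-blank approach.
          """
          ds = pydicom.dcmread(in_path)

          # Replace demographics with values that keep the study indexable.
          ds.PatientName = pseudonym
          ds.PatientID = pseudonym
          if "PatientBirthDate" in ds:
              ds.PatientBirthDate = ds.PatientBirthDate[:4] + "0101"  # keep year only

          # Remove attributes not needed for interpretation.
          for keyword in ("PatientAddress", "OtherPatientIDs", "ReferringPhysicianName"):
              if keyword in ds:
                  delattr(ds, keyword)

          ds.remove_private_tags()
          ds.save_as(out_path)

      deidentify("study/slice001.dcm", "deid/slice001.dcm", "CASE-0001")  # hypothetical paths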

  3. Nektar++: An open-source spectral/hp element framework

    NASA Astrophysics Data System (ADS)

    Cantwell, C. D.; Moxey, D.; Comerford, A.; Bolis, A.; Rocco, G.; Mengaldo, G.; De Grazia, D.; Yakovlev, S.; Lombard, J.-E.; Ekelschot, D.; Jordi, B.; Xu, H.; Mohamied, Y.; Eskilsson, C.; Nelson, B.; Vos, P.; Biotto, C.; Kirby, R. M.; Sherwin, S. J.

    2015-07-01

    Nektar++ is an open-source software framework designed to support the development of high-performance scalable solvers for partial differential equations using the spectral/hp element method. High-order methods are gaining prominence in several engineering and biomedical applications due to their improved accuracy over low-order techniques at reduced computational cost for a given number of degrees of freedom. However, their proliferation is often limited by their complexity, which makes these methods challenging to implement and use. Nektar++ is an initiative to overcome this limitation by encapsulating the mathematical complexities of the underlying method within an efficient C++ framework, making the techniques more accessible to the broader scientific and industrial communities. The software supports a variety of discretisation techniques and implementation strategies, supporting methods research as well as application-focused computation, and the multi-layered structure of the framework allows the user to embrace as much or as little of the complexity as they need. The libraries capture the mathematical constructs of spectral/hp element methods, while the associated collection of pre-written PDE solvers provides out-of-the-box application-level functionality and a template for users who wish to develop solutions for addressing questions in their own scientific domains.

  4. Special population planner 4 : an open source release.

    SciTech Connect

    Kuiper, J.; Metz, W.; Tanzman, E.

    2008-01-01

    Emergencies like Hurricane Katrina and the recent California wildfires underscore the critical need to meet the complex challenge of planning for individuals with special needs and for institutionalized special populations. People with special needs and special populations often have difficulty responding to emergencies or taking protective actions, and emergency responders may be unaware of their existence and situations during a crisis. Special Population Planner (SPP) is an ArcGIS-based emergency planning system released as an open source product. SPP provides for easy production of maps, reports, and analyses to develop and revise emergency response plans. It includes tools to manage a voluntary registry of data for people with special needs, integrated links to plans and documents, tools for response planning and analysis, preformatted reports and maps, and data on locations of special populations, facility and resource characteristics, and contacts. The system can be readily adapted for new settings without programming and is broadly applicable. Full documentation and a demonstration database are included in the release.

  5. A global, open-source database of flood protection standards

    NASA Astrophysics Data System (ADS)

    Scussolini, Paolo; Aerts, Jeroen; Jongman, Brenden; Bouwer, Laurens; Winsemius, Hessel; de Moel, Hans; Ward, Philip

    2016-04-01

    Accurate flood risk estimation is pivotal in that it enables risk-informed policies in disaster risk reduction, as emphasized in the recent Sendai Framework for Disaster Risk Reduction. To improve our understanding of flood risk, models are now capable of providing actionable risk information on the (sub)global scale. Still, the accuracy of their results is greatly limited by the lack of information on the flood protection standards actually in place, and researchers therefore make large assumptions about the extent of protection. With our work we propose a first global, open-source database of FLOod PROtection Standards, FLOPROS, covering a range of spatial scales. FLOPROS is structured in three layers of information, merged into one consistent database: 1) the Design layer contains empirical information about the standard of protection presently in place; 2) the Policy layer contains intended protection standards from normative documents; 3) the Model layer uses a validated numerical approach to calculate protection standards for areas not covered in the other layers. The FLOPROS database can be used for more accurate risk assessment exercises across scales. As the database should be continually updated to reflect new interventions, we invite researchers and practitioners to contribute information. Further, we look for partners within the risk community to participate in additional strategies to improve the amount and accuracy of information contained in this first version of FLOPROS.

  6. Open-Source Photometric System for Enzymatic Nitrate Quantification.

    PubMed

    Wittbrodt, B T; Squires, D A; Walbeck, J; Campbell, E; Campbell, W H; Pearce, J M

    2015-01-01

    Nitrate, the most oxidized form of nitrogen, is regulated to protect people and animals from harmful levels, as there is a large overabundance due to anthropogenic factors. Widespread field testing for nitrate could begin to address the nitrate pollution problem; however, the Cadmium Reduction Method, the leading certified method to detect and quantify nitrate, demands the use of a toxic heavy metal. An alternative, the recently proposed Environmental Protection Agency Nitrate Reductase Nitrate-Nitrogen Analysis Method, eliminates this problem but requires an expensive proprietary spectrophotometer. The development of an inexpensive, portable, handheld photometer will greatly expedite field nitrate analysis to combat pollution. To accomplish this goal, a methodology is presented for the design, development, and technical validation of an improved open-source water testing platform capable of performing the Nitrate Reductase Nitrate-Nitrogen Analysis Method. This approach is evaluated for its potential to i) eliminate the need for toxic chemicals in water testing for nitrate and nitrite, ii) reduce the cost of the equipment needed to perform this measurement of water quality, and iii) make the method easier to carry out in the field. The device is able to perform as well as commercial proprietary systems for less than 15% of the cost for materials. This allows for greater access to the technology and the new, safer nitrate testing technique.

  7. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focusing on evolving code and adding process elements only as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  8. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focusing on evolving code and adding process elements only as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion.

  9. Tessera: Open source software for accelerated data science

    SciTech Connect

    Sego, Landon H.; Hafen, Ryan P.; Director, Hannah M.

    2014-06-30

    Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the “front-lines” of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches—and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack extensibility and the flexibility required to address the nuances of the demanding and dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.

  10. RF Wave Simulation Using the MFEM Open Source FEM Package

    NASA Astrophysics Data System (ADS)

    Stillerman, J.; Shiraiwa, S.; Bonoli, P. T.; Wright, J. C.; Green, D. L.; Kolev, T.

    2016-10-01

    A new plasma wave simulation environment based on the finite element method is presented. MFEM, a scalable open-source FEM library, is used as the basis for this capability. MFEM allows for assembling an FEM matrix of arbitrarily high order in a parallel computing environment. A 3D frequency domain RF physics layer was implemented using a Python wrapper for MFEM, and a cold collisional plasma model was ported. This physics layer allows for defining the plasma RF wave simulation model without user knowledge of the FEM weak-form formulation. A graphical user interface is built on πScope, a Python-based scientific workbench, such that a user can build a model definition file interactively. Benchmark cases have been ported to this new environment, with results being consistent with those obtained using COMSOL Multiphysics, GENRAY, and the TORIC/TORLH spectral solvers. This work is a first step in bringing the sophisticated computational tool suite that MFEM provides (e.g., adaptive mesh refinement, solver suite, element types) to bear on the linear plasma-wave interaction problem, and in using it within more complicated integrated workflows, such as coupling with a core spectral solver or incorporating additional physics such as an RF sheath potential model or kinetic effects. USDoE Awards DE-FC02-99ER54512, DE-FC02-01ER54648.

  11. Acquire: an open-source comprehensive cancer biobanking system.

    PubMed

    Dowst, Heidi; Pew, Benjamin; Watkins, Chris; McOwiti, Apollo; Barney, Jonathan; Qu, Shijing; Becnel, Lauren B

    2015-05-15

    The probability of effective treatment of cancer with a targeted therapeutic can be improved for patients with defined genotypes containing actionable mutations. To this end, many human cancer biobanks are integrating more tightly with genomic sequencing facilities and with those creating and maintaining patient-derived xenografts (PDX) and cell lines to provide renewable resources for translational research. To support the complex data management needs and workflows of several such biobanks, we developed Acquire. It is a robust, secure, web-based, database-backed open-source system that supports all major needs of a modern cancer biobank. Its modules allow for i) up-to-the-minute 'scoreboard' and graphical reporting of collections; ii) end user roles and permissions; iii) specimen inventory through caTissue Suite; iv) shipping forms for distribution of specimens to pathology, genomic analysis and PDX/cell line creation facilities; v) robust ad hoc querying; vi) molecular and cellular quality control metrics to track specimens' progress and quality; vii) public researcher request; viii) resource allocation committee distribution request review and oversight and ix) linkage to available derivatives of specimen. © The Author 2015. Published by Oxford University Press.

  12. An open source simulation model for soil and sediment bioturbation.

    PubMed

    Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin

    2011-01-01

    Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches restricts the application of such models to a narrow range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement and a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted to experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data are routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and comparison with a commonly used model attest to the predictive power of the approach.

  13. What makes computational open source software libraries successful?

    NASA Astrophysics Data System (ADS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  14. An Open Source Simulation Model for Soil and Sediment Bioturbation

    PubMed Central

    Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin

    2011-01-01

    Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches restricts the application of such models to a narrow range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement and a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted to experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data are routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and comparison with a commonly used model attest to the predictive power of the approach. PMID:22162997

  15. Slicer Method Comparison Using Open-source 3D Printer

    NASA Astrophysics Data System (ADS)

    Ariffin, M. K. A. Mohd; Sukindar, N. A.; Baharudin, B. T. H. T.; Jaafar, C. N. A.; Ismail, M. I. S.

    2018-01-01

    The open-source 3D printer has become one of the popular choices for fabricating 3D models. This technology is easily accessible and low in cost. However, several studies have been made to improve the performance of this low-cost technology in terms of the accuracy of the part finish. This study focuses on the selection of slicer software, comparing CuraEngine and Slic3r. The effect of the slicer has been observed in terms of accuracy and surface visualization. The results show that if accuracy is the top priority, CuraEngine is the better option, as it yields higher accuracy and requires less filament than Slic3r. Slic3r may be very useful for complicated parts such as overhanging structures, because the excess material it deposits acts as support material. The study provides a basic platform for users to decide which option to use when fabricating a 3D model.

  16. Open-source software platform for medical image segmentation applications

    NASA Astrophysics Data System (ADS)

    Namías, R.; D'Amato, J. P.; del Fresno, M.

    2017-11-01

    Segmenting 2D and 3D images is a crucial and challenging problem in medical image analysis. Although several image segmentation algorithms have been proposed for different applications, no universal method currently exists. Moreover, their use is usually limited when detection of complex and multiple adjacent objects of interest is needed. In addition, the continually increasing volumes of medical imaging scans require more efficient segmentation software design and highly usable applications. In this context, we present an extension of our previous segmentation framework which allows the combination of existing explicit deformable models in an efficient and transparent way, handling simultaneously different segmentation strategies and interacting with a graphical user interface (GUI). We present the object-oriented design and the general architecture, which consists of two layers: the GUI at the top layer, and the processing core filters at the bottom layer. We apply the framework to segmenting different real-case medical image scenarios on publicly available datasets, including bladder and prostate segmentation from 2D MRI and heart segmentation in 3D CT. Our experiments on these concrete problems show that this framework facilitates complex and multi-object segmentation goals while providing a fast-prototyping open-source segmentation tool.

  17. Evaluation of Open-Source Hard Real Time Software Packages

    NASA Technical Reports Server (NTRS)

    Mattei, Nicholas S.

    2004-01-01

    replacing this somewhat costly implementation is the focus of one of the SA group's current research projects. The explosion of open source software in the last ten years has led to the development of a multitude of software solutions which were once only produced by major corporations. The benefits of these open projects include faster release and bug-patching cycles as well as inexpensive if not free software solutions. The main packages for hard real-time solutions under Linux are Real Time Application Interface (RTAI) and two varieties of Real Time Linux (RTL), RTLFree and RTLPro. During my time here at NASA I have been testing various hard real-time solutions operating as layers on the Linux operating system. All testing is being run on an Intel SBC 2590, which is a common embedded hardware platform. The test plan was provided to me by the Software Assurance group at the start of my internship and my job has been to test the systems by developing and executing the test cases on the hardware. These tests are constructed so that the Software Assurance group can get hard test data for a comparison between the open source and proprietary implementations of hard real-time solutions.

  18. EHDViz: clinical dashboard development using open-source technologies.

    PubMed

    Badgeley, Marcus A; Shameer, Khader; Glicksberg, Benjamin S; Tomlinson, Max S; Levin, Matthew A; McCormick, Patrick J; Kasarskis, Andrew; Reich, David L; Dudley, Joel T

    2016-03-24

    To design, develop and prototype clinical dashboards to integrate high-frequency health and wellness data streams using interactive and real-time data visualisation and analytics modalities. We developed a clinical dashboard development framework called electronic healthcare data visualization (EHDViz) toolkit for generating web-based, real-time clinical dashboards for visualising heterogeneous biomedical, healthcare and wellness data. The EHDViz is an extensible toolkit that uses R packages for data management, normalisation and producing high-quality visualisations over the web using R/Shiny web server architecture. We have developed use cases to illustrate utility of EHDViz in different scenarios of clinical and wellness setting as a visualisation aid for improving healthcare delivery. Using EHDViz, we prototyped clinical dashboards to demonstrate the contextual versatility of EHDViz toolkit. An outpatient cohort was used to visualise population health management tasks (n=14,221), and an inpatient cohort was used to visualise real-time acuity risk in a clinical unit (n=445), and a quantified-self example using wellness data from a fitness activity monitor worn by a single individual was also discussed (n-of-1). The back-end system retrieves relevant data from data source, populates the main panel of the application and integrates user-defined data features in real-time and renders output using modern web browsers. The visualisation elements can be customised using health features, disease names, procedure names or medical codes to populate the visualisations. The source code of EHDViz and various prototypes developed using EHDViz are available in the public domain at http://ehdviz.dudleylab.org. Collaborative data visualisations, wellness trend predictions, risk estimation, proactive acuity status monitoring and knowledge of complex disease indicators are essential components of implementing data-driven precision medicine. As an open-source visualisation

  19. EHDViz: clinical dashboard development using open-source technologies

    PubMed Central

    Badgeley, Marcus A; Shameer, Khader; Glicksberg, Benjamin S; Tomlinson, Max S; Levin, Matthew A; McCormick, Patrick J; Kasarskis, Andrew; Reich, David L; Dudley, Joel T

    2016-01-01

    -driven precision medicine. As an open-source visualisation framework capable of integrating health assessment, EHDViz aims to be a valuable toolkit for rapid design, development and implementation of scalable clinical data visualisation dashboards. PMID:27013597

  20. The geospatial data quality REST API for primary biodiversity data

    PubMed Central

    Otegui, Javier; Guralnick, Robert P.

    2016-01-01

    Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340

  1. The geospatial data quality REST API for primary biodiversity data.

    PubMed

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

    We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under the GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
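
    Because the service is plain HTTP with JSON responses, a quality check can be scripted in a few lines. The sketch below is hedged: the base URL comes from the abstract, but the query parameter names (chosen to mirror Darwin Core terms) are assumptions for illustration, not the documented interface.

      import requests

      BASE = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"

      # Hypothetical occurrence record; parameter names are assumed, not verified.
      record = {
          "decimalLatitude": "42.331",
          "decimalLongitude": "-71.121",
          "countryCode": "US",
      }

      resp = requests.get(BASE, params=record, timeout=30)
      resp.raise_for_status()

      report = resp.json()   # the service exchanges JSON, per the abstract
      print(report)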

  2. A novel algorithm for fully automated mapping of geospatial ontologies

    NASA Astrophysics Data System (ADS)

    Chaabane, Sana; Jaziri, Wassim

    2018-01-01

    Geospatial information is collected from different sources, which makes spatial ontologies built for the same geographic domain heterogeneous; different and heterogeneous conceptualizations may therefore coexist. Ontology integration helps create a common repository of geospatial ontologies and allows the heterogeneities between existing ontologies to be removed. Ontology mapping is a process used in ontology integration and consists in finding correspondences between the source ontologies. This paper deals with the mapping process for geospatial ontologies, applying an automated algorithm to find correspondences between concepts according to the definitions of the matching relationships. The proposed algorithm, called the "geographic ontologies mapping algorithm", defines three types of mapping: semantic, topological and spatial.
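
    As a generic illustration of one ingredient of such a process (not the paper's algorithm), the sketch below computes candidate semantic correspondences from normalized label similarity between the concepts of two ontologies; the concept labels and threshold are hypothetical.

      from difflib import SequenceMatcher

      def label_similarity(a: str, b: str) -> float:
          """Normalized string similarity between two concept labels."""
          return SequenceMatcher(None, a.lower(), b.lower()).ratio()

      def semantic_mappings(source, target, threshold):
          """Return (source, target, score) pairs whose similarity clears the threshold."""
          pairs = []
          for s in source:
              best = max(target, key=lambda t: label_similarity(s, t))
              score = label_similarity(s, best)
              if score >= threshold:
                  pairs.append((s, best, round(score, 2)))
          return pairs

      # Hypothetical concept labels from two geospatial ontologies.
      onto_a = ["River", "Road Segment", "Built-up Area"]
      onto_b = ["Watercourse", "RoadSegment", "BuiltUpArea"]
      print(semantic_mappings(onto_a, onto_b, threshold=0.6))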

  3. When Free Isn't Free: The Realities of Running Open Source in School

    ERIC Educational Resources Information Center

    Derringer, Pam

    2009-01-01

    Despite the last few years' growth in awareness of open-source software in schools and the potential savings it represents, its widespread adoption is still hampered. Randy Orwin, technology director of the Bainbridge Island School District in Washington State and a strong open-source advocate, cautions that installing an open-source…

  4. A flexible open-source toolkit for lava flow simulations

    NASA Astrophysics Data System (ADS)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long-term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large number of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. The toolkit is open-source and written in Python, which allows users to adapt the code to their needs and to combine the included models in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001), which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows a corrective factor to be included so that the lava can overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function, or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are here used to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows. The
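
    To make the probabilistic steepest-slope idea concrete, the sketch below simulates a single stochastic flow path on a synthetic DEM, choosing each next cell with probability proportional to the downhill drop and adding a small corrective height so minor obstacles can be overtopped. It is written in the spirit of the VORIS approach described above, not taken from the toolkit, and all parameters are synthetic.

      import numpy as np

      rng = np.random.default_rng(42)

      def lava_path(dem, start, steps=50, hc=0.0):
          """One stochastic lava path on a DEM (probabilistic steepest slope).

          hc is a corrective height added to the current cell so that small
          topographic obstacles or pits can be overcome.
          """
          nrow, ncol = dem.shape
          r, c = start
          path = [start]
          for _ in range(steps):
              cells, weights = [], []
              for dr in (-1, 0, 1):
                  for dc in (-1, 0, 1):
                      if dr == 0 and dc == 0:
                          continue
                      rr, cc = r + dr, c + dc
                      if 0 <= rr < nrow and 0 <= cc < ncol:
                          drop = (dem[r, c] + hc) - dem[rr, cc]
                          if drop > 0:
                              cells.append((rr, cc))
                              weights.append(drop)
              if not cells:          # local pit: the flow stops
                  break
              weights = np.array(weights) / np.sum(weights)
              r, c = cells[rng.choice(len(cells), p=weights)]
              path.append((r, c))
          return path

      # Synthetic DEM sloping towards the lower-right corner, with small noise.
      x, y = np.meshgrid(np.arange(50), np.arange(50))
      dem = 100.0 - 0.5 * (x + y) + rng.normal(0.0, 0.2, size=x.shape)
      print(lava_path(dem, start=(5, 5), steps=30)[:10])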

  5. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  6. Grid computing enhances standards-compatible geospatial catalogue service

    NASA Astrophysics Data System (ADS)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for the Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkit are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share and

  7. Alaska Geospatial Council

    Science.gov Websites

    The Alaska Geospatial Council was established by Governor Bill Walker on July 29, 2015. Since that date, additional members have been added, including federal representation on the Executive Committee. News and updates: the Geospatial Data Act of 2017, a bill re-introduced to the Senate.

  8. Automated Geospatial Watershed Assessment

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool is a Geographic Information Systems (GIS) interface jointly developed by the U.S. Environmental Protection Agency, the U.S. Department of Agriculture (USDA) Agricultural Research Service, and the University of Arizona to a...

  9. Leveraging the geospatial advantage

    Treesearch

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  10. Application of polar orbiter products in weather forecasting using open source tools and open standards

    NASA Astrophysics Data System (ADS)

    Plieger, Maarten; de Vreede, Ernst

    2015-04-01

    EUMETSAT disseminates data for a number of polar satellites. At KNMI these data are not fully used for operational weather forecasting, mainly because of the irregular coverage and the lack of tools for handling these different types of data and products. For weather forecasting there is a lot of interest in the application of products from these polar orbiters. One of the key aspects is the high resolution of these products, which can complement the information provided by numerical weather forecasts. Another advantage over geostationary satellites is the high coverage at higher latitudes and the lack of parallax. Products like the VIIRS day-night band offer many possibilities for this application. This presentation will describe a project that aims to make a number of products from polar satellites available to the forecasting operation. The goal of the project is to enable easy and timely access to polar orbiter products and to enable combined presentations of satellite imagery with model data. The system will be able to generate RGB composites (“false colour images”) for operational use. The system will be built using open source components and open standards. Pytroll components are used for data handling, reprojection and derived product generation. For interactive presentation of imagery the browser-based ADAGUC WMS viewer component is used. Image generation is done by ADAGUC server components, which provide OGC WMS services. Polar satellite products are stored as true color RGBA data in the NetCDF file format; the satellite swaths are stored as regular grids with their own custom geographical projection. The ADAGUC WMS system is able to reproject, render and combine these data interactively in a web browser. Results and lessons learned will be presented at the conference.
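
    For readers unfamiliar with the Pytroll stack, the sketch below shows roughly how a polar-orbiter composite can be produced with Satpy (a Pytroll component). The reader name, granule file patterns, dataset name, and target area are assumptions that depend on the actual product and local configuration.

      from glob import glob
      from satpy import Scene

      # Hypothetical VIIRS SDR granules; reader and file patterns are assumptions.
      files = glob("/data/viirs/SVDNB*.h5") + glob("/data/viirs/GDNBO*.h5")

      scn = Scene(reader="viirs_sdr", filenames=files)
      scn.load(["DNB"])                                  # day/night band

      # Resample to a regular grid and save an image for the WMS-based viewer;
      # the area name is an assumption taken from Satpy's bundled area definitions.
      resampled = scn.resample("euro4", resampler="nearest")
      resampled.save_dataset("DNB", filename="dnb_composite.png")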

  11. Matlab Geochemistry: An open source geochemistry solver based on MRST

    NASA Astrophysics Data System (ADS)

    McNeece, C. J.; Raynaud, X.; Nilsen, H.; Hesse, M. A.

    2017-12-01

    The study of geological systems often requires the solution of complex geochemical relations. To address this need, we present an open source geochemical solver based on the Matlab Reservoir Simulation Toolbox (MRST) developed by SINTEF. The implementation supports non-isothermal multicomponent aqueous complexation, surface complexation, ion exchange, and dissolution/precipitation reactions. The suite of tools available in MRST allows for rapid model development, in particular the incorporation of geochemical calculations into transport simulations of multiple phases, complex domain geometry and geomechanics. Different numerical schemes and additional physics can be easily incorporated into the existing tools through the object-oriented framework employed by MRST. The solver leverages the automatic differentiation tools available in MRST to solve arbitrarily complex geochemical systems with any choice of species or element concentration as input. Four mathematical approaches make the solver quite robust: 1) the choice of chemical elements as the basis components makes all entries in the composition matrix positive, thus preserving convexity; 2) a log variable transformation is used, which transfers the nonlinearity to the convex composition matrix; 3) a priori bounds on variables are calculated from the structure of the problem, constraining Newton's path; and 4) an initial guess is calculated implicitly by sequentially adding model complexity. As a benchmark we compare the model to experimental and semi-analytic solutions of the coupled salinity-acidity transport system. Together with the reservoir simulation capabilities of MRST, the solver offers a promising tool for geochemical simulations in reservoir domains for applications in a diversity of fields from enhanced oil recovery to radionuclide storage.
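
    To illustrate the log-variable and bounded-Newton ideas described above on a textbook case (and only as a sketch in Python rather than the MATLAB/MRST implementation itself), the snippet below solves the speciation of a monoprotic acid in water by Newton iteration on x = log10[H+], clipping the iterate to a priori bounds. The constants and total concentration are illustrative.

      import numpy as np

      Ka = 10 ** -4.76   # illustrative acid dissociation constant
      Kw = 1.0e-14       # water ion product
      CT = 1.0e-3        # total acid, mol/L

      def residual(x: float) -> float:
          """Charge balance in terms of x = log10[H+]: h - Kw/h - CT*Ka/(Ka+h) = 0."""
          h = 10.0 ** x
          return h - Kw / h - CT * Ka / (Ka + h)

      # Newton iteration on the log variable; the log form keeps [H+] positive,
      # and the clip enforces a priori bounds (pH 0 to 14) on the Newton path.
      x = -7.0   # initial guess: neutral pH
      for _ in range(50):
          f = residual(x)
          dfdx = (residual(x + 1e-6) - f) / 1e-6   # numerical derivative
          step = f / dfdx
          x = float(np.clip(x - step, -14.0, 0.0))
          if abs(step) < 1e-12:
              break

      print(f"pH = {-x:.2f}")   # about 3.9 for these illustrative values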

  12. OpenCFU, a new free and open-source software to count cell colonies and other circular objects.

    PubMed

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net.
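
    OpenCFU's own detection pipeline is more elaborate and robust; purely as a generic illustration of counting circular objects in an image, the sketch below uses OpenCV's Hough circle transform. The input file name and all parameter values are hypothetical and would need tuning for real plate images.

      import cv2

      # Hypothetical plate image; parameters below are illustrative only.
      img = cv2.imread("plate.jpg", cv2.IMREAD_GRAYSCALE)
      img = cv2.medianBlur(img, 5)          # suppress speckle noise

      circles = cv2.HoughCircles(
          img,
          cv2.HOUGH_GRADIENT,
          dp=1.2,        # accumulator resolution ratio
          minDist=15,    # minimum distance between colony centres (pixels)
          param1=100,    # Canny edge threshold
          param2=30,     # accumulator threshold; lower finds more circles
          minRadius=5,
          maxRadius=40,
      )

      count = 0 if circles is None else circles.shape[1]
      print(f"detected {count} circular objects")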

  13. OpenCFU, a New Free and Open-Source Software to Count Cell Colonies and Other Circular Objects

    PubMed Central

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net. PMID:23457446

  14. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    PubMed

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  15. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

    Innovation through an open source model has proven to be successful for software development. This success has led many to speculate whether open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents.

  16. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate whether open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  17. The Infusion of Dust Model Outputs into Public Health Decision Making - an Examination of Differential Adoption of SOAP and Open Geospatial Consortium Service Products into Public Health Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Benedict, K. K.

    2008-12-01

    Since 2004 the Earth Data Analysis Center, in collaboration with researchers at the University of Arizona and George Mason University and with funding from NASA, has been developing a service-oriented architecture (SOA) that acquires remote sensing, meteorological forecast, and observed ground-level particulate data (EPA AirNow) from NASA, NOAA, and DataFed through a variety of standards-based service interfaces. These acquired data are used to initialize and set boundary conditions for the execution of the Dust Regional Atmospheric Model (DREAM) to generate daily 48-hour dust forecasts, which are then published via a combination of Open Geospatial Consortium (OGC) services (WMS and WCS), basic HTTP request-based services, and SOAP services. The goal of this work has been to develop services that can be integrated into existing public health decision support systems (DSS) to provide enhanced environmental data (i.e. ground surface particulate concentration estimates) for use in epidemiological analysis, public health warning systems, and syndromic surveillance systems. While the project has succeeded in deploying these products into the target systems, there has been differential adoption of the different service interface products, with the simple OGC and HTTP interfaces generating much greater interest from DSS developers and researchers than the more complex SOAP service interfaces. This paper reviews the SOA developed as part of this project and provides insights into how different service models may have a significant impact on the infusion of Earth science products into decision-making processes and systems.
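
    The adoption gap noted above is easy to appreciate from the client side: an OGC WMS GetMap request is a single parameterized HTTP GET, whereas SOAP requires envelope construction and service stubs. The sketch below builds such a request for a dust-forecast layer; the endpoint URL and layer name are hypothetical placeholders.

      import requests

      WMS_URL = "https://example.org/wms"   # hypothetical WMS endpoint

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.3.0",
          "REQUEST": "GetMap",
          "LAYERS": "dream_pm10_surface",     # hypothetical forecast layer
          "STYLES": "",
          "CRS": "EPSG:4326",
          "BBOX": "31.0,-115.0,37.5,-102.0",  # lat/lon order for EPSG:4326 in WMS 1.3.0
          "WIDTH": "800",
          "HEIGHT": "400",
          "FORMAT": "image/png",
      }

      resp = requests.get(WMS_URL, params=params, timeout=60)
      resp.raise_for_status()
      with open("dust_forecast.png", "wb") as f:
          f.write(resp.content)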

  18. THOR: an open-source exo-GCM

    NASA Astrophysics Data System (ADS)

    Grosheintz, Luc; Mendonça, João; Käppeli, Roger; Lukas Grimm, Simon; Mishra, Siddhartha; Heng, Kevin

    2015-12-01

    implicit GCM. By ESS3, I hope to present results for the advection equation. THOR is part of the Exoclimes Simulation Platform (ESP), a set of open-source community codes for simulating and understanding the atmospheres of exoplanets. The ESP also includes tools for radiative transfer and retrieval (HELIOS), an opacity calculator (HELIOS-K), and a chemical kinetics solver (VULCAN). We expect to publicly release an initial version of THOR in 2016 on www.exoclime.org.

  19. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to the representation of evolving surfaces as a series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in the GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability.
    Figure 1. Isosurfaces representing the evolution of the shoreline and the z=4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
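
    As a rough illustration of the kind of scripted workflow described above, the sketch below chains GRASS GIS modules through the grass.script Python API to turn two lidar surveys into DEMs and a difference surface. The module choices (r.in.lidar, r.mapcalc, r.univar), file names, and parameters are assumptions for illustration and do not reproduce the authors' coastal toolbox.

```python
# Minimal sketch, assuming it runs inside an existing GRASS GIS 7 session
# (location and region already configured) and that the r.in.lidar module
# is available. File names and parameters are illustrative only.
import grass.script as gs

def lidar_to_dem(las_file, dem_name):
    """Bin a lidar point cloud into a raster DEM using the mean elevation per cell."""
    gs.run_command("r.in.lidar", input=las_file, output=dem_name,
                   method="mean", flags="e", overwrite=True)

# Two hypothetical surveys of the same coastal site.
lidar_to_dem("survey_1997.las", "dem_1997")
lidar_to_dem("survey_2011.las", "dem_2011")

# Raster algebra: elevation change between the two surveys.
gs.mapcalc("dem_diff = dem_2011 - dem_1997", overwrite=True)

# Simple summary statistics of the change surface.
stats = gs.parse_command("r.univar", map="dem_diff", flags="g")
print("mean elevation change:", stats["mean"])
```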

  20. Towards open collaborative health informatics - The Role of free/libre open source principles. Contribution of the IMIA Open Source Health Informatics Working Group.

    PubMed

    Karopka, T; Schmuhl, H; Marcelo, A; Molin, J Dal; Wright, G

    2011-01-01

    To analyze the contribution of Free/Libre Open Source Software in health care (FLOSS-HC) and to give perspectives for future developments. The paper summarizes FLOSS-related trends in health care as anticipated by members of the IMIA Open Source Working Group. Data were obtained through a literature review and the authors' personal experience and observations over the last two decades. The status quo is illustrated by a frequency analysis of the Medfloss.org database, one of the world's largest platforms dedicated to FLOSS-HC. The authors discuss current problems in the field of health care and finally give a prospective roadmap, a projection of the potential influence of FLOSS in health care. FLOSS-HC has existed for more than two decades. Several projects have shown that FLOSS may produce highly competitive alternatives to proprietary solutions that are at least equivalent in usability and have a better total cost of ownership. The Medfloss.org database currently lists 221 projects of diverse application types. FLOSS principles hold great potential for addressing several of the most critical problems in health care IT. The authors argue that an ecosystem perspective is relevant and that FLOSS principles are best suited to creating health IT systems that are able to evolve over time as medical knowledge, technologies, insights, workflows, etc. continuously change. All these factors that inherently influence the development of health IT systems are changing at an ever-growing pace. Traditional models of software engineering are not able to follow these changes and provide up-to-date systems at an acceptable cost/value ratio. To allow FLOSS to positively influence health IT in the future, a "FLOSS-friendly" environment has to be provided. Policy makers should resolve uncertainties in the legal framework that disfavor FLOSS. Certification procedures should be specified in a way that does not raise additional barriers for FLOSS.

  1. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    SciTech Connect

    Pabian, Frank V

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation-relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'Digital Virtual Globes' (i.e., GoogleEarth, Virtual Earth, etc.), which are far better than the previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest, and which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of states' site declarations and input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open source', such virtual globes can also provide a new, essentially free, means to conduct broad-area searches for undeclared nuclear sites and activities - either alleged through open source leads; identified on Internet blogs and wiki layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), which can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant and non-relevant facilities and their

  2. OpenMebius: an open source software for isotopically nonstationary 13C-based metabolic flux analysis.

    PubMed

    Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi

    2014-01-01

    The in vivo measurement of metabolic flux by (13)C-based metabolic flux analysis ((13)C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a (13)C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas (13)C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary (13)C metabolic flux analysis (INST-(13)C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms in stationary phase). Here, the development of a novel open source software package for INST-(13)C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) can autogenerate metabolic models for simulating isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-(13)C-MFA. Confidence intervals determined by INST-(13)C-MFA were narrower than those determined by conventional methods, indicating the potential of INST-(13)C-MFA for precise metabolic flux analysis. OpenMebius is an open source software package for the general application of INST-(13)C-MFA.
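
    The estimation step described above, nonlinear fitting of a metabolic model to measured labeling enrichments, can be illustrated with a toy two-branch network; the network, enrichment values, and measurements below are invented for illustration and are unrelated to the OpenMebius implementation.

```python
# Toy sketch (not the OpenMebius algorithm): estimate branch fluxes by
# nonlinear least-squares fitting of simulated labeling enrichment to
# hypothetical measurements.
import numpy as np
from scipy.optimize import least_squares

# Fractional 13C enrichment contributed by each branch of a toy two-branch network.
E_BRANCH = np.array([0.95, 0.20])

# Hypothetical measurements: MS-derived enrichment of the pooled product and
# the total substrate uptake rate (both invented for this illustration).
MEASURED_ENRICHMENT = 0.62
MEASURED_UPTAKE = 10.0

def residuals(fluxes):
    """Difference between the toy model's predictions and the measurements."""
    v = np.asarray(fluxes)
    simulated_enrichment = np.dot(v, E_BRANCH) / v.sum()
    return [simulated_enrichment - MEASURED_ENRICHMENT,
            v.sum() - MEASURED_UPTAKE]

fit = least_squares(residuals, x0=[5.0, 5.0], bounds=(1e-6, np.inf))
print("estimated branch fluxes:", fit.x)   # roughly [5.6, 4.4]
```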

  3. Geospatial intelligence workforce

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2013-02-01

    A report on the future U.S. workforce for geospatial intelligence, requested by the U.S. National Geospatial-Intelligence Agency (NGA), found that the agency—which hires about 300 scientists and analysts annually—is probably finding sufficient experts to fill the needs in all of its core areas, with the possible exception of geographic information systems (GIS) and remote sensing. The report by the U.S. National Research Council, released on 25 January, noted that competition for GIS applications analysts is strong. While there appear to be enough cartographers, photogrammetrists, and geodesists to meet NGA's current needs in those core areas, the report cautioned that future shortages in these areas seem likely because of a relatively small number of graduates.

  4. Foreword to the theme issue on geospatial computer vision

    NASA Astrophysics Data System (ADS)

    Wegner, Jan Dirk; Tuia, Devis; Yang, Michael; Mallet, Clement

    2018-06-01

    Geospatial Computer Vision has become one of the most prevalent emerging fields of investigation in Earth Observation in the last few years. In this theme issue, we aim to showcase a number of works at the interface between remote sensing, photogrammetry, image processing, computer vision and machine learning. In light of recent sensor developments - both from the ground and from above - an unprecedented (and ever-growing) quantity of geospatial data is available for tackling challenging and urgent tasks such as environmental monitoring (deforestation, carbon sequestration, climate change mitigation), disaster management, autonomous driving or the monitoring of conflicts. The new bottleneck for serving these applications is the extraction of relevant information from such large amounts of multimodal data. This includes sources, stemming from multiple sensors, that exhibit distinct physical natures and heterogeneous quality, spatial, spectral and temporal resolutions. They are as diverse as multi-/hyperspectral satellite sensors, color cameras on drones, laser scanning devices, existing open land-cover geodatabases and social media. Such core data processing is mandatory so as to generate semantic land-cover maps, accurate detection and trajectories of objects of interest, as well as by-products of superior added value: georeferenced data, images with enhanced geometric and radiometric qualities, or Digital Surface and Elevation Models.

  5. MITK-OpenIGTLink for combining open-source toolkits in real-time computer-assisted interventions.

    PubMed

    Klemm, Martin; Kirchner, Thomas; Gröhl, Janek; Cheray, Dominique; Nolden, Marco; Seitel, Alexander; Hoppe, Harald; Maier-Hein, Lena; Franz, Alfred M

    2017-03-01

    Due to rapid developments in the research areas of medical imaging, medical image processing and robotics, computer-assisted interventions (CAI) are becoming an integral part of modern patient care. From a software engineering point of view, these systems are highly complex and research can benefit greatly from reusing software components. This is supported by a number of open-source toolkits for medical imaging and CAI such as the medical imaging interaction toolkit (MITK), the public software library for ultrasound imaging research (PLUS) and 3D Slicer. An independent inter-toolkit communication protocol such as the open image-guided therapy link (OpenIGTLink) can be used to combine the advantages of these toolkits and enable an easier realization of a clinical CAI workflow. MITK-OpenIGTLink is presented as a network interface within MITK that allows easy-to-use, asynchronous two-way messaging between MITK and clinical devices or other toolkits. Performance and interoperability tests with MITK-OpenIGTLink were carried out considering the whole CAI workflow from data acquisition through processing to visualization. We present how MITK-OpenIGTLink can be applied in different usage scenarios. In performance tests, tracking data were transmitted with a frame rate of up to 1000 Hz and a latency of 2.81 ms. Transmission of images with typical ultrasound (US) and greyscale high-definition (HD) resolutions of [Formula: see text] and [Formula: see text] is possible at up to 512 and 128 Hz, respectively. With the integration of OpenIGTLink into MITK, this protocol is now supported by all established open-source toolkits in the field. This eases interoperability between MITK and toolkits such as PLUS or 3D Slicer and facilitates cross-toolkit research collaborations. MITK and its submodule MITK-OpenIGTLink are provided open source under a BSD-style licence (http://mitk.org).

  6. High performance geospatial and climate data visualization using GeoJS

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations:
    • To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics
    • To develop an extensible library that can combine data from multiple sources and render them using multiple backends
    • To build a library that works well with existing scientific visualization tools such as VTK
    We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project, funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring

  7. Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.

    PubMed

    Benson, Tim

    2016-07-04

    Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers and how it can work as a business model in health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.

  8. Geospatial Data Stream Processing in Python Using FOSS4G Components

    NASA Astrophysics Data System (ADS)

    McFerren, G.; van Zyl, T.

    2016-06-01

    One viewpoint of current and future IT systems holds that there is an increase in the scale and velocity at which data are acquired and analysed from heterogeneous, dynamic sources. In the earth observation and geoinformatics domains, this process is driven by the increase in the number and types of devices that report location and the proliferation of assorted sensors, from satellite constellations to oceanic buoy arrays. Many of these data will be encountered as self-contained messages on data streams - continuous, infinite flows of data. Spatial analytics over data streams concerns the search for spatial and spatio-temporal relationships within and amongst data "on the move". In spatial databases, queries can access a store of data to unpack spatial relationships; this is not the case on streams, where spatial relationships need to be established with the incomplete data available. Methods for spatially based indexing, filtering, joining and transforming of streaming data need to be established and implemented in software components. This article describes the usage patterns and performance metrics of a number of well-known FOSS4G Python software libraries within the data stream processing paradigm. In particular, we consider the RTree library for spatial indexing, the Shapely library for geometric processing and transformation, and the PyProj library for projection and geodesic calculations over streams of geospatial data. We introduce a message-oriented, Python-based geospatial data streaming framework called Swordfish, which provides data stream processing primitives, functions, transports and a common data model for describing messages, based on the Open Geospatial Consortium Observations and Measurements (O&M) and Unidata Common Data Model (CDM) standards. We illustrate how the geospatial software components are integrated with the Swordfish framework. Furthermore, we describe the tight temporal constraints under which geospatial functionality can be invoked when
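
    To make the stream-processing pattern concrete, the sketch below filters a stream of point messages against an area of interest using the three libraries named above. The message layout, coordinates, and area of interest are invented for illustration and are not part of the Swordfish framework.

```python
# Minimal sketch, assuming the rtree, shapely and pyproj packages are installed.
# Messages, coordinates and the area of interest are invented for illustration.
from rtree import index
from shapely.geometry import Point, box
from pyproj import Transformer

# Project incoming WGS84 coordinates to Web Mercator before indexing.
to_mercator = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)

# A static area of interest, kept in a spatial index so many AOIs could be used.
aoi = box(1_500_000, 4_000_000, 1_600_000, 4_100_000)
aoi_index = index.Index()
aoi_index.insert(0, aoi.bounds)

def process(message_stream):
    """Yield only the messages whose location falls inside the area of interest."""
    for msg in message_stream:
        x, y = to_mercator.transform(msg["lon"], msg["lat"])
        pt = Point(x, y)
        # Coarse filter via the R-tree, then an exact geometric test.
        if any(True for _ in aoi_index.intersection(pt.bounds)) and aoi.contains(pt):
            yield msg

stream = iter([{"id": 1, "lon": 13.6, "lat": 34.2},
               {"id": 2, "lon": 0.0, "lat": 0.0}])
for hit in process(stream):
    print("in AOI:", hit["id"])
```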

  9. Open chemistry registry and mapping platform based on open source cheminformatics toolkits (ACS Fall meeting)

    EPA Science Inventory

    The Open PHACTS project (openphacts.org) is a European initiative, constituting a public–private partnership to enable easier, cheaper and faster drug discovery [1]. The project is supported by the Open PHACTS Foundation (www.openphactsfoundation.org) and funded by contributions f...

  10. Infrastructure for the Geospatial Web

    NASA Astrophysics Data System (ADS)

    Lake, Ron; Farley, Jim

    Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to make better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered by the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.
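
    Since GML is presented here as an atomic infrastructure component, it may help to see what a minimal GML-encoded feature looks like. The sketch below assembles one with the Python standard library; the feature type, property name, and coordinates are invented, and only the gml:Point and gml:pos elements and the GML namespace follow the standard.

```python
# A minimal sketch of a GML-encoded feature, built with the standard library.
# The feature type, property names and coordinates are invented; only the
# GML namespace and the Point/pos elements follow the GML specification.
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML)

feature = ET.Element("ObservationStation")           # hypothetical feature type
name = ET.SubElement(feature, "stationName")
name.text = "Station-42"

point = ET.SubElement(feature, f"{{{GML}}}Point", srsName="EPSG:4326")
pos = ET.SubElement(point, f"{{{GML}}}pos")
pos.text = "45.42 -75.69"                            # lat lon

print(ET.tostring(feature, encoding="unicode"))
```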

  11. Cooking up an open source EMR for developing countries: OpenMRS - a recipe for successful collaboration.

    PubMed

    Mamlin, Burke W; Biondich, Paul G; Wolfe, Ben A; Fraser, Hamish; Jazayeri, Darius; Allen, Christian; Miranda, Justin; Tierney, William M

    2006-01-01

    Millions of people continue to die each year from HIV/AIDS. The majority of infected persons (>95%) live in the developing world. A worthy response to this pandemic will require coordinated, scalable, and flexible information systems. We describe the OpenMRS system, an open source, collaborative effort that can serve as a foundation for EMR development in developing countries. We report our progress to date, lessons learned, and future directions.

  12. The Implications of Incumbent Intellectual Property Strategies for Open Source Software Success and Commercialization

    ERIC Educational Resources Information Center

    Wen, Wen

    2012-01-01

    While open source software (OSS) emphasizes open access to the source code and avoids the use of formal appropriability mechanisms, there has been little understanding of how the existence and exercise of formal intellectual property rights (IPR) such as patents influence the direction of OSS innovation. This dissertation seeks to bridge this gap…

  13. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    ERIC Educational Resources Information Center

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  14. Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library

    ERIC Educational Resources Information Center

    Fagan, Jody Condit; Keach, Jennifer A.

    2010-01-01

    When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…

  15. Perceptions of Open Source versus Commercial Software: Is Higher Education Still on the Fence?

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2007-01-01

    This exploratory study investigated the perceptions of technology and academic decision-makers about open source benefits and risks versus commercial software applications. The study also explored reactions to a concept for outsourcing campus-wide deployment and maintenance of open source. Data collected from telephone interviews were analyzed,…

  16. Open-Source Learning Management Systems: A Predictive Model for Higher Education

    ERIC Educational Resources Information Center

    van Rooij, S. Williams

    2012-01-01

    The present study investigated the role of pedagogical, technical, and institutional profile factors in an institution of higher education's decision to select an open-source learning management system (LMS). Drawing on the results of previous research that measured patterns of deployment of open-source software (OSS) in US higher education and…

  17. An Evaluation of Open Source Learning Management Systems According to Administration Tools and Curriculum Design

    ERIC Educational Resources Information Center

    Ozdamli, Fezile

    2007-01-01

    Distance education is becoming more important in the universities and schools. The aim of this research is to evaluate currently existing Open Source Learning Management Systems according to administration tools and curriculum design. For this, seventy-two Open Source Learning Management Systems have been subjected to a general evaluation. After…

  18. Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments

    ERIC Educational Resources Information Center

    Wang, Shuo; Wang, Jing; Gao, Yanjing

    2017-01-01

    An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…
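
    As a hedged illustration of what simulating one of the named techniques involves (and not of the package's own algorithm), the sketch below runs an explicit finite-difference model of a diffusion-limited potential step, the textbook chronoamperometry case, and compares the resulting current density with the Cottrell equation. All parameter values are illustrative.

```python
# Toy sketch of a chronoamperometry simulation via explicit finite differences.
import numpy as np

D = 1e-9            # diffusion coefficient (m^2/s), illustrative value
c_bulk = 1.0        # bulk concentration (mol/m^3), illustrative value
n, F = 1, 96485.0   # electrons transferred, Faraday constant (C/mol)

dx, dt = 1e-6, 4e-4              # grid spacing (m) and time step (s)
assert D * dt / dx**2 <= 0.5     # stability criterion for the explicit scheme

c = np.full(400, c_bulk)         # concentration profile normal to the electrode
times, currents = [], []
for step in range(1, 2501):
    c[0] = 0.0                   # diffusion-limited condition at the surface
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    currents.append(n * F * D * (c[1] - c[0]) / dx)  # Fick's first law at x = 0
    times.append(step * dt)

# The simulated transient should track the Cottrell equation, i = n F c sqrt(D/(pi t)).
t = np.array(times)
cottrell = n * F * c_bulk * np.sqrt(D / (np.pi * t))
print("simulated / Cottrell current density at t = 1 s:", currents[-1] / cottrell[-1])
```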

  19. An Embedded Systems Course for Engineering Students Using Open-Source Platforms in Wireless Scenarios

    ERIC Educational Resources Information Center

    Rodriguez-Sanchez, M. C.; Torrado-Carvajal, Angel; Vaquero, Joaquin; Borromeo, Susana; Hernandez-Tamames, Juan A.

    2016-01-01

    This paper presents a case study analyzing the advantages and disadvantages of using project-based learning (PBL) combined with collaborative learning (CL) and industry best practices, integrated with information communication technologies, open-source software, and open-source hardware tools, in a specialized microcontroller and embedded systems…

  20. Open-Source Intelligence in the Czech Military: Knowledge System and Process Design

    DTIC Science & Technology

    2002-06-01

    ... in Open-Source Intelligence. OSINT, as one of the intelligence disciplines, bears some of the general problems of the intelligence "business" ... ADAPTING KNOWLEDGE MANAGEMENT THEORY TO THE CZECH MILITARY INTELLIGENCE: Knowledge work is the core business of military intelligence. As ... NAVAL POSTGRADUATE SCHOOL, Monterey, California. THESIS. Approved for public release; distribution is unlimited. OPEN-SOURCE INTELLIGENCE IN THE ...

  1. Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software

    ERIC Educational Resources Information Center

    Hemphill, Thomas A.

    2005-01-01

    This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…

  2. 76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...

  3. Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.

    ERIC Educational Resources Information Center

    Newby, Gregory B.; Greenberg, Jane; Jones, Paul

    2003-01-01

    Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)

  4. Looking toward the Future: A Case Study of Open Source Software in the Humanities

    ERIC Educational Resources Information Center

    Quamen, Harvey

    2006-01-01

    In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…

  5. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    ERIC Educational Resources Information Center

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30-year history that dates to the open hacker community at the Massachusetts…

  6. Sketching Up New Geographies: Open Sourcing and Curriculum Development

    ERIC Educational Resources Information Center

    Boyd, William; Ellis, David

    2013-01-01

    The functionality of web 2.0 technologies has caused academics to rethink their development of teaching and learning methods and approaches. The editable, open access nature of web 2.0 encourages the innovative collaboration of ideas, the creation of equitable visual and tactile learning environments, and opportunity for academics to develop…

  7. Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.

    2007-12-01

    The value of real-time hydrologic data dissemination, including river stage, streamflow, and precipitation, for operational stormwater management is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts to provide unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g., WaterML and the Observations Data Model) for users to access the data (e.g., HIS and HydroSeek). Geospatial Web technology, including OGC Sensor Web Enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, makes it possible to interact with a digital watershed in near real time. OGC SWE proposes a revolutionary concept for web-connected and controllable sensor networks. However, these efforts have not provided the capability for dynamic data integration and fusion among heterogeneous sources, data filtering, or support for workflows and domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework that extends SWE with an open source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such a framework, in which multiple agencies' sensor data and hydro-model outputs (with map layers) will be integrated and disseminated in a geospatial browser (e.g., Microsoft Virtual Earth). This is a collaborative project among NCSA, USGS Illinois Water
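
    The pull-transform-publish pattern at the heart of the proposed framework can be sketched in a few lines. The endpoint URL, field names, and in-memory queue below are illustrative stand-ins; they are not the project's actual ESB configuration or data model.

```python
# Minimal sketch of the pull -> transform -> publish pattern, assuming a
# hypothetical JSON observation endpoint. A queue.Queue stands in for the
# enterprise service bus; field names are invented for illustration.
import queue
import requests

bus = queue.Queue()

def pull_observations(url):
    """Fetch raw observations from one agency's (hypothetical) service."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

def to_common_model(raw):
    """Map an agency-specific record onto a shared observation model."""
    return {
        "station_id": raw["site"],
        "phenomenon": raw["parameter"],    # e.g. river stage, precipitation
        "value": float(raw["reading"]),
        "time": raw["timestamp"],
        "lat": raw["latitude"],
        "lon": raw["longitude"],
    }

def publish(url):
    for raw in pull_observations(url):
        bus.put(to_common_model(raw))

# A downstream consumer (e.g. a geospatial browser feed) would drain the bus:
# while True: render(bus.get())
```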

  8. Considerations on Geospatial Big Data

    NASA Astrophysics Data System (ADS)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  9. What an open source clinical trial community can learn from hackers

    PubMed Central

    Dunn, Adam G.; Day, Richard O.; Mandl, Kenneth D.; Coiera, Enrico

    2014-01-01

    Summary Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. Since a similar gap has already been addressed in the software industry by the open source software movement, we examine how the social and technical principles of the movement can be used to guide the growth of an open source clinical trial community. PMID:22553248

  10. 75 FR 6056 - National Geospatial Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY: Office of the Secretary, Interior. ACTION: Notice of renewal of National Geospatial Advisory Committee... renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations...

  11. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  12. Sticks AND Carrots: Encouraging Open Science at its source.

    PubMed

    Leonelli, Sabina; Spichtinger, Daniel; Prainsack, Barbara

    2015-06-30

    The Open Science (OS) movement has been seen as an important facilitator for public participation in science. This has been underpinned by the assumption that widespread and free access to research outputs leads to (i) better and more efficient science, (ii) economic growth, in particular for small and medium-sized enterprises wishing to capitalise on research findings and (iii) increased transparency of knowledge production and its outcomes. The latter in particular could function as a catalyst for public participation and engagement. Whether OS is likely to help realise these benefits, however, will depend on the emergence of systemic incentives for scientists to utilise OS in a meaningful manner. While some areas, such as the environmental sciences, have a long tradition of an open ethos, citizen inclusion and global collaboration, such activities need to be more systematically supported and promoted by funders and learned societies in order to improve scientific research and public participation.

  13. Geospatial Data Science Research Staff | Geospatial Data Science | NREL

    Science.gov Websites

    Staff directory for the NREL Geospatial Data Science group, listing researchers and specialists with their titles, email addresses, and phone numbers.

  14. US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY GEOSPATIAL SOLUTIONS

    EPA Science Inventory

    This presentation will discuss the history, strategy, products, and future plans of the EPA Geospatial Quality Council (GQC). A topical review of GQC products will be presented including:

    o Guidance for Geospatial Data Quality Assurance Project Plans.

    o GPS - Tec...

  15. Online Resources to Support Professional Development for Managing and Preserving Geospatial Data

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2013-12-01

    Improved capabilities of information and communication technologies (ICT) enable the development of new systems and applications for collecting, managing, disseminating, and using scientific data. New knowledge, skills, and techniques are also being developed to leverage these new ICT capabilities and improve scientific data management practices throughout the entire data lifecycle. In light of these developments and in response to increasing recognition of the wider value of scientific data for society, government agencies are requiring plans for the management, stewardship, and public dissemination of data and research products that are created by government-funded studies. Recognizing that data management and dissemination have not been part of traditional science education programs, new educational programs and learning resources are being developed to prepare new and practicing scientists, data scientists, data managers, and other data professionals with skills in data science and data management. Professional development and training programs also are being developed to address the need for scientists and professionals to improve their expertise in using the tools and techniques for managing and preserving scientific data. The Geospatial Data Preservation Resource Center offers an online catalog of various open access publications, open source tools, and freely available information for the management and stewardship of geospatial data and related resources, such as maps, GIS, and remote sensing data. Containing over 500 resources that can be found by type, topic, or search query, the geopreservation.org website enables discovery of various types of resources to improve capabilities for managing and preserving geospatial data. Applications and software tools can be found for use online or for download. Online journal articles, presentations, reports, blogs, and forums are also available through the website. Available education and training materials include

  16. Kinota: An Open-Source NoSQL implementation of OGC SensorThings for large-scale high-resolution real-time environmental monitoring

    NASA Astrophysics Data System (ADS)

    Miles, B.; Chepudira, K.; LaBar, W.

    2017-12-01

    The Open Geospatial Consortium (OGC) SensorThings API (STA) specification, ratified in 2016, is a next-generation open standard for enabling real-time communication of sensor data. Building on over a decade of OGC Sensor Web Enablement (SWE) standards, STA offers a rich data model that can represent a range of sensor and phenomenon types (e.g., fixed sensors sensing fixed phenomena, fixed sensors sensing moving phenomena, mobile sensors sensing fixed phenomena, and mobile sensors sensing moving phenomena) and is data agnostic. Additionally, and in contrast to previous SWE standards, STA is developer-friendly, as is evident from its convenient JSON serialization and expressive OData-based query language (with support for geospatial queries); with its support for the Message Queuing Telemetry Transport (MQTT) protocol, STA is also well suited to efficient real-time data publishing and discovery. All these attributes make STA attractive for environmental monitoring sensor networks. Here we present Kinota(TM), an open-source NoSQL implementation of OGC SensorThings for large-scale, high-resolution, real-time environmental monitoring. Kinota, which roughly stands for Knowledge from Internet of Things Analyses, relies on Cassandra as its underlying data store, a horizontally scalable, fault-tolerant open-source database that is often used to store time-series data for Big Data applications (though integration with other NoSQL or relational databases is possible). With this foundation, Kinota can scale to store data from an arbitrary number of sensors collecting data every 500 milliseconds. Additionally, the Kinota architecture is highly modular, allowing adopters to replace parts of the existing implementation when desirable. The architecture is also highly portable, providing the flexibility to choose among cloud providers such as Azure, Amazon, and Google. The scalable, flexible and cloud-friendly architecture of Kinota makes it ideal for use in next
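
    To give a sense of the developer-friendliness claimed for STA, the sketch below retrieves recent observations from a hypothetical SensorThings service using standard OData query options; the service URL and Datastream id are assumptions, while the entity names and query parameters follow the OGC SensorThings API specification.

```python
# Minimal sketch of querying a SensorThings API service with OData options.
# The service URL and Datastream id are hypothetical.
import requests

STA_ROOT = "https://example.org/SensorThings/v1.0"

params = {
    "$filter": "phenomenonTime ge 2017-01-01T00:00:00Z",
    "$orderby": "phenomenonTime desc",
    "$top": "100",
}
resp = requests.get(f"{STA_ROOT}/Datastreams(42)/Observations", params=params,
                    timeout=30)
resp.raise_for_status()

for obs in resp.json()["value"]:
    print(obs["phenomenonTime"], obs["result"])

# For push-style delivery, the same collection can be watched over MQTT by
# subscribing to a topic such as "v1.0/Datastreams(42)/Observations".
```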

  17. Open Source Web Tool for Tracking in a Lowcost Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Fissore, F.; Pirotti, F.; Vettore, A.

    2017-11-01

    alternative solution to other more expensive MMSs. The first objective of this paper is to report on the development of a prototype MMS for the collection of geospatial data, based on the assembly of low-cost sensors managed through a web interface developed using open source libraries. The main goal is to provide a system accessible to any type of user and flexible enough to accommodate upgrades or the introduction of new sensor models and versions. After a presentation of the hardware components used in our system, a more detailed description of the software developed for managing the MMS will be provided, which is the innovative part of the project. In line with the worldwide demand for big data to be available through the web from anywhere in the world (Pirotti et al., 2011), the proposed solution allows data to be retrieved from a web interface (Figure 4). This is part of a project for the development of a new web infrastructure at the University of Padua (which will also be available to external users) intended to ease collaboration between researchers from different areas. Finally, strengths, weaknesses and future developments of the low-cost MMS are discussed.

  18. Auscope: Australian Earth Science Information Infrastructure using Free and Open Source Software

    NASA Astrophysics Data System (ADS)

    Woodcock, R.; Cox, S. J.; Fraser, R.; Wyborn, L. A.

    2013-12-01

    Since 2005 the Australian Government has supported a series of initiatives providing researchers with access to major research facilities and information networks necessary for world-class research. Starting with the National Collaborative Research Infrastructure Strategy (NCRIS), the Australian earth science community established an integrated national geoscience infrastructure system called AuScope. AuScope is now in operation, providing a number of components to assist in understanding the structure and evolution of the Australian continent. These include the acquisition of subsurface imaging, earth composition and age analysis, a virtual drill core library, geological process simulation, and a high-resolution geospatial reference framework. To draw together information from across the earth science community in academia, industry and government, AuScope includes a nationally distributed information infrastructure. Free and Open Source Software (FOSS) has been a significant enabler in building the AuScope community and providing a range of interoperable services for accessing data and scientific software. A number of FOSS components have been created, adopted or upgraded to create a coherent, OGC-compliant Spatial Information Services Stack (SISS). SISS is now deployed at all Australian Geological Surveys, many universities and the CSIRO. Comprising a set of OGC catalogue and data services, and augmented with new vocabulary and identifier services, the SISS provides a comprehensive package for organisations to contribute their data to the AuScope network. This packaging and a variety of software testing and documentation activities enabled greater trust and notably reduced barriers to adoption. FOSS selection was important, not only for technical capability and robustness, but also for appropriate licensing and community models to ensure sustainability of the infrastructure in the long term. Government agencies were sensitive to these issues and Au
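
    As an illustration of how a client might discover datasets through the kind of OGC catalogue services bundled in the SISS, the sketch below queries a hypothetical CSW endpoint with OWSLib; the catalogue URL and search term are assumptions, not AuScope's actual services.

```python
# Minimal sketch of discovering datasets through an OGC catalogue service
# using OWSLib's CSW client. The catalogue URL and search term are hypothetical.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.org/geonetwork/srv/eng/csw")
query = PropertyIsLike("csw:AnyText", "%geophysics%")
csw.getrecords2(constraints=[query], maxrecords=10)

# Print the identifiers and titles of the matching metadata records.
for rec_id, rec in csw.records.items():
    print(rec_id, rec.title)
```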

  19. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  20. Edaq530: A Transparent, Open-End and Open-Source Measurement Solution in Natural Science Education

    ERIC Educational Resources Information Center

    Kopasz, Katalin; Makra, Peter; Gingl, Zoltan

    2011-01-01

    We present Edaq530, a low-cost, compact and easy-to-use digital measurement solution consisting of a thumb-sized USB-to-sensor interface and measurement software. The solution is fully open-source, our aim being to provide a viable alternative to professional solutions. Our main focus in designing Edaq530 has been versatility and transparency. In…