Sample records for open source cross-platform

  1. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.
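
    A minimal sketch of a single flux-balance step using SciPy's linear programming, illustrating the kind of computation this record describes; the toy stoichiometric matrix, bounds and objective are invented for illustration and are not COMETS's model or API.

    # Minimal flux balance analysis (FBA) step: maximize a biomass flux subject to
    # steady-state mass balance S.v = 0 and flux bounds. Toy network, not a COMETS model.
    import numpy as np
    from scipy.optimize import linprog

    # Rows: metabolites A, B; columns: reactions (uptake, conversion, biomass)
    S = np.array([[1.0, -1.0,  0.0],
                  [0.0,  1.0, -1.0]])
    bounds = [(0, 10.0),   # uptake limited by nutrient availability
              (0, None),   # internal conversion
              (0, None)]   # biomass production

    # linprog minimizes, so negate the biomass coefficient to maximize it.
    c = np.array([0.0, 0.0, -1.0])
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
    print("growth flux:", -res.fun, "flux vector:", res.x)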

  2. OpenSesame: an open-source, graphical experiment builder for the social sciences.

    PubMed

    Mathôt, Sebastiaan; Schreij, Daniel; Theeuwes, Jan

    2012-06-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality, such as support for eyetrackers, input devices, and video playback, is available through plug-ins. OpenSesame can be used in combination with existing software for creating experiments.

  3. Implementation of a near-real time cross-border web-mapping platform on airborne particulate matter (PM) concentration with open-source software

    NASA Astrophysics Data System (ADS)

    Knörchen, Achim; Ketzler, Gunnar; Schneider, Christoph

    2015-01-01

    Although Europe has been growing together for the past decades, cross-border information platforms on environmental issues are still scarce. With regard to the establishment of a web-mapping tool on airborne particulate matter (PM) concentration for the Euregio Meuse-Rhine located in the border region of Belgium, Germany and the Netherlands, this article describes the research on methodical and technical backgrounds implementing such a platform. An open-source solution was selected for presenting the data in a Web GIS (OpenLayers/GeoExt; both JavaScript-based), applying other free tools for data handling (Python), data management (PostgreSQL), geo-statistical modelling (Octave), geoprocessing (GRASS GIS/GDAL) and web mapping (MapServer). The multilingual, made-to-order online platform provides access to near-real time data on PM concentration as well as additional background information. In an open data section, commented configuration files for the Web GIS client are being made available for download. Furthermore, all geodata generated by the project is being published under public domain and can be retrieved in various formats or integrated into Desktop GIS as Web Map Services (WMS).
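
    Since the platform publishes its layers as OGC Web Map Services, a short sketch of how such a layer could be fetched programmatically; the host, layer name and bounding box below are placeholders, not the project's actual endpoint.

    # Build and download a WMS GetMap request using only the standard library.
    from urllib.parse import urlencode
    from urllib.request import urlretrieve

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "pm10_interpolated",   # hypothetical layer name
        "SRS": "EPSG:4326",
        "BBOX": "5.5,50.3,6.6,51.1",     # rough Euregio Meuse-Rhine extent
        "WIDTH": "800",
        "HEIGHT": "600",
        "FORMAT": "image/png",
    }
    url = "https://example.org/wms?" + urlencode(params)   # placeholder host
    urlretrieve(url, "pm10_map.png")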

  4. PR-PR: Cross-Platform Laboratory Automation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, G; Stawski, N; Goyal, G

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  5. PR-PR: cross-platform laboratory automation system.

    PubMed

    Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J

    2014-08-15

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  6. Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis

    PubMed Central

    2015-01-01

    Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276

  7. Architecture Design and Experimental Platform Demonstration of Optical Network based on OpenFlow Protocol

    NASA Astrophysics Data System (ADS)

    Xing, Fangyuan; Wang, Honghuan; Yin, Hongxi; Li, Ming; Luo, Shenzi; Wu, Chenguang

    2016-02-01

    With the extensive adoption of cloud computing and data centres, and with services constantly emerging, bursty big-data traffic has brought huge challenges to optical networks. Consequently, the software-defined optical network (SDON), which combines optical networks with software-defined networking (SDN), has attracted much attention. In this paper, an OpenFlow-enabled optical node for use in optical cross-connects (OXC) and reconfigurable optical add/drop multiplexers (ROADM) is proposed, and an open-source OpenFlow controller is extended with routing strategies. In addition, an experimental platform for software-defined optical networks based on the OpenFlow protocol is designed. The feasibility and availability of the OpenFlow-enabled optical nodes and the extended OpenFlow controller are validated through connectivity, protection-switching and load-balancing experiments on this test platform.
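
    As a conceptual sketch of the match-action behaviour an OpenFlow controller installs in such nodes, a plain-Python flow table with a packet-in style fallback; this only illustrates the protocol's idea and is not the extended controller described in the record.

    # Conceptual OpenFlow-style flow table: match (ingress port, wavelength) -> egress port.
    flow_table = {
        ("port1", "lambda_3"): "port4",
        ("port2", "lambda_1"): "port3",
    }

    def forward(in_port, label, controller_fallback):
        """Look up a flow entry; on a table miss, ask the controller (packet-in)."""
        out = flow_table.get((in_port, label))
        if out is None:
            out = controller_fallback(in_port, label)   # controller computes a route
            flow_table[(in_port, label)] = out          # install the new flow entry
        return out

    print(forward("port3", "lambda_2", lambda p, l: "port1"))   # miss -> controller decides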

  8. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    PubMed

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.
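
    A minimal sketch of loading an mzML file through the OpenMS Python bindings (pyOpenMS), assuming they are installed; the file name is a placeholder.

    # Load an mzML file and iterate over its spectra with pyOpenMS.
    import pyopenms

    exp = pyopenms.MSExperiment()
    pyopenms.MzMLFile().load("sample.mzML", exp)     # placeholder file name
    print("spectra:", exp.getNrSpectra())
    for spectrum in exp:
        mz, intensity = spectrum.get_peaks()         # peak arrays for this spectrum
        break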

  9. ProteoCloud: a full-featured open source proteomics cloud computing pipeline.

    PubMed

    Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart

    2013-08-02

    We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy to use and cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. RMG An Open Source Electronic Structure Code for Multi-Petaflops Calculations

    NASA Astrophysics Data System (ADS)

    Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy

    RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, is used to accelerate convergence. RMG is a cross-platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPUs per compute node and offer improved performance and scalability, enhanced accuracy, and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows and Macintosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.

  11. OpenCFU, a new free and open-source software to count cell colonies and other circular objects.

    PubMed

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net.
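
    To illustrate the task of counting circular objects, a short OpenCV sketch using the Hough circle transform; this is a generic approach, not OpenCFU's algorithm, and the file name and parameters are placeholders.

    # Count roughly circular objects (e.g. colonies) in a plate photograph.
    import cv2

    img = cv2.imread("plate.jpg")                      # placeholder file name
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                     # suppress noise before detection

    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=15,
                               param1=100, param2=30, minRadius=5, maxRadius=40)
    count = 0 if circles is None else circles.shape[1]
    print("detected colonies:", count)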

  12. Performance Assessment of Network Intrusion-Alert Prediction

    DTIC Science & Technology

    2012-09-01

    the threats. In this thesis, we use Snort to generate the intrusion detection alerts. 2. SNORT Snort is an open source network intrusion...standard for IPS. (Snort, 2012) We choose Snort because it is an open source product that is free to download and can be deployed cross-platform...Learning & prediction in relational time series: A survey. 21st Behavior Representation in Modeling & Simulation ( BRIMS ) Conference 2012, 93–100. Tan

  13. A-Track: Detecting Moving Objects in FITS images

    NASA Astrophysics Data System (ADS)

    Atay, T.; Kaplan, M.; Kilic, Y.; Karapinar, N.

    2017-04-01

    A-Track is a fast, open-source, cross-platform pipeline for detecting moving objects (asteroids and comets) in sequential telescope images in FITS format. The moving objects are detected using a modified line detection algorithm.
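
    A crude illustration of the underlying idea, flagging pixels that brighten between two aligned FITS frames with Astropy; A-Track's actual pipeline applies a modified line detection algorithm over the full sequence, and the file names below are placeholders.

    # Difference two aligned FITS frames and flag candidate moving-object pixels.
    import numpy as np
    from astropy.io import fits

    frame1 = fits.getdata("field_001.fits").astype(float)    # placeholder file names
    frame2 = fits.getdata("field_002.fits").astype(float)

    diff = frame2 - frame1
    threshold = diff.mean() + 5.0 * diff.std()                # simple 5-sigma cut
    candidates = np.argwhere(diff > threshold)                # (row, col) pixel positions
    print(f"{len(candidates)} pixels brightened between frames")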

  14. DCO-VIVO: A Collaborative Data Platform for the Deep Carbon Science Communities

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, Y.; West, P.; Erickson, J. S.; Ma, X.; Fox, P. A.

    2014-12-01

    Deep Carbon Observatory (DCO) is a decade-long scientific endeavor to understand carbon in the complex deep Earth system. Thousands of DCO scientists from institutions across the globe are organized into communities representing four domains of exploration: Extreme Physics and Chemistry, Reservoirs and Fluxes, Deep Energy, and Deep Life. Cross-community and cross-disciplinary collaboration is one of the most distinctive features of DCO's flexible research framework. VIVO is an open-source Semantic Web platform that facilitates cross-institutional researcher and research discovery. It includes a number of standard ontologies that interconnect people, organizations, publications, activities, locations, and other entities of research interest to enable browsing, searching, visualizing, and generating Linked Open (research) Data. The DCO-VIVO solution expedites research collaboration between DCO scientists and communities. Based on DCO's specific requirements, the DCO Data Science team developed a series of extensions to the VIVO platform, including extensions to the VIVO information model, extended queries over the semantic information within VIVO, integration with other open-source collaborative environments and data management systems, single sign-on, assignment of unique Handles to DCO objects, and publication and dataset ingest from existing publication systems. We present here the iterative development of these capabilities, which are now in daily use by the DCO community of scientists for research reporting, information sharing, and resource discovery in support of research activities and program management.
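
    Since VIVO exposes its Linked Open Data through SPARQL, a small sketch of querying such an endpoint with SPARQLWrapper; the endpoint URL is a placeholder and the query assumes only the standard FOAF and RDFS vocabularies.

    # Query a VIVO-style SPARQL endpoint for people and their labels.
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://example.org/vivo/sparql")   # placeholder endpoint
    sparql.setQuery("""
        SELECT ?person ?label WHERE {
            ?person a <http://xmlns.com/foaf/0.1/Person> ;
                    <http://www.w3.org/2000/01/rdf-schema#label> ?label .
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["label"]["value"])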

  15. An Extensible Sensing and Control Platform for Building Energy Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, Anthony; Berges, Mario; Martin, Christopher

    2016-04-03

    The goal of this project is to develop Mortar.io, an open-source BAS platform designed to simplify data collection, archiving, event scheduling and coordination of cross-system interactions. Mortar.io is optimized for (1) robustness to network outages, (2) ease of installation using plug-and-play and (3) scalable support for small to large buildings and campuses.

  16. OpenCFU, a New Free and Open-Source Software to Count Cell Colonies and Other Circular Objects

    PubMed Central

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net. PMID:23457446

  17. What Can OpenEI Do For You?

    ScienceCinema

    None

    2018-02-06

    Open Energy Information (OpenEI) is an open source web platform—similar to the one used by Wikipedia—developed by the US Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to make the large amounts of energy-related data and information more easily searched, accessed, and used both by people and automated machine processes. Built utilizing the standards and practices of the Linked Open Data community, the OpenEI platform is much more robust and powerful than typical web sites and databases. As an open platform, all users can search, edit, add, and access data in OpenEI for free. The user community contributes the content and ensures its accuracy and relevance; as the community expands, so does the content's comprehensiveness and quality. The data are structured and tagged with descriptors to enable cross-linking among related data sets, advanced search functionality, and consistent, usable formatting. Data input protocols and quality standards help ensure the content is structured and described properly and derived from a credible source. Although DOE/NREL is developing OpenEI and seeding it with initial data, it is designed to become a true community model with millions of users, a large core of active contributors, and numerous sponsors.

  18. Implementing Open Source Platform for Education Quality Enhancement in Primary Education: Indonesia Experience

    ERIC Educational Resources Information Center

    Kisworo, Marsudi Wahyu

    2016-01-01

    Information and Communication Technology (ICT)-supported learning using free and open source platforms has drawn little attention, as open source initiatives have focused on secondary or tertiary education. This study investigates the possibilities of ICT-supported learning using an open source platform for primary education. The data of this study is taken…

  19. pytc: Open-Source Python Software for Global Analyses of Isothermal Titration Calorimetry Data.

    PubMed

    Duvvuri, Hiranmayi; Wheeler, Lucas C; Harms, Michael J

    2018-05-08

    Here we describe pytc, an open-source Python package for global fits of thermodynamic models to multiple isothermal titration calorimetry experiments. Key features include simplicity, the ability to implement new thermodynamic models, a robust maximum likelihood fitter, a fast Bayesian Markov-Chain Monte Carlo sampler, rigorous implementation, extensive documentation, and full cross-platform compatibility. pytc fitting can be done using an application program interface or via a graphical user interface. It is available for download at https://github.com/harmslab/pytc .
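
    To illustrate what a "global fit" means here, a conceptual SciPy sketch in which two synthetic titrations share the same binding parameters but keep individual baselines; the toy model and data are invented and this is not pytc's API.

    # Conceptual global fit: two datasets share (K, dH) but have their own baselines.
    import numpy as np
    from scipy.optimize import least_squares

    x = np.linspace(0.1, 3.0, 30)                     # molar-ratio axis
    def model(x, K, dH, baseline):                    # toy binding-heat model
        return dH * (K * x / (1.0 + K * x)) + baseline

    rng = np.random.default_rng(0)
    data1 = model(x, 2.0, -10.0, 0.3) + rng.normal(0, 0.1, x.size)
    data2 = model(x, 2.0, -10.0, -0.2) + rng.normal(0, 0.1, x.size)

    def residuals(p):                                 # p = [K, dH, baseline1, baseline2]
        K, dH, b1, b2 = p
        return np.concatenate([model(x, K, dH, b1) - data1,
                               model(x, K, dH, b2) - data2])

    fit = least_squares(residuals, x0=[1.0, -5.0, 0.0, 0.0])
    print("shared K, dH:", fit.x[:2])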

  20. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    PubMed

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high-throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit the quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
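
    As a sketch of the core quantitative step (integrating a transition's ion intensity over retention time for a light/heavy pair), a NumPy example on synthetic chromatograms; the arrays are placeholders, not MRMer output.

    # Integrate one precursor-product transition over retention time for a light/heavy pair.
    import numpy as np

    rt = np.linspace(20.0, 21.0, 60)                           # retention time (min)
    light = 1e5 * np.exp(-((rt - 20.5) ** 2) / 0.002)          # toy chromatographic peaks
    heavy = 4e4 * np.exp(-((rt - 20.5) ** 2) / 0.002)

    # Trapezoidal integration gives the integrated ion intensity of each transition.
    area_light = np.sum((light[:-1] + light[1:]) * np.diff(rt)) / 2.0
    area_heavy = np.sum((heavy[:-1] + heavy[1:]) * np.diff(rt)) / 2.0
    print("light/heavy ratio:", area_light / area_heavy)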

  1. ProteoWizard: open source software for rapid proteomics tools development.

    PubMed

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  2. Mobile service for open data visualization on geo-based images

    NASA Astrophysics Data System (ADS)

    Lee, Kiwon; Kim, Kwangseob; Kang, Sanggoo

    2015-12-01

    Since the early 2010s, governments in most countries have adopted and promoted open data policies and open data platforms. Korea is no exception: government and public organizations have operated publicly accessible open data portals since 2011, and the number of open data sets and data types has been increasing every year. These trends extend naturally to mobile environments. The purpose of this study is to design and implement a mobile application service that visualizes public open data of various types and formats together with geo-based images on the mobile web. Open data here covers downloadable data sets and openly accessible data application programming interfaces (APIs). Geo-based images are multi-sensor satellite images that are geo-referenced and matched with digital map sets. The system components for the mobile service are based entirely on open-source software and open development environments, without any commercial tools: PostgreSQL for database management, OTB for remote sensing image processing, GDAL for data conversion, GeoServer as the application server, OpenLayers for mobile web mapping, R for data analysis and D3.js for web-based data graphics. The client-side mobile application was implemented using HTML5 for cross-browser and cross-platform support. The result demonstrates several advantages, such as linking open data with geo-based data, integrating open data with open source, and delivering mobile applications built on open data. This approach is expected to be a cost-effective and process-efficient implementation strategy for intelligent Earth observation data.

  3. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture systems data. However, few software packages can visualize and modify the integrality of the data in the C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis systems using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through C++ API, bindings for high-level languages (Matlab, Octave, and Python), and standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
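
    A minimal sketch of reading a C3D acquisition through BTK's Python bindings, assuming the documented btkAcquisitionFileReader interface; the file and marker names are placeholders.

    # Read marker trajectories from a C3D file via BTK's Python bindings.
    import btk

    reader = btk.btkAcquisitionFileReader()
    reader.SetFilename("gait_trial.c3d")        # placeholder file name
    reader.Update()
    acq = reader.GetOutput()

    print("point frequency (Hz):", acq.GetPointFrequency())
    marker = acq.GetPoint("LASI")               # placeholder marker label
    print("first frame XYZ:", marker.GetValues()[0])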

  4. Psynteract: A flexible, cross-platform, open framework for interactive experiments.

    PubMed

    Henninger, Felix; Kieslich, Pascal J; Hilbig, Benjamin E

    2017-10-01

    We introduce a novel platform for interactive studies, that is, any form of study in which participants' experiences depend not only on their own responses, but also on those of other participants who complete the same study in parallel, for example a prisoner's dilemma or an ultimatum game. The software thus especially serves the rapidly growing field of strategic interaction research within psychology and behavioral economics. In contrast to all available software packages, our platform does not handle stimulus display and response collection itself. Instead, we provide a mechanism to extend existing experimental software to incorporate interactive functionality. This approach allows us to draw upon the capabilities already available, such as accuracy of temporal measurement, integration with auxiliary hardware such as eye-trackers or (neuro-)physiological apparatus, and recent advances in experimental software, for example capturing response dynamics through mouse-tracking. Through integration with OpenSesame, an open-source graphical experiment builder, studies can be assembled via a drag-and-drop interface requiring little or no further programming skills. In addition, by using the same communication mechanism across software packages, we also enable interoperability between systems. Our source code, which provides support for all major operating systems and several popular experimental packages, can be freely used and distributed under an open source license. The communication protocols underlying its functionality are also well documented and easily adapted to further platforms. Code and documentation are available at https://github.com/psynteract/ .

  5. Visualization of Vgi Data Through the New NASA Web World Wind Virtual Globe

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Kilsedar, C. E.; Zamboni, G.

    2016-06-01

    GeoWeb 2.0, which laid the foundations of Volunteered Geographic Information (VGI) systems, has led to platforms where users can contribute to openly accessible geographic knowledge. Moreover, advances in 3D visualization have produced virtual globes able to display geographic data directly in the browser. However, the integration of VGI systems and virtual globes has not been fully realized. The study presented aims to visualize volunteered data in 3D using Free and Open Source Software (FOSS), while also considering ease of use for the general public. NASA's new Application Programming Interface (API), Web World Wind, written in JavaScript and based on the Web Graphics Library (WebGL), is cross-platform and cross-browser, so a virtual globe created with this API is accessible through any WebGL-supported browser on different operating systems and devices. No installation or configuration is required on the client side, which makes the collected data more usable; this is not the case with World Wind for Java, where installation and configuration of the Java Virtual Machine (JVM) is required. Furthermore, the data collected through various VGI platforms may come in different formats and be stored in a traditional relational database or in a NoSQL database. The project developed aims to visualize and query data collected through the Open Data Kit (ODK) platform and a cross-platform application, with data stored in relational PostgreSQL and NoSQL CouchDB databases, respectively.

  6. Analyzing huge pathology images with open source software.

    PubMed

    Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc

    2013-06-06

    Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides build up a technical challenge since the images occupy often several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail at treating them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here:http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
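
    To illustrate the mosaic operation (dividing a huge image into small tiles, with or without overlap), a NumPy-only sketch; real whole-slide files would first need a tiled reader such as the one the NDPITools provide.

    # Split a large image array into fixed-size tiles (a simple mosaic).
    import numpy as np

    def make_tiles(image, tile=1024, overlap=0):
        """Yield (row, col, tile_array) pieces covering the whole image."""
        step = tile - overlap
        for r in range(0, image.shape[0], step):
            for c in range(0, image.shape[1], step):
                yield r, c, image[r:r + tile, c:c + tile]

    region = np.zeros((4096, 6144), dtype=np.uint8)   # stand-in for a decoded slide region
    print(sum(1 for _ in make_tiles(region, tile=1024)), "tiles")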

  7. Analyzing huge pathology images with open source software

    PubMed Central

    2013-01-01

    Background Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides build up a technical challenge since the images occupy often several gigabytes and cannot be fully opened in a computer’s memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail at treating them, and the others require expensive hardware while still being prohibitively slow. Results We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Conclusions Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272 PMID:23829479

  8. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  9. Xi-cam: a versatile interface for data visualization and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  10. Xi-cam: a versatile interface for data visualization and analysis

    DOE PAGES

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...

    2018-05-31

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  11. What Can OpenEI Do For You?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-12-10

    Open Energy Information (OpenEI) is an open source web platform—similar to the one used by Wikipedia—developed by the US Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to make the large amounts of energy-related data and information more easily searched, accessed, and used both by people and automated machine processes. Built utilizing the standards and practices of the Linked Open Data community, the OpenEI platform is much more robust and powerful than typical web sites and databases. As an open platform, all users can search, edit, add, and access data in OpenEI for free. The user community contributes the content and ensures its accuracy and relevance; as the community expands, so does the content's comprehensiveness and quality. The data are structured and tagged with descriptors to enable cross-linking among related data sets, advanced search functionality, and consistent, usable formatting. Data input protocols and quality standards help ensure the content is structured and described properly and derived from a credible source. Although DOE/NREL is developing OpenEI and seeding it with initial data, it is designed to become a true community model with millions of users, a large core of active contributors, and numerous sponsors.

  12. CROPPER: a metagene creator resource for cross-platform and cross-species compendium studies.

    PubMed

    Paananen, Jussi; Storvik, Markus; Wong, Garry

    2006-09-22

    Current genomic research methods provide researchers with enormous amounts of data. Combining data from different high-throughput research technologies commonly available in biological databases can lead to novel findings and increase research efficiency. However, combining data from different heterogeneous sources is often a very arduous task. These sources can be different microarray technology platforms, genomic databases, or experiments performed on various species. Our aim was to develop a software program that could facilitate the combining of data from heterogeneous sources, and thus allow researchers to perform genomic cross-platform/cross-species studies and to use existing experimental data for compendium studies. We have developed a web-based software resource, called CROPPER that uses the latest genomic information concerning different data identifiers and orthologous genes from the Ensembl database. CROPPER can be used to combine genomic data from different heterogeneous sources, allowing researchers to perform cross-platform/cross-species compendium studies without the need for complex computational tools or the requirement of setting up one's own in-house database. We also present an example of a simple cross-platform/cross-species compendium study based on publicly available Parkinson's disease data derived from different sources. CROPPER is a user-friendly and freely available web-based software resource that can be successfully used for cross-species/cross-platform compendium studies.

  13. Open-Source 3-D Platform for Low-Cost Scientific Instrument Ecosystem.

    PubMed

    Zhang, C; Wijnen, B; Pearce, J M

    2016-08-01

    The combination of open-source software and hardware provides technically feasible methods to create low-cost, highly customized scientific research equipment. Open-source 3-D printers have proven useful for fabricating scientific tools. Here the capabilities of an open-source 3-D printer are expanded to become a highly flexible scientific platform. An automated low-cost 3-D motion control platform is presented that has the capacity to perform scientific applications, including (1) 3-D printing of scientific hardware; (2) laboratory auto-stirring, measuring, and probing; (3) automated fluid handling; and (4) shaking and mixing. The open-source 3-D platform not only facilitates routine research while radically reducing its cost, but also inspires the creation of a diverse array of custom instruments that can be shared and replicated digitally throughout the world to drive down the cost of research and education further. © 2016 Society for Laboratory Automation and Screening.
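
    Such 3-D motion platforms are typically driven by streaming G-code over a serial link; a hedged pySerial sketch in which the port name, baud rate and coordinates are placeholders for a typical RepRap-style controller rather than this platform's exact firmware.

    # Stream a few G-code moves to a 3-D motion-control board over serial.
    import time
    import serial   # pySerial

    with serial.Serial("/dev/ttyUSB0", 115200, timeout=2) as port:   # placeholder port
        time.sleep(2)                          # allow the controller to reset
        for cmd in (b"G28\n",                  # home all axes
                    b"G1 X50 Y20 Z5 F1500\n",  # move above a sample position
                    b"G1 Z0 F300\n"):          # lower the tool/probe
            port.write(cmd)
            print(port.readline().decode(errors="replace").strip())   # e.g. "ok"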

  14. Sirepo for Synchrotron Radiation Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagler, Robert; Moeller, Paul; Rakitin, Maksim

    Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5 compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is the Synchrotron Radiation Workshop (SRW). SRW computes synchrotron radiation from relativistic electrons in arbitrary magnetic fields and propagates the radiation wavefronts through optical beamlines. SRW is open source and is primarily supported by Dr. Oleg Chubar of NSLS-II at Brookhaven National Laboratory.
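
    A minimal sketch of the kind of lightweight Flask service described here: a JSON endpoint that would hand a simulation request off to a worker. Illustrative only; these are not Sirepo's actual routes.

    # Tiny Flask service accepting a simulation request as JSON.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/run-simulation", methods=["POST"])
    def run_simulation():
        params = request.get_json(force=True)      # e.g. beamline settings
        # A real server would queue this for an SRW run inside a container.
        return jsonify({"status": "queued", "received": params})

    if __name__ == "__main__":
        app.run(port=8000)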

  15. Open source hardware and software platform for robotics and artificial intelligence applications

    NASA Astrophysics Data System (ADS)

    Liang, S. Ng; Tan, K. O.; Lai Clement, T. H.; Ng, S. K.; Mohammed, A. H. Ali; Mailah, Musa; Azhar Yussof, Wan; Hamedon, Zamzuri; Yussof, Zulkifli

    2016-02-01

    Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV etc.) have enabled rapid development of previously expensive and sophisticated system within a lower budget and flatter learning curves for developers. Using these platform, we designed and developed a Java-based 3D robotic simulation system, with graph database, which is integrated in online and offline modes with an Android-Arduino based rubbish picking remote control car. The combination of the open source hardware and software system created a flexible and expandable platform for further developments in the future, both in the software and hardware areas, in particular in combination with graph database for artificial intelligence, as well as more sophisticated hardware, such as legged or humanoid robots.

  16. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java* for EDUCATION

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Kuehnel, F.

    2006-12-01

    NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it education, science, research, business, or government. The benefits to understanding for information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curriculum. The National Guard uses World Wind for emergency response activities and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  17. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java*

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Coughlan, J.

    2006-12-01

    NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it education, science, research, business, or government. The benefits to understanding for information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curriculum. The National Guard uses World Wind for emergency response activities and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  18. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package, WavePy, intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting in the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer the potential for substantial efficiency gains compared to a fully featured workstation.
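
    A small timing sketch of the substitution the record reports, comparing NumPy's FFT with OpenCV's DFT on the same 2-D field; the array size and timings are illustrative, not the paper's benchmark.

    # Compare numpy.fft.fft2 with cv2.dft on one 2-D field.
    import time
    import numpy as np
    import cv2

    field = np.random.rand(2048, 2048).astype(np.float32)

    t0 = time.perf_counter()
    np_spec = np.fft.fft2(field)
    t1 = time.perf_counter()
    cv_spec = cv2.dft(field, flags=cv2.DFT_COMPLEX_OUTPUT)   # real/imag as two channels
    t2 = time.perf_counter()

    print(f"numpy.fft.fft2: {t1 - t0:.3f} s, cv2.dft: {t2 - t1:.3f} s")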

  19. DataViewer3D: An Open-Source, Cross-Platform Multi-Modal Neuroimaging Data Visualization Tool

    PubMed Central

    Gouws, André; Woods, Will; Millman, Rebecca; Morland, Antony; Green, Gary

    2008-01-01

    Integration and display of results from multiple neuroimaging modalities [e.g. magnetic resonance imaging (MRI), magnetoencephalography, EEG] relies on display of a diverse range of data within a common, defined coordinate frame. DataViewer3D (DV3D) is a multi-modal imaging data visualization tool offering a cross-platform, open-source solution to simultaneous data overlay visualization requirements of imaging studies. While DV3D is primarily a visualization tool, the package allows an analysis approach where results from one imaging modality can guide comparative analysis of another modality in a single coordinate space. DV3D is built on Python, a dynamic object-oriented programming language with support for integration of modular toolkits, and development of cross-platform software for neuroimaging. DV3D harnesses the power of the Visualization Toolkit (VTK) for two-dimensional (2D) and 3D rendering, calling VTK's low level C++ functions from Python. Users interact with data via an intuitive interface that uses Python to bind wxWidgets, which in turn calls the user's operating system dialogs and graphical user interface tools. DV3D currently supports NIfTI-1, ANALYZE™ and DICOM formats for MRI data display (including statistical data overlay). Formats for other data types are supported. The modularity of DV3D and ease of use of Python allows rapid integration of additional format support and user development. DV3D has been tested on Mac OSX, RedHat Linux and Microsoft Windows XP. DV3D is offered for free download with an extensive set of tutorial resources and example data. PMID:19352444
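
    A minimal example of the VTK-from-Python rendering pipeline (source, mapper, actor, render window) that DV3D builds on; it renders a plain cone rather than neuroimaging data.

    # Smallest useful VTK pipeline driven from Python.
    import vtk

    cone = vtk.vtkConeSource()
    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(cone.GetOutputPort())
    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)
    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)
    interactor = vtk.vtkRenderWindowInteractor()
    interactor.SetRenderWindow(window)

    window.Render()
    interactor.Start()    # opens an interactive 3-D view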

  20. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    The proposed validation goal of 0.9 for the intra-class correlation coefficient was reached in this study; with these results, we consider the developed software (RombergLab) to be validated balance assessment software. The reliability of this software depends on the technical specifications of the force platform used. The objective was to develop and validate posturography software and share its source code in open source terms. In a prospective, non-randomized validation study, 20 consecutive adults underwent two balance assessments: six-condition posturography was performed using clinically approved software and a force platform, and the same conditions were measured using the newly developed open-source software with a low-cost force platform. The intra-class correlation of the sway area obtained from the center-of-pressure variations on both devices across the six conditions was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94), and a Bland and Altman concordance plot was also obtained. The source code used to develop RombergLab was published in open source terms.
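
    One common way to compute a sway area from the centre-of-pressure (COP) trajectory is the area of its convex hull; a SciPy sketch on synthetic COP samples, noting that RombergLab's own sway-area definition may differ.

    # Sway area as the convex-hull area of a 2-D centre-of-pressure trajectory.
    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(1)
    cop = rng.normal(0.0, 5.0, size=(3000, 2))    # synthetic COP samples (mm)

    hull = ConvexHull(cop)
    print("sway area (mm^2):", hull.volume)       # for 2-D points, .volume is the enclosed area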

  1. A Geospatial Information Grid Framework for Geological Survey.

    PubMed

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper.

  2. A Geospatial Information Grid Framework for Geological Survey

    PubMed Central

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper. PMID:26710255

  3. Sirepo - Warp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagler, Robert; Moeller, Paul

    Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5 compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is Warp. Warp is a particle-in-cell (PIC) code designed to simulate high-intensity charged particle beams and plasmas in both the electrostatic and electromagnetic regimes, with a wide variety of integrated physics models and diagnostics. At present, Sirepo supports a small subset of Warp's capabilities. Warp is open source and is part of the Berkeley Lab Accelerator Simulation Toolkit.

  4. DStat: A Versatile, Open-Source Potentiostat for Electroanalysis and Integration.

    PubMed

    Dryden, Michael D M; Wheeler, Aaron R

    2015-01-01

    Most electroanalytical techniques require the precise control of the potentials in an electrochemical cell using a potentiostat. Commercial potentiostats function as "black boxes," giving limited information about their circuitry and behaviour which can make development of new measurement techniques and integration with other instruments challenging. Recently, a number of lab-built potentiostats have emerged with various design goals including low manufacturing cost and field-portability, but notably lacking is an accessible potentiostat designed for general lab use, focusing on measurement quality combined with ease of use and versatility. To fill this gap, we introduce DStat (http://microfluidics.utoronto.ca/dstat), an open-source, general-purpose potentiostat for use alone or integrated with other instruments. DStat offers picoampere current measurement capabilities, a compact USB-powered design, and user-friendly cross-platform software. DStat is easy and inexpensive to build, may be modified freely, and achieves good performance at low current levels not accessible to other lab-built instruments. In head-to-head tests, DStat's voltammetric measurements are much more sensitive than those of "CheapStat" (a popular open-source potentiostat described previously), and are comparable to those of a compact commercial "black box" potentiostat. Likewise, in head-to-head tests, DStat's potentiometric precision is similar to that of a commercial pH meter. Most importantly, the versatility of DStat was demonstrated through integration with the open-source DropBot digital microfluidics platform. In sum, we propose that DStat is a valuable contribution to the "open source" movement in analytical science, which is allowing users to adapt their tools to their experiments rather than alter their experiments to be compatible with their tools.

  5. OpenStereo: Open Source, Cross-Platform Software for Structural Geology Analysis

    NASA Astrophysics Data System (ADS)

    Grohmann, C. H.; Campanha, G. A.

    2010-12-01

    Free and open source software (FOSS) is increasingly seen as a synonym of innovation and progress. The freedom to run, copy, distribute, study, change and improve the software (through access to the source code) assures a high level of positive feedback between users and developers, which results in stable, secure and constantly updated systems. Several software packages for structural geology analysis are available to the user, either under commercial licenses or downloadable at no cost from the Internet. Some provide basic tools of stereographic projections such as plotting poles, great circles, density contouring, eigenvector analysis, data rotation etc., while others perform more specific tasks, such as paleostress or geotechnical/rock stability analysis. This variety also means a wide range of data formatting for input, Graphical User Interface (GUI) design and graphic export formats. The majority of packages are built for MS-Windows and, even though there are packages for the UNIX-based MacOS, there are no native packages for *nix (UNIX, Linux, BSD etc.) Operating Systems (OS), forcing users to run these programs with emulators or virtual machines. Those limitations led us to develop OpenStereo, an open source, cross-platform software for stereographic projections and structural geology. The software is written in Python, a high-level, cross-platform programming language, and the GUI is designed with wxPython, which provides a consistent look regardless of the OS. Numeric operations (like matrix and linear algebra) are performed with the Numpy module and all graphic capabilities are provided by the Matplotlib library, including on-screen plotting and graphic exporting to common desktop formats (emf, eps, ps, pdf, png, svg). Data input is done with simple ASCII text files, with values of dip direction and dip/plunge separated by spaces, tabs or commas. The user can open multiple files at the same time (or the same file more than once), and overlay different elements of each dataset (poles, great circles etc.). The GUI shows the opened files in a tree structure, similar to the “layers” of many illustration software packages, where the vertical order of the files in the tree reflects the drawing order of the selected elements. At this stage, the software performs plotting of poles to planes, lineations, great circles, density contours and rose diagrams. A set of statistics is calculated for each file, and its eigenvalues and eigenvectors are used to suggest whether the data are clustered about a mean value or distributed along a girdle. Modified Flinn, Triangular and histogram plots are also available. The next step of development will focus on tools such as merging and rotation of datasets, the possibility to save 'projects', and paleostress analysis. In its current state, OpenStereo requires Python, wxPython, Numpy and Matplotlib installed in the system. We recommend installing PythonXY or the Enthought Python Distribution on MS-Windows and MacOS machines, since all dependencies are provided. Most Linux distributions provide an easy way to install all dependencies through software repositories. OpenStereo is released under the GNU General Public License. Programmers willing to contribute are encouraged to contact the authors directly. FAPESP Grant #09/17675-5
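
    The eigenvalue/eigenvector test mentioned above (cluster versus girdle) can be sketched in a few lines of NumPy. The snippet below is an illustrative, Woodcock-style heuristic under a stated lower-hemisphere convention, not OpenStereo's exact implementation, and the trend/plunge values are made up.

      # Woodcock-style eigen-analysis of axial orientation data (illustrative
      # only; not OpenStereo's exact implementation).
      import numpy as np

      def orientation_eigenvalues(trend_deg, plunge_deg):
          """Return eigenvalues (descending) of the normalized orientation matrix."""
          t = np.radians(np.asarray(trend_deg, dtype=float))
          p = np.radians(np.asarray(plunge_deg, dtype=float))
          # Direction cosines, lower-hemisphere convention (z positive downward).
          vecs = np.column_stack([np.cos(p) * np.cos(t),
                                  np.cos(p) * np.sin(t),
                                  np.sin(p)])
          T = vecs.T @ vecs / len(vecs)              # orientation (scatter) matrix
          return np.sort(np.linalg.eigvalsh(T))[::-1]

      s1, s2, s3 = orientation_eigenvalues([120, 125, 118, 122, 127],
                                           [40, 38, 42, 37, 41])
      # ln(S1/S2) > ln(S2/S3) suggests a cluster; the reverse suggests a girdle.
      print("cluster" if np.log(s1 / s2) > np.log(s2 / s3) else "girdle")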

  6. CAGE IIIA Distributed Simulation Design Methodology

    DTIC Science & Technology

    2014-05-01

    2 VHF Very High Frequency VLC Video LAN Codec – an Open-source cross-platform multimedia player and framework VM Virtual Machine VOIP Voice Over...Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are with understanding how to: • design it o define the...operation and to be available in the other nation’s simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this

  7. A novel low-cost open-hardware platform for monitoring soil water content and multiple soil-air-vegetation parameters.

    PubMed

    Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana

    2014-10-21

    Monitoring soil water content at high spatio-temporal resolution and coupled to other sensor data is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and number of sensors. The objective of this work was to design a low cost "open hardware" platform for multi-sensor measurements including water content at different depths, air and soil temperatures. The system is based on an open-source ARDUINO microcontroller-board, programmed in a simple integrated development environment (IDE). Low cost high-frequency dielectric probes were used in the platform and lab tested on three non-saline soils (ECe1: 2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method), and normalized root mean square error (NRMSE) were respectively 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R2 = 0.89) showing a high stability, being able to generate very similar RMSEs during training and validation (RMSE(training) = 2.63; RMSE(validation) = 2.61). Data recorded on the card were automatically sent to a remote server allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low cost platform, consistent with the open source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies.
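
    The leave-one-out cross-validation and (N)RMSE figures quoted above follow a standard recipe; the sketch below shows that recipe for a generic polynomial calibration curve. It is not the authors' code, and the probe readings and water contents are invented for illustration.

      # Leave-one-out cross-validation of a polynomial calibration curve
      # (illustrative recipe only; the data below are invented, not the paper's).
      import numpy as np

      def loo_rmse(x, y, deg=1):
          x, y = np.asarray(x, float), np.asarray(y, float)
          errors = []
          for i in range(len(x)):
              mask = np.arange(len(x)) != i
              coeffs = np.polyfit(x[mask], y[mask], deg)      # fit without sample i
              errors.append(np.polyval(coeffs, x[i]) - y[i])  # predict held-out sample
          rmse = np.sqrt(np.mean(np.square(errors)))
          return rmse, rmse / (y.max() - y.min())             # RMSE and range-normalized RMSE

      readings = [310, 355, 402, 448, 500, 545, 590]     # hypothetical probe output (mV)
      theta = [5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0]  # volumetric water content (%)
      print(loo_rmse(readings, theta))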

  8. A Novel Low-Cost Open-Hardware Platform for Monitoring Soil Water Content and Multiple Soil-Air-Vegetation Parameters

    PubMed Central

    Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana

    2014-01-01

    Monitoring soil water content at high spatio-temporal resolution and coupled to other sensor data is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and number of sensors. The objective of this work was to design a low cost “open hardware” platform for multi-sensor measurements including water content at different depths, air and soil temperatures. The system is based on an open-source ARDUINO microcontroller-board, programmed in a simple integrated development environment (IDE). Low cost high-frequency dielectric probes were used in the platform and lab tested on three non-saline soils (ECe1: 2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method), and normalized root mean square error (NRMSE) were respectively 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R2 = 0.89) showing a high stability, being able to generate very similar RMSEs during training and validation (RMSEtraining = 2.63; RMSEvalidation = 2.61). Data recorded on the card were automatically sent to a remote server allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low cost platform, consistent with the open source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies. PMID:25337742

  9. EPRI and Schneider Electric Demonstrate Distributed Resource Communications

    Science.gov Websites

    The Electric Power Research Institute (EPRI) is designing, building, and testing a flexible, open-source demonstration of distributed resource communications, combining the Schneider Electric ADMS, open software platforms, and an open-platform home energy management system.

  10. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
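
    As a hand-written illustration of the kind of syntax adaptation OMPC automates, the snippet below pairs a small MATLAB® fragment (in comments) with its NumPy equivalent; it is not OMPC output, and it merely highlights details such as column-major reshaping and 1-based versus 0-based indexing that such a compiler has to handle.

      # Hand-written illustration of MATLAB-to-NumPy syntax adaptation;
      # this is not OMPC output.
      #
      # MATLAB:
      #   A = reshape(1:16, 4, 4);
      #   v = A(:, 2);
      #   s = sum(v .^ 2);
      import numpy as np

      A = np.arange(1, 17).reshape(4, 4, order="F")  # column-major fill, like MATLAB
      v = A[:, 1]                                    # MATLAB is 1-based, NumPy is 0-based
      s = np.sum(v ** 2)                             # elementwise power, then sum
      print(s)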

  11. OpenChrom: a cross-platform open source software for the mass spectrometric analysis of chromatographic data.

    PubMed

    Wenig, Philip; Odermatt, Juergen

    2010-07-30

    Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which need to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between analytical results. To address this situation, a number of either commercial or non-profit software applications have been developed. These applications provide functionality to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handling chromatographic data files. The approach can be extended with functionality such as facilities to detect baselines; to detect, integrate and identify peaks; and to compare mass spectra, as well as the ability to internationalize the application. Additionally, filters can be applied to the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations like do, undo and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms. It offers a customizable graphical user interface. The software is independent of the operating system because the underlying Rich Client Platform is written in Java. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions: they can be published under open source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.

  12. Open-Source Learning Management System and Web 2.0 Online Social Software Applications as Learning Platforms for an Elementary School in Singapore

    ERIC Educational Resources Information Center

    Tay, Lee Yong; Lim, Cher Ping; Lye, Sze Yee; Ng, Kay Joo; Lim, Siew Khiaw

    2011-01-01

    This paper analyses how an elementary-level future school in Singapore implements and uses various open-source online platforms, which are easily available online and could be implemented with minimal software cost, for the purpose of teaching and learning. Online platforms have the potential to facilitate students' engagement for independent and…

  13. 77 FR 33739 - Announcement of Requirements and Registration for “Health Data Platform Metadata Challenge”

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-07

    ... Information Technology. SUMMARY: As part of the HHS Open Government Plan, the HealthData.gov Platform (HDP) is... application of existing voluntary consensus standards for metadata common to all open government data, and... vocabulary recommendations for Linked Data publishers, defining cross domain semantic metadata of open...

  14. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

  15. Crossing the Virtual World Barrier with OpenAvatar

    NASA Technical Reports Server (NTRS)

    Joy, Bruce; Kavle, Lori; Tan, Ian

    2012-01-01

    There are multiple standards and formats for 3D models in virtual environments. The problem is that there is no open source platform for generating models out of discrete parts; this results in the process of having to "reinvent the wheel" when new games, virtual worlds and simulations want to enable their users to create their own avatars or easily customize in-world objects. OpenAvatar is designed to provide a framework to allow artists and programmers to create reusable assets which can be used by end users to generate vast numbers of complete models that are unique and functional. OpenAvatar serves as a framework which facilitates the modularization of 3D models allowing parts to be interchanged within a set of logical constraints.

  16. Virtual Observer Controller (VOC) for Small Unit Infantry Laser Simulation Training

    DTIC Science & Technology

    2007-04-01

    per-seat license when deployed. As a result, ViaVoice was abandoned early in development. Next, the SPHINX engine from Carnegie Mellon University was...examined. Sphinx is Java-based software, providing cross-platform functionality, and it is also free, open-source software. Software developers at...IST had experience using SPHINX, so it was initially selected to be the VOC speech engine. After implementing a small portion of the VOC grammar

  17. Study of Adversarial and Defensive Components in an Experimental Machinery Control Systems Laboratory Environment

    DTIC Science & Technology

    2014-09-01

    prevention system (IPS), capable of performing real-time traffic analysis and packet logging on IP networks [25]. Snort's features include protocol... analysis and content searching/matching. Snort can detect a variety of attacks and network probes, such as buffer overflows, port scans and OS...www.digitalbond.com/tools/the-rack/jtr-s7-password-cracking/ Kismet Mike Kershaw Cross-platform Open source wireless network detector and wireless sniffer

  18. DStat: A Versatile, Open-Source Potentiostat for Electroanalysis and Integration

    PubMed Central

    Dryden, Michael D. M.; Wheeler, Aaron R.

    2015-01-01

    Most electroanalytical techniques require the precise control of the potentials in an electrochemical cell using a potentiostat. Commercial potentiostats function as “black boxes,” giving limited information about their circuitry and behaviour which can make development of new measurement techniques and integration with other instruments challenging. Recently, a number of lab-built potentiostats have emerged with various design goals including low manufacturing cost and field-portability, but notably lacking is an accessible potentiostat designed for general lab use, focusing on measurement quality combined with ease of use and versatility. To fill this gap, we introduce DStat (http://microfluidics.utoronto.ca/dstat), an open-source, general-purpose potentiostat for use alone or integrated with other instruments. DStat offers picoampere current measurement capabilities, a compact USB-powered design, and user-friendly cross-platform software. DStat is easy and inexpensive to build, may be modified freely, and achieves good performance at low current levels not accessible to other lab-built instruments. In head-to-head tests, DStat’s voltammetric measurements are much more sensitive than those of “CheapStat” (a popular open-source potentiostat described previously), and are comparable to those of a compact commercial “black box” potentiostat. Likewise, in head-to-head tests, DStat’s potentiometric precision is similar to that of a commercial pH meter. Most importantly, the versatility of DStat was demonstrated through integration with the open-source DropBot digital microfluidics platform. In sum, we propose that DStat is a valuable contribution to the “open source” movement in analytical science, which is allowing users to adapt their tools to their experiments rather than alter their experiments to be compatible with their tools. PMID:26510100

  19. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  20. Unified Framework for Development, Deployment and Robust Testing of Neuroimaging Algorithms

    PubMed Central

    Joshi, Alark; Scheinost, Dustin; Okuda, Hirohito; Belhachemi, Dominique; Murphy, Isabella; Staib, Lawrence H.; Papademetris, Xenophon

    2011-01-01

    Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across various platforms and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer assisted interventions. The framework has been both deployed at Yale and released for public use in the open source multi-platform image analysis software—BioImage Suite (bioimagesuite.org). PMID:21249532

  1. Sharing Lessons-Learned on Effective Open Data, Open-Source Practices from OpenAQ, a Global Open Air Quality Community.

    NASA Astrophysics Data System (ADS)

    Hasenkopf, C. A.

    2017-12-01

    Increasingly, open data, open-source projects are unearthing rich datasets and tools, previously impossible for more traditional avenues to generate. These projects are possible, in part, because of the emergence of online collaborative and code-sharing tools, decreasing costs of cloud-based services to fetch, store, and serve data, and increasing interest of individuals to contribute their time and skills to 'open projects.' While such projects have generated palpable enthusiasm from many sectors, many of these projects face uncharted paths for sustainability, visibility, and acceptance. Our project, OpenAQ, is an example of an open-source, open data community that is currently forging its own uncharted path. OpenAQ is an open air quality data platform that aggregates and universally formats government and research-grade air quality data from 50 countries across the world. To date, we make available more than 76 million air quality (PM2.5, PM10, SO2, NO2, O3, CO and black carbon) data points through an open Application Programming Interface (API) and a user-customizable download interface at https://openaq.org. The goal of the platform is to enable an ecosystem of users to advance air pollution efforts from science to policy to the private sector. The platform is also an open-source project (https://github.com/openaq) and has only been made possible through the coding and data contributions of individuals around the world. In our first two years of existence, we have seen requests to our API skyrocket to more than 6 million data points per month, and use cases ranging from ingesting data aggregated by our system into real-time wildfire models, to building open-source statistical packages (e.g. ropenaq and py-openaq) on top of the platform, to creating public-friendly apps and chatbots. We will share a whirlwind tour of our evolution and the many lessons learned so far related to platform structure, community engagement, organizational model, and sustainability.
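
    A typical use of the open API described above is a simple HTTP query. The sketch below uses the requests library; the endpoint path and query parameters reflect the commonly documented v1 interface and should be treated as assumptions to check against https://docs.openaq.org.

      # Hedged sketch of querying the OpenAQ API with the requests library.
      # The endpoint path and parameters follow the commonly documented v1 API
      # and are assumptions; check https://docs.openaq.org for the current form.
      import requests

      resp = requests.get(
          "https://api.openaq.org/v1/measurements",
          params={"parameter": "pm25", "country": "NP", "limit": 5},
          timeout=30,
      )
      resp.raise_for_status()
      for rec in resp.json().get("results", []):
          print(rec.get("location"), rec.get("value"), rec.get("unit"), rec.get("date"))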

  2. Social network of PESCA (Open Source Platform for eHealth).

    PubMed

    Sanchez, Carlos L; Romero-Cuevas, Miguel; Lopez, Diego M; Lorca, Julio; Alcazar, Francisco J; Ruiz, Sergio; Mercado, Carmen; Garcia-Fortea, Pedro

    2008-01-01

    Information and Communication Technologies (ICTs) are revolutionizing how healthcare systems deliver top-quality care to citizens. Open Source Software (OSS) has proven to be an important strategy for spreading the use of ICTs. Several human and technological barriers to adopting OSS for healthcare have been identified. Human barriers include user acceptance, limited support, technical skill, awareness, resistance to change, etc., while technological barriers include the need for open standards, heterogeneous OSS developed without normalization and metrics, the lack of initiatives to evaluate existing health OSS, and the need for quality control and functional validation. The goals of the PESCA project are to create a platform of interoperable modules to evaluate, classify and validate good practices in health OSS. Furthermore, a normalization platform will provide interoperable solutions in the fields of healthcare services, health surveillance, health literature, and health education, knowledge and research. Within the platform, the first goal to achieve is the setup of the collaborative work infrastructure. The platform is being organized as a social network that evaluates five aspects of every existing open source tool for eHealth: open source software, quality, pedagogy, security and privacy, and internationalization (I18N). In the meantime, the knowledge collected from the network will build a Good Practice Repository on eHealth, promoting the effective use of ICT on behalf of citizens' health.

  3. NucliTrack: an integrated nuclei tracking application.

    PubMed

    Cooper, Sam; Barr, Alexis R; Glen, Robert; Bakal, Chris

    2017-10-15

    Live imaging studies give unparalleled insight into dynamic single cell behaviours and fate decisions. However, the challenge of reliably tracking single cells over long periods of time limits both the throughput and ease with which such studies can be performed. Here, we present NucliTrack, a cross platform solution for automatically segmenting, tracking and extracting features from fluorescently labelled nuclei. NucliTrack performs similarly to other state-of-the-art cell tracking algorithms, but NucliTrack's interactive, graphical interface makes it significantly more user friendly. NucliTrack is available as a free, cross platform application and open source Python package. Installation details and documentation are at: http://nuclitrack.readthedocs.io/en/latest/ A video guide can be viewed online: https://www.youtube.com/watch?v=J6e0D9F-qSU Source code is available through Github: https://github.com/samocooper/nuclitrack. A Matlab toolbox is also available at: https://uk.mathworks.com/matlabcentral/fileexchange/61479-samocooper-nuclitrack-matlab. sam@socooper.com. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  4. NucliTrack: an integrated nuclei tracking application

    PubMed Central

    Cooper, Sam; Barr, Alexis R.; Glen, Robert; Bakal, Chris

    2017-01-01

    Summary: Live imaging studies give unparalleled insight into dynamic single cell behaviours and fate decisions. However, the challenge of reliably tracking single cells over long periods of time limits both the throughput and ease with which such studies can be performed. Here, we present NucliTrack, a cross platform solution for automatically segmenting, tracking and extracting features from fluorescently labelled nuclei. NucliTrack performs similarly to other state-of-the-art cell tracking algorithms, but NucliTrack’s interactive, graphical interface makes it significantly more user friendly. Availability and implementation: NucliTrack is available as a free, cross platform application and open source Python package. Installation details and documentation are at: http://nuclitrack.readthedocs.io/en/latest/ A video guide can be viewed online: https://www.youtube.com/watch?v=J6e0D9F-qSU Source code is available through Github: https://github.com/samocooper/nuclitrack. A Matlab toolbox is also available at: https://uk.mathworks.com/matlabcentral/fileexchange/61479-samocooper-nuclitrack-matlab. Contact: sam@socooper.com. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28637183

  5. Aerostat-Lofted Instrument Platform and Sampling Method for Determination of Emissions from Open Area Sources

    EPA Science Inventory

    Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...

  6. GeolOkit 1.0: a new Open Source, Cross-Platform software for geological data visualization in Google Earth environment

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Antoine; Bastin, Christophe; Watlet, Arnaud

    2016-04-01

    GIS software suites are today's essential tools to gather and visualise geological data, to apply spatial and temporal analysis and, ultimately, to create and share interactive maps for further geoscience investigations. For these purposes, we developed GeolOkit: an open-source, freeware and lightweight software package, written in Python, a high-level, cross-platform programming language. GeolOkit is accessible through a graphical user interface designed to run in parallel with Google Earth. It is a user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to plot these into the Google Earth environment using KML code. This workflow requires no third-party software except Google Earth itself. GeolOkit comes with a large number of geoscience labels, symbols, colours and placemarks and may process: (i) multi-point data, (ii) contours via several interpolation methods, (iii) discrete planar and linear structural data in 2D or 3D, supporting a large range of structural input formats, (iv) clustered stereonets and rose diagrams, (v) drawn cross-sections as vertical sections, (vi) georeferenced maps and vectors, and (vii) field pictures, using either geo-tracking metadata from a camera's built-in GPS module or the same-day track of an external GPS. We invite you to discover all the functionalities of the GeolOkit software. As this project is under development, we welcome discussion of your needs, your ideas and your contributions to the GeolOkit project.
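
    The core idea described above, turning a field measurement into a KML placemark that Google Earth can display, can be sketched as follows. This is not GeolOkit's code; the simplekml package, the coordinates and the measurement are used purely for illustration.

      # Emit a KML placemark for a single structural measurement (illustrative
      # only; not GeolOkit's code). Coordinates and the measurement are made up.
      import simplekml

      kml = simplekml.Kml()
      pnt = kml.newpoint(
          name="Bedding 120/35",            # dip direction / dip
          coords=[(4.870, 50.450)],         # (longitude, latitude) of the outcrop
      )
      pnt.description = "Planar structure: dip direction 120, dip 35"
      kml.save("structures.kml")            # open this file in Google Earth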

  7. Anser EMT: the first open-source electromagnetic tracking platform for image-guided interventions.

    PubMed

    Jaeger, Herman Alexander; Franz, Alfred Michael; O'Donoghue, Kilian; Seitel, Alexander; Trauzettel, Fabian; Maier-Hein, Lena; Cantillon-Murphy, Pádraig

    2017-06-01

    Electromagnetic tracking is the gold standard for instrument tracking and navigation in the clinical setting without line of sight. Whilst clinical platforms exist for interventional bronchoscopy and neurosurgical navigation, the limited flexibility and high costs of electromagnetic tracking (EMT) systems for research investigations mitigate against a better understanding of the technology's characterisation and limitations. The Anser project provides an open-source implementation for EMT with particular application to image-guided interventions. This work provides implementation schematics for our previously reported EMT system which relies on low-cost acquisition and demodulation techniques using both National Instruments and Arduino hardware alongside MATLAB support code. The system performance is objectively compared to other commercial tracking platforms using the Hummel assessment protocol. Positional accuracy of 1.14 mm and angular rotation accuracy of [Formula: see text] are reported. Like other EMT platforms, Anser is susceptible to tracking errors due to eddy current and ferromagnetic distortion. The system is compatible with commercially available EMT sensors as well as the Open Network Interface for image-guided therapy (OpenIGTLink) for easy communication with visualisation and medical imaging toolkits such as MITK and 3D Slicer. By providing an open-source platform for research investigations, we believe that novel and collaborative approaches can overcome the limitations of current EMT technology.

  8. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological-processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines, developed by different teams. In order to support collaborative works, involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing to i) couple models developed de novo or from existing source code, and which are dynamically plugged to the platform, ii) represent landscapes as hierarchical graphs, taking into account multi-scale, spatial heterogeneities and landscape objects connectivity, iii) run and explore simulations in many ways : using the OpenFLUID software interfaces for users (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, which is based on an open source toolchain, including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing to plug existing models licensed under any license. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, modelling of surface-subsurface water exchanges, … At LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, which is a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID web site : http://www.openfluid-project.org

  9. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    PubMed Central

    O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R

    2008-01-01

    Background: Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results: Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion: Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers. PMID:18328109
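
    A short usage sketch of the workflow described above (reading a molecule, computing a fingerprint, iterating over a file) is given below; it assumes Open Babel and its Python bindings are installed, and the SMILES string and file name are arbitrary examples.

      # Read a molecule, compute a fingerprint, and iterate over a file with
      # Pybel. Requires Open Babel with its Python bindings; in recent
      # releases the import is `from openbabel import pybel`.
      import pybel

      mol = pybel.readstring("smi", "CCO")      # ethanol from a SMILES string
      print(mol.molwt)                          # molecular weight
      fp = mol.calcfp()                         # default fingerprint
      print(fp.bits[:10])                       # first few set bits

      # Looping over every molecule in a (hypothetical) SDF file:
      # for m in pybel.readfile("sdf", "library.sdf"):
      #     print(m.title, m.calcfp() | fp)     # '|' gives the Tanimoto coefficient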

  10. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit.

    PubMed

    O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R

    2008-03-09

    Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.

  11. The Open Microscopy Environment: open image informatics for the biological sciences

    NASA Astrophysics Data System (ADS)

    Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.

    2016-07-01

    Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).
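
    Python access to an OMERO server goes through the BlitzGateway wrapper in omero-py; the sketch below shows a minimal connect-and-browse session. The host, credentials and the presence of Project containers are hypothetical, and the calls should be checked against the installed omero-py version.

      # Minimal connect-and-browse session against an OMERO server via the
      # omero-py BlitzGateway wrapper. Host, credentials and the presence of
      # Project containers are hypothetical; verify calls against your version.
      from omero.gateway import BlitzGateway

      conn = BlitzGateway("username", "password", host="omero.example.org", port=4064)
      if conn.connect():
          try:
              for project in conn.getObjects("Project"):   # top-level containers
                  print(project.getId(), project.getName())
          finally:
              conn.close()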

  12. openECA Platform and Analytics Alpha Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.

  13. openECA Platform and Analytics Beta Demonstration Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.

  14. Testing RISKGIS Platform with Students to Improve Learning and Teaching Skills

    NASA Astrophysics Data System (ADS)

    Olyazadeh, R.; Aye, Z. C.; Jaboyedoff, M.; Derron, M. H.

    2016-12-01

    Nowadays, open-source developments in the field of natural hazards and risk management are increasing rapidly. Governments, NGOs and research institutes are producing data for risk and disaster analysis, but few platforms are available to bring a real-life experience to students. This work focuses on the preliminary results of testing a WebGIS platform called RISKGIS with bachelor students at the University of Lausanne. The platform is designed on a geospatial open-source technology called OpenGeo (Boundless). It can calculate the potential risk to buildings and assist students in understanding situations for risk reduction, mitigation and decision-making. The center of Jomsom in Nepal, which may be affected by earthquake amplification, was selected for the first exercise. The shaking intensity map was designed by an expert based on the geological characteristics and DEM (Digital Elevation Model) of the area. All building data were extracted from OpenStreetMap using QGIS and adapted to the platform. A video tutorial was prepared to guide the students through the platform; 80 students tested the application online successfully, and 40 of them participated in Moodle (a free open source software package for educators) for online feedback and a quiz. Among those, 30 completed both. We obtained interesting results for effectiveness, efficiency and satisfaction based on the System Usability Scale (SUS). The SUS score for this platform was 68.6 out of 100. The average quiz result was 9.39 out of 10, with 8 to 33 minutes needed to answer the quiz. There were several outliers in duration, at 2 minutes (two students) and 9 to 18 hours (three students). Further exercises will be carried out with students by adding more advanced functions to the platform and improving willingness to participate in this online learning platform. This project is funded by the Fonds d'innovation pédagogique de l'Université de Lausanne (FIP). We think this initial, ongoing platform can help both students and teachers to improve their skills in the field of risk and disaster management. Keywords: risk and disaster management, GIS, open source, Boundless, Moodle, teaching and learning
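
    The SUS figure quoted above (68.6 out of 100) comes from the standard System Usability Scale formula: odd items are scored as the response minus 1, even items as 5 minus the response, and the sum is multiplied by 2.5. The sketch below implements that generic formula; it is not the project's analysis code, and the example responses are made up.

      # Standard System Usability Scale scoring (generic formula, not the
      # project's analysis code); the example responses are invented.
      def sus_score(responses):
          """responses: ten Likert answers (1-5), item 1 first."""
          if len(responses) != 10:
              raise ValueError("SUS needs exactly 10 item responses")
          total = 0
          for i, r in enumerate(responses, start=1):
              total += (r - 1) if i % 2 == 1 else (5 - r)   # odd vs even items
          return total * 2.5                                # scale to 0-100

      print(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]))      # -> 80.0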

  15. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions: A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  16. MetaboLights: An Open-Access Database Repository for Metabolomics Data.

    PubMed

    Kale, Namrata S; Haug, Kenneth; Conesa, Pablo; Jayseelan, Kalaivani; Moreno, Pablo; Rocca-Serra, Philippe; Nainala, Venkata Chandrasekhar; Spicer, Rachel A; Williams, Mark; Li, Xuefei; Salek, Reza M; Griffin, Julian L; Steinbeck, Christoph

    2016-03-24

    MetaboLights is the first general purpose, open-access database repository for cross-platform and cross-species metabolomics research at the European Bioinformatics Institute (EMBL-EBI). Based upon the open-source ISA framework, MetaboLights provides Metabolomics Standard Initiative (MSI) compliant metadata and raw experimental data associated with metabolomics experiments. Users can upload their study datasets into the MetaboLights Repository. These studies are then automatically assigned a stable and unique identifier (e.g., MTBLS1) that can be used for publication reference. The MetaboLights Reference Layer associates metabolites with metabolomics studies in the archive and is extensively annotated with data fields such as structural and chemical information, NMR and MS spectra, target species, metabolic pathways, and reactions. The database is manually curated with no specific release schedules. MetaboLights is also recommended by journals for metabolomics data deposition. This unit provides a guide to using MetaboLights, downloading experimental data, and depositing metabolomics datasets using user-friendly submission tools. Copyright © 2016 John Wiley & Sons, Inc.

  17. PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments.

    PubMed

    Dalmaijer, Edwin S; Mathôt, Sebastiaan; Van der Stigchel, Stefan

    2014-12-01

    The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eyetracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and flexibility. PyGaze can be used for visual and auditory stimulus presentation; for response collection via keyboard, mouse, joystick, and other external hardware; and for the online detection of eye movements using a custom algorithm. A wide range of eyetrackers of different brands (EyeLink, SMI, and Tobii systems) are supported. The novelty of PyGaze lies in providing an easy-to-use layer on top of the many different software libraries that are required for implementing eyetracking experiments. Essentially, PyGaze is a software bridge for eyetracking research.
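
    A minimal PyGaze session, loosely following the toolbox's documented examples, is sketched below. The module paths and method names should be checked against the installed version, and running it requires a supported eye tracker (or PyGaze's dummy mode).

      # Hedged PyGaze sketch, loosely following the documented examples; module
      # paths and method names are assumptions to check against the installed
      # version, and a supported tracker (or dummy mode) is required.
      from pygaze import libtime
      from pygaze.libscreen import Display, Screen
      from pygaze.eyetracker import EyeTracker

      disp = Display()                        # opens the experiment window
      scr = Screen()
      scr.draw_fixation(fixtype='cross')      # simple fixation target

      tracker = EyeTracker(disp)              # brand/settings come from PyGaze's config
      tracker.calibrate()

      tracker.start_recording()
      disp.fill(scr)
      disp.show()
      libtime.pause(2000)                     # show the fixation cross for 2 s
      print(tracker.sample())                 # latest gaze position (x, y)
      tracker.stop_recording()

      tracker.close()
      disp.close()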

  18. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to properly calculate positioning, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate images, coordinates and camera positions; however, it is very expensive, and users cannot use the results immediately because the position information is not embedded into the images. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can then calculate positioning with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate everything into the open source GIS software Quantum GIS. In this way, a complete data collection and processing system can be constructed.
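
    The central geo-referencing step described above, associating each photo with a position from the GPS track so that coordinates can be attached to the image, can be sketched in plain Python as below. This is an illustrative sketch, not the authors' implementation; the timestamps and coordinates are invented.

      # Match each photo's capture time to the nearest GPS fix so coordinates
      # can be attached to the image (illustrative sketch; data are invented).
      from datetime import datetime

      gps_track = [  # (timestamp, latitude, longitude) logged by the microcontroller
          (datetime(2015, 6, 1, 10, 0, 0), 24.7870, 120.9970),
          (datetime(2015, 6, 1, 10, 0, 5), 24.7872, 120.9974),
          (datetime(2015, 6, 1, 10, 0, 10), 24.7875, 120.9978),
      ]
      photos = {"IMG_0001.JPG": datetime(2015, 6, 1, 10, 0, 4),
                "IMG_0002.JPG": datetime(2015, 6, 1, 10, 0, 9)}

      def nearest_fix(t):
          return min(gps_track, key=lambda fix: abs((fix[0] - t).total_seconds()))

      for name, t in photos.items():
          _, lat, lon = nearest_fix(t)
          print(f"{name}: {lat:.5f}, {lon:.5f}")   # coordinates to embed or export to GIS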

  19. mmpdb: An Open-Source Matched Molecular Pair Platform for Large Multiproperty Data Sets.

    PubMed

    Dalke, Andrew; Hert, Jérôme; Kramer, Christian

    2018-05-29

    Matched molecular pair analysis (MMPA) enables the automated and systematic compilation of medicinal chemistry rules from compound/property data sets. Here we present mmpdb, an open-source matched molecular pair (MMP) platform to create, compile, store, retrieve, and use MMP rules. mmpdb is suitable for the large data sets typically found in pharmaceutical and agrochemical companies and provides new algorithms for fragment canonicalization and stereochemistry handling. The platform is written in Python and based on the RDKit toolkit. It is freely available from https://github.com/rdkit/mmpdb .
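
    The matched-molecular-pair idea underlying mmpdb can be illustrated with a toy RDKit fragmentation: cut each acyclic single bond and record the resulting fragment pair. The sketch below is only a conceptual illustration, not mmpdb's fragmentation or canonicalization algorithm.

      # Toy matched-pair fragmentation with RDKit: cut each acyclic single bond
      # and print the resulting fragment pair (conceptual illustration only;
      # not mmpdb's fragmentation or canonicalization algorithm).
      from rdkit import Chem

      mol = Chem.MolFromSmiles("c1ccccc1CC(=O)O")           # phenylacetic acid
      for bond in mol.GetBonds():
          if bond.IsInRing() or bond.GetBondType() != Chem.BondType.SINGLE:
              continue
          cut = Chem.FragmentOnBonds(mol, [bond.GetIdx()])  # dummy atoms mark the cut
          pieces = Chem.GetMolFrags(cut, asMols=True)
          print([Chem.MolToSmiles(p) for p in pieces])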

  20. FloorspaceJS - A New, Open Source, Web-Based Geometry Editor for Building Energy Modeling (BEM): Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie

    Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.

  1. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of a planetary entry (SAPE), descent, and landing (EDL) is a multidisciplinary activity in nature. SAPE improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and interface for structural sizing.

  2. McIDAS-V: Data Analysis and Visualization for NPOESS and GOES-R

    NASA Astrophysics Data System (ADS)

    Rink, T.; Achtor, T. H.

    2009-12-01

    McIDAS-V, the next-generation McIDAS, is being built on top of a modern, cross-platform software framework that supports the development of 4-D, interactive displays and the integration of a wide array of geophysical data. As the replacement for McIDAS, its development emphasis is on future satellite observation platforms such as NPOESS and GOES-R. Data interrogation, analysis and visualization capabilities have been developed for multi- and hyper-spectral instruments like MODIS, AIRS and IASI, and are being extended for application to VIIRS and CrIS. Compatibility with GOES-R ABI level-1 and level-2 product storage formats has been demonstrated. The abstract data model, which can internalize almost any geophysical data, opens up new possibilities for data fusion techniques, for example polar and geostationary (LEO/GEO) synergy for research and validation. McIDAS-V follows an object-oriented design model, using the Java programming language, allowing specialized extensions for new sources of data and for novel displays and interactive behavior. The reference application, what the user sees on startup, can be customized, and the system has a persistence mechanism allowing sharing of the application state across the internet. McIDAS-V is open source and free to the public.

  3. Specification Patent Management for Web Application Platform Ecosystem

    NASA Astrophysics Data System (ADS)

    Fukami, Yoshiaki; Isshiki, Masao; Takeda, Hideaki; Ohmukai, Ikki; Kokuryo, Jiro

    Diversified usage of web applications has encouraged the disintegration of the web platform into separate layers for identity management and applications. Users make use of various kinds of data linked to their identity through multiple applications on social web platforms such as Facebook or MySpace, and competition has emerged among web application platforms. Platform operators can shape their relationships with developers by controlling the patents on their own specifications and by adopting open technologies developed by external organizations. They choose how far to open a specification according to its characteristics and their own market position. Patent management of specifications has therefore become a key success factor in building competitive web application platforms, yet the different ways of attracting external developers, such as standardization and open source, have not been discussed and analyzed together.

  4. An open-source framework for testing tracking devices using Lego Mindstorms

    NASA Astrophysics Data System (ADS)

    Jomier, Julien; Ibanez, Luis; Enquobahrie, Andinet; Pace, Danielle; Cleary, Kevin

    2009-02-01

    In this paper, we present an open-source framework for testing tracking devices in surgical navigation applications. At the core of image-guided intervention systems is the tracking interface that handles communication with the tracking device and gathers tracking information. Given that the correctness of tracking information is critical for protecting patient safety and for ensuring the successful execution of an intervention, the tracking software component needs to be thoroughly tested on a regular basis. Furthermore, with widespread use of the extreme programming methodology that emphasizes continuous and incremental testing of application components, testing design becomes critical. While it is easy to automate most of the testing process, it is often more difficult to test components that require manual intervention, such as a tracking device. Our framework consists of a robotic arm built from a set of Lego Mindstorms and an open-source toolkit written in C++ to control the robot movements and assess the accuracy of the tracking devices. The application program interface (API) is cross-platform and runs on Windows, Linux and MacOS. We applied this framework to the continuous testing of the Image-Guided Surgery Toolkit (IGSTK), an open-source toolkit for image-guided surgery, and have shown that regression testing on tracking devices can be performed at low cost and can significantly improve the quality of the software.

  5. HydroDesktop as a Community Designed and Developed Resource for Hydrologic Data Discovery and Analysis

    NASA Astrophysics Data System (ADS)

    Ames, D. P.

    2013-12-01

    As has been seen in other informatics fields, well-documented and appropriately licensed open source software tools have the potential to significantly increase both opportunities and motivation for inter-institutional science and technology collaboration. The CUAHSI HIS (and related HydroShare) projects have aimed to foster such activities in hydrology resulting in the development of many useful community software components including the HydroDesktop software application. HydroDesktop is an open source, GIS-based, scriptable software application for discovering data on the CUAHSI Hydrologic Information System and related resources. It includes a well-defined plugin architecture and interface to allow 3rd party developers to create extensions and add new functionality without requiring recompiling of the full source code. HydroDesktop is built in the C# programming language and uses the open source DotSpatial GIS engine for spatial data management. Capabilities include data search, discovery, download, visualization, and export. An extension that integrates the R programming language with HydroDesktop provides scripting and data automation capabilities and an OpenMI plugin provides the ability to link models. Current revision and updates to HydroDesktop include migration of core business logic to cross platform, scriptable Python code modules that can be executed in any operating system or linked into other software front-end applications.

  6. An Embedded Systems Course for Engineering Students Using Open-Source Platforms in Wireless Scenarios

    ERIC Educational Resources Information Center

    Rodriguez-Sanchez, M. C.; Torrado-Carvajal, Angel; Vaquero, Joaquin; Borromeo, Susana; Hernandez-Tamames, Juan A.

    2016-01-01

    This paper presents a case study analyzing the advantages and disadvantages of using project-based learning (PBL) combined with collaborative learning (CL) and industry best practices, integrated with information communication technologies, open-source software, and open-source hardware tools, in a specialized microcontroller and embedded systems…

  7. Next Stop: OpenSim!

    ERIC Educational Resources Information Center

    Korolov, Maria

    2011-01-01

    Unhappy with conditions in Second Life, educators are migrating to a developing virtual world that offers them greater autonomy and a safer platform for their students at far lower cost. OpenSimulator is an open source virtual world platform that schools can run for free on their own servers or can get cheaply and quickly--the space can be up and…

  8. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    PubMed

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

    Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools.

  9. Open source OCR framework using mobile devices

    NASA Astrophysics Data System (ADS)

    Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan

    2008-02-01

    Mobile phones have evolved from passive one-to-one communication devices into powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet, and much more. Exciting new social applications are emerging on the mobile landscape, such as business card readers, sign detectors, and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. However, with all these advancements we find very little open source software available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community. This includes a popular open source OCR engine named Tesseract for text detection and recognition, and the Flite speech synthesis module for adding text-to-speech ability.
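
    As a rough desktop analogue of the pipeline described above (Tesseract OCR feeding a speech synthesizer), the sketch below assumes the pytesseract wrapper and the flite command-line synthesizer are installed; it is not the authors' mobile implementation.

```python
import subprocess
from PIL import Image
import pytesseract

def read_aloud(image_path):
    # OCR step: recognize text in the image with the Tesseract engine
    text = pytesseract.image_to_string(Image.open(image_path))
    print(text)
    # speech synthesis step: hand the recognized text to Flite
    subprocess.run(["flite", "-t", text], check=True)

read_aloud("business_card.jpg")  # hypothetical input image
```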

  10. PDDL4J: a planning domain description library for java

    NASA Astrophysics Data System (ADS)

    Pellier, D.; Fiorino, H.

    2018-01-01

    PDDL4J (Planning Domain Description Library for Java) is an open source toolkit for Java cross-platform developers meant (1) to provide state-of-the-art planners based on the PDDL language, and (2) to facilitate research on new planners. In this article, we present an overview of Automated Planning concepts and languages. We present some planning systems and their most significant applications. Then, we detail the PDDL4J toolkit with an emphasis on the available informative structures, heuristics and search algorithms.

  11. Open Energy Info (OpenEI) (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2010-12-01

    The Open Energy Information (OpenEI.org) initiative is a free, open-source, knowledge-sharing platform. OpenEI was created to provide access to data, models, tools, and information that accelerate the transition to clean energy systems through informed decisions.

  12. Open-Source 3D-Printable Optics Equipment

    PubMed Central

    Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform is illustrated as a controller for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation both as research and teaching platforms than previous proprietary methods. PMID:23544104

  13. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform is illustrated as a controller for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation both as research and teaching platforms than previous proprietary methods.

  14. Maritime Analytics Prototype: Final Development Report

    DTIC Science & Technology

    2014-04-01

    ...open source authentication and access management platform OpenAM, support for multiple instances of the same type of widget, and support for installation-specific configuration files...

  15. EverVIEW: a visualization platform for hydrologic and Earth science gridded data

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig

    2015-01-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.
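
    EverVIEW itself is a Java desktop viewer; purely as an illustration of stepping through the kind of time-dependent NetCDF grids it displays, a minimal Python sketch (with a hypothetical file and variable name) might look like this:

```python
from netCDF4 import Dataset

# hypothetical file with a (time, y, x) water-stage variable
with Dataset("everglades_stage.nc") as ds:
    stage = ds.variables["stage"]
    for t in range(stage.shape[0]):
        frame = stage[t, :, :]
        print(t, float(frame.min()), float(frame.max()))
```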

  16. VitaPad: visualization tools for the analysis of pathway data.

    PubMed

    Holford, Matthew; Li, Naixin; Nadkarni, Prakash; Zhao, Hongyu

    2005-04-15

    Packages that support the creation of pathway diagrams are limited by their inability to be readily extended to new classes of pathway-related data. VitaPad is a cross-platform application that enables users to create and modify biological pathway diagrams and incorporate microarray data with them. It improves on existing software in the following areas: (i) It can create diagrams dynamically through graph layout algorithms. (ii) It is open-source and uses an open XML format to store data, allowing for easy extension or integration with other tools. (iii) It features a cutting-edge user interface with intuitive controls, high-resolution graphics and fully customizable appearance. http://bioinformatics.med.yale.edu matthew.holford@yale.edu; hongyu.zhao@yale.edu.

  17. VOLTTRON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.

    VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves in four primary roles: • A reference platform for researchers to quickly develop control applications for transactive energy. • A reference platform with flexible data store support for energy analytics applications either in academia or in commercial enterprise. • A platform from which commercial enterprise can develop products without license issues and easily integrate into their product line. • An accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy’s Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.

  18. MarDRe: efficient MapReduce-based removal of duplicate DNA reads in the cloud.

    PubMed

    Expósito, Roberto R; Veiga, Jorge; González-Domínguez, Jorge; Touriño, Juan

    2017-09-01

    This article presents MarDRe, a de novo cloud-ready duplicate and near-duplicate removal tool that can process single- and paired-end reads from FASTQ/FASTA datasets. MarDRe takes advantage of the widely adopted MapReduce programming model to fully exploit Big Data technologies on cloud-based infrastructures. Written in Java to maximize cross-platform compatibility, MarDRe is built upon the open-source Apache Hadoop project, the most popular distributed computing framework for scalable Big Data processing. On a 16-node cluster deployed on the Amazon EC2 cloud platform, MarDRe is up to 8.52 times faster than a representative state-of-the-art tool. Source code in Java and Hadoop as well as a user's guide are freely available under the GNU GPLv3 license at http://mardre.des.udc.es. rreye@udc.es.
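
    MarDRe is implemented in Java on Hadoop; the in-memory Python sketch below only illustrates the map/reduce idea it builds on: key each read by its sequence in the map phase, then keep one representative per key in the reduce phase.

```python
from collections import defaultdict

def map_phase(reads):
    # key each read by its sequence (exact-duplicate removal)
    for read_id, seq in reads:
        yield seq, read_id

def reduce_phase(pairs):
    groups = defaultdict(list)
    for key, read_id in pairs:
        groups[key].append(read_id)
    # emit one representative read per group of duplicates
    return {key: ids[0] for key, ids in groups.items()}

reads = [("r1", "ACGT"), ("r2", "ACGT"), ("r3", "TTGA")]
print(reduce_phase(map_phase(reads)))  # {'ACGT': 'r1', 'TTGA': 'r3'}
```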

  19. A case study in open source innovation: developing the Tidepool Platform for interoperability in type 1 diabetes management.

    PubMed

    Neinstein, Aaron; Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh

    2016-03-01

    Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application ("app"), Blip, to visualize the data. Tidepool's software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool's open source, cloud model for health data interoperability is applicable to other healthcare use cases.

  20. A case study in open source innovation: developing the Tidepool Platform for interoperability in type 1 diabetes management

    PubMed Central

    Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh

    2016-01-01

    Objective Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. Materials and Methods An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Results Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application (“app”), Blip, to visualize the data. Tidepool’s software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. Discussion By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. Conclusion The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool’s open source, cloud model for health data interoperability is applicable to other healthcare use cases. PMID:26338218

  1. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  2. SIG Contribution in the Making of Geotechnical Maps in Urban Areas

    NASA Astrophysics Data System (ADS)

    Monteiro, António; Pais, Luís Andrade; Rodrigues, Carlos; Carvalho, Paulo

    2017-10-01

    The use of Geographic Information Systems (GIS) has spread to several science areas, from oceanography to geotechnics. Its application in urban mapping intensified over the last century, enabling great progress through the use of geographic databases, new analysis tools and, more recently, free open source software. Geotechnical cartography struggles with the constant and large-scale reorganization of the urban environment, due to new building construction, trenching, and the drilling of sampling wells and boreholes. This generates an important volume of data larger than that of any pre-existing geological map. The main problem stems from the fact that the natural environment is covered by buildings and communication systems. The purpose of this work is to create a viable geographic information base for geotechnical mapping using a free, open source GIS program and non-traditional cartographic sources, giving preference to open platforms. QGIS was used as the software, and “Google Maps”, “Bing Maps” and “OpenStreetMap” were applied as cartographic sources through the “OpenLayers plugin” module. Finally, we also intend to identify and delimit areas by the degree of alteration and fracturing of the granite using the “Streetview” platform. The model takes as cartographic input a geological map of the study area, open cartographic web archives, and the “Streetview” platform. The output comprises several layouts, such as the intersection of topographic features (roads, borders, etc.) with the geological map and the bordering area of the Guarda urban zone. The use of these platforms reduces data collection time and, in some cases, careful observation of pictures taken during excavations may reveal important details for geological mapping in the study area.

  3. phylo-node: A molecular phylogenetic toolkit using Node.js.

    PubMed

    O'Halloran, Damien M

    2017-01-01

    Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis. To address this problem, I have developed phylo-node, a stable and scalable toolkit built with Node.js that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  4. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  5. OpenDA-WFLOW framework for improving hydrologic predictions using distributed hydrologic models

    NASA Astrophysics Data System (ADS)

    Weerts, Albrecht; Schellekens, Jaap; Kockx, Arno; Hummel, Stef

    2017-04-01

    Data assimilation (DA) holds considerable potential for improving hydrologic predictions (Liu et al., 2012) and increases the potential for early warning and/or smart water management. However, advances in hydrologic DA research have not yet been adequately or timely implemented in operational forecast systems to improve the skill of forecasts for better informed real-world decision making. The objective of this work is to highlight the development of a generic linkage of the open source OpenDA package and the open source community hydrologic modeling framework Openstreams/WFLOW, and its application in operational hydrological forecasting on various spatial scales. The coupling between OpenDA and the Openstreams/WFLOW framework is based on the emerging standard Basic Model Interface (BMI) as advocated by CSDMS, using cross-platform web services (i.e., Apache Thrift) developed by Hut et al. (2016). The potential application of OpenDA-WFLOW for operational hydrologic forecasting, including its integration with Delft-FEWS (used by more than 40 operational forecast centers around the world (Werner et al., 2013)), is demonstrated by the presented case studies. We will also highlight the possibility of giving real-time insight into the workings of the applied DA methods to support the forecaster, which Liu et al. (2012) identify as one of the burning issues.
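
    To make the BMI-based coupling concrete, the sketch below shows a conceptual assimilation loop against a toy model exposing BMI-style calls; the class, variable name, config file, and nudging rule are all invented here, and real BMI signatures differ between versions and implementations.

```python
class ToyBmiModel:
    # stand-in exposing BMI-style calls; not the wflow model itself
    def initialize(self, config): self.t, self.end, self.x = 0, 5, 0.2
    def update(self): self.t += 1; self.x *= 0.9   # fake soil-moisture decay
    def get_current_time(self): return self.t
    def get_end_time(self): return self.end
    def get_value(self, name): return self.x
    def set_value(self, name, value): self.x = value
    def finalize(self): pass

def run_with_updates(model, observations, gain=0.5):
    model.initialize("wflow_sbm.ini")              # hypothetical config file
    while model.get_current_time() < model.get_end_time():
        model.update()                             # advance one model step
        t = model.get_current_time()
        if t in observations:                      # analysis step: nudge state
            x = model.get_value("SoilMoisture")
            model.set_value("SoilMoisture", x + gain * (observations[t] - x))
    model.finalize()

run_with_updates(ToyBmiModel(), {2: 0.35, 4: 0.30})
```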

  6. Enlisting User Community Perspectives to Inform Development of a Semantic Web Application for Discovery of Cross-Institutional Research Information and Data

    NASA Astrophysics Data System (ADS)

    Johns, E. M.; Mayernik, M. S.; Boler, F. M.; Corson-Rikert, J.; Daniels, M. D.; Gross, M. B.; Khan, H.; Maull, K. E.; Rowan, L. R.; Stott, D.; Williams, S.; Krafft, D. B.

    2015-12-01

    Researchers seek information and data through a variety of avenues: published literature, search engines, repositories, colleagues, etc. In order to build a web application that leverages linked open data to enable multiple paths for information discovery, the EarthCollab project has surveyed two geoscience user communities to consider how researchers find and share scholarly output. EarthCollab, a cross-institutional, EarthCube funded project partnering UCAR, Cornell University, and UNAVCO, is employing the open-source semantic web software, VIVO, as the underlying technology to connect the people and resources of virtual research communities. This study will present an analysis of survey responses from members of the two case study communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. The survey results illustrate the types of research products that respondents indicate should be discoverable within a digital platform and the current methods used to find publications, data, personnel, tools, and instrumentation. The responses showed that scientists rely heavily on general purpose search engines, such as Google, to find information, but that data center websites and the published literature were also critical sources for finding collaborators, data, and research tools. The survey participants also identified additional features of interest for an information platform, such as search engine indexing, connection to institutional web pages, generation of bibliographies and CVs, and outward linking to social media. Through the survey, the user communities prioritized the type of information that is most important to display and describe their work within a research profile. The analysis of this survey will inform our further development of a platform that will facilitate different types of information discovery strategies, and help researchers to find and use the associated resources of a research project.

  7. Raspberry Pi-powered imaging for plant phenotyping.

    PubMed

    Tovar, Jose C; Hoyer, J Steen; Lin, Andy; Tielking, Allison; Callen, Steven T; Elizabeth Castillo, S; Miller, Michael; Tessman, Monica; Fahlgren, Noah; Carrington, James C; Nusinow, Dmitri A; Gehan, Malia A

    2018-03-01

    Image-based phenomics is a powerful approach to capture and quantify plant diversity. However, commercial platforms that make consistent image acquisition easy are often cost-prohibitive. To make high-throughput phenotyping methods more accessible, low-cost microcomputers and cameras can be used to acquire plant image data. We used low-cost Raspberry Pi computers and cameras to manage and capture plant image data. Detailed here are three different applications of Raspberry Pi-controlled imaging platforms for seed and shoot imaging. Images obtained from each platform were suitable for extracting quantifiable plant traits (e.g., shape, area, height, color) en masse using open-source image processing software such as PlantCV. This protocol describes three low-cost platforms for image acquisition that are useful for quantifying plant diversity. When coupled with open-source image processing tools, these imaging platforms provide viable low-cost solutions for incorporating high-throughput phenomics into a wide range of research programs.
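
    A minimal capture loop of the kind described above, assuming a Raspberry Pi with the legacy picamera library; the resolution, interval, and paths are placeholders, and the downstream PlantCV analysis is not shown.

```python
import time
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (1920, 1080)
for i in range(12):  # one hour of images at 5-minute intervals
    camera.capture(f"/home/pi/images/plant_{i:03d}.jpg")
    time.sleep(300)
camera.close()
```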

  8. Open Source in Education

    ERIC Educational Resources Information Center

    Lakhan, Shaheen E.; Jhunjhunwala, Kavita

    2008-01-01

    Educational institutions have rushed to put their academic resources and services online, bringing the global community onto a common platform and awakening the interest of investors. Despite continuing technical challenges, online education shows great promise. Open source software offers one approach to addressing the technical problems in…

  9. Developing an Intelligent Diagnosis and Assessment E-Learning Tool for Introductory Programming

    ERIC Educational Resources Information Center

    Huang, Chenn-Jung; Chen, Chun-Hua; Luo, Yun-Cheng; Chen, Hong-Xin; Chuang, Yi-Ta

    2008-01-01

    Recently, many open source e-learning platforms have been offered for free on the Internet. We thus incorporate the intelligent diagnosis and assessment tool into an open software e-learning platform developed for programming language courses, wherein the proposed learning diagnosis assessment tools based on text mining and machine learning…

  10. GridAPPS-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-03-28

    GridAPPS-D is an open-source, open architecture, standards based platform for development of advanced electric power system planning and operations applications. GridAPPS-D provides a documented data abstraction for the application developer, enabling creation of applications that can be run in any compliant system or platform. This enables development of applications that are platform and vendor independent, and of data-rich, data-driven applications based on the deployment of smart grid devices and systems.

  11. Xi-cam: Flexible High Throughput Data Processing for GISAXS

    NASA Astrophysics Data System (ADS)

    Pandolfi, Ronald; Kumar, Dinesh; Venkatakrishnan, Singanallur; Sarje, Abinav; Krishnan, Hari; Pellouchoud, Lenson; Ren, Fang; Fournier, Amanda; Jiang, Zhang; Tassone, Christopher; Mehta, Apurva; Sethian, James; Hexemer, Alexander

    With increasing capabilities and data demand for GISAXS beamlines, supporting software is under development to handle larger data rates, volumes, and processing needs. We aim to provide a flexible and extensible approach to GISAXS data treatment as a solution to these rising needs. Xi-cam is the CAMERA platform for data management, analysis, and visualization. The core of Xi-cam is an extensible plugin-based GUI platform which provides users an interactive interface to processing algorithms. Plugins are available for SAXS/GISAXS data and data series visualization, as well as forward modeling and simulation through HipGISAXS. With Xi-cam's advanced mode, data processing steps are designed as a graph-based workflow, which can be executed locally or remotely. Remote execution utilizes HPC or de-localized resources, allowing for effective reduction of high-throughput data. Xi-cam is open-source and cross-platform. The processing algorithms in Xi-cam include parallel cpu and gpu processing optimizations, also taking advantage of external processing packages such as pyFAI. Xi-cam is available for download online.

  12. Performance Evaluation of Block Acquisition and Tracking Algorithms Using an Open Source GPS Receiver Platform

    NASA Technical Reports Server (NTRS)

    Ramachandran, Ganesh K.; Akopian, David; Heckler, Gregory W.; Winternitz, Luke B.

    2011-01-01

    Location technologies have many applications in wireless communications, military and space missions, etc. The US Global Positioning System (GPS) and other existing and emerging Global Navigation Satellite Systems (GNSS) are expected to provide accurate location information to enable such applications. While GNSS systems perform very well in strong signal conditions, their operation in many urban, indoor, and space applications is not robust or even possible due to weak signals and strong distortions. The search for less costly, faster and more sensitive receivers is still in progress. As the research community addresses more and more complicated phenomena, there exists a demand for flexible multimode reference receivers, associated SDKs, and development platforms which may accelerate and facilitate the research. One such concept is the software GPS/GNSS receiver (GPS SDR), which permits facilitated access to algorithmic libraries and a possibility to integrate more advanced algorithms without hardware and essential software updates. The GNU-SDR and GPS-SDR open source receiver platforms are two popular examples. This paper evaluates the performance of recently proposed block-correlator techniques for acquisition and tracking of GPS signals using the open source GPS-SDR platform.
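
    The block-correlator techniques evaluated in the paper are considerably more involved; the sketch below only illustrates the standard FFT-based circular correlation at the heart of code-phase search for one Doppler bin, using random stand-in signals rather than real GPS data.

```python
import numpy as np

def code_phase_search(received, replica):
    # circular correlation via FFT: IFFT( FFT(x) * conj(FFT(c)) )
    corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(replica)))
    power = np.abs(corr) ** 2
    return int(np.argmax(power)), float(power.max())

n = 4096
replica = np.sign(np.random.randn(n))            # fake +/-1 spreading code
received = np.roll(replica, 1000) + 0.5 * np.random.randn(n)
print(code_phase_search(received, replica))      # peak near code phase 1000
```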

  13. A-Track: A new approach for detection of moving objects in FITS images

    NASA Astrophysics Data System (ADS)

    Atay, T.; Kaplan, M.; Kilic, Y.; Karapinar, N.

    2016-10-01

    We have developed a fast, open-source, cross-platform pipeline, called A-Track, for detecting the moving objects (asteroids and comets) in sequential telescope images in FITS format. The pipeline is coded in Python 3. The moving objects are detected using a modified line detection algorithm, called MILD. We tested the pipeline on astronomical data acquired by an SI-1100 CCD with a 1-meter telescope. We found that A-Track performs very well in terms of detection efficiency, stability, and processing time. The code is hosted on GitHub under the GNU GPL v3 license.
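
    A-Track's own detection relies on its MILD line-detection algorithm; as a much simpler illustration of the underlying idea, the sketch below differences two aligned FITS frames with astropy to expose pixels that changed between exposures (file names are placeholders).

```python
import numpy as np
from astropy.io import fits

frame1 = fits.getdata("field_0001.fits").astype(float)
frame2 = fits.getdata("field_0002.fits").astype(float)
diff = frame2 - frame1
threshold = diff.mean() + 5 * diff.std()
candidates = np.argwhere(diff > threshold)   # pixels that brightened between frames
print(len(candidates), "candidate moving-object pixels")
```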

  14. SolarPILOT Feature Requests and Collaboration | Concentrating Solar Power |

    Science.gov Websites

    DOE of the CSP community's needs. As of March 2018, SolarPILOT is also available as an open source project. While not every project benefits from an open source approach, several factors influenced this, but lack of availability has, in some cases, prevented widespread adoption of a common platform. Open

  15. Pyteomics--a Python framework for exploratory data analysis and rapid software prototyping in proteomics.

    PubMed

    Goloborodko, Anton A; Levitsky, Lev I; Ivanov, Mark V; Gorshkov, Mikhail V

    2013-02-01

    Pyteomics is a cross-platform, open-source Python library providing a rich set of tools for MS-based proteomics. It provides modules for reading LC-MS/MS data, search engine output, protein sequence databases, theoretical prediction of retention times, electrochemical properties of polypeptides, mass and m/z calculations, and sequence parsing. Pyteomics is available under Apache license; release versions are available at the Python Package Index http://pypi.python.org/pyteomics, the source code repository at http://hg.theorchromo.ru/pyteomics, documentation at http://packages.python.org/pyteomics. Pyteomics.biolccc documentation is available at http://packages.python.org/pyteomics.biolccc/. Questions on installation and usage can be addressed to pyteomics mailing list: pyteomics@googlegroups.com.
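
    A couple of representative Pyteomics calls of the kind listed above (mass calculation and in-silico digestion), written from the library's documented interface; exact behavior may vary with the installed version.

```python
from pyteomics import mass, parser

print(mass.calculate_mass(sequence="PEPTIDE"))   # monoisotopic peptide mass
# in-silico tryptic digestion of a toy protein sequence
print(parser.cleave("MKWVTFISLLLLFSSAYS", parser.expasy_rules["trypsin"]))
```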

  16. Command & Control in Virtual Environments: Designing a Virtual Environment for Experimentation

    DTIC Science & Technology

    2010-06-01

    proceed with the research: Second Life/OpenSim. A popular leader in the desktop virtual worlds revolution, for many Second Life has become...prototype environments and adapt them quickly within the world. OpenSim is an open-source community built upon the Second Life platform...functionality natively present in Second Life and the OpenSim platform. With the recent release of Second Life Viewer 2.0, which contains a complete

  17. A Generic Software Architecture For Prognostics

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Daigle, Matthew J.; Sankararaman, Shankar; Goebel, Kai; Watkins, Jason

    2017-01-01

    Prognostics is a systems engineering discipline focused on predicting end-of-life of components and systems. As a relatively new and emerging technology, there are few fielded implementations of prognostics, due in part to practitioners perceiving a large hurdle in developing the models, algorithms, architecture, and integration pieces. As a result, no open software frameworks for applying prognostics currently exist. This paper introduces the Generic Software Architecture for Prognostics (GSAP), an open-source, cross-platform, object-oriented software framework and support library for creating prognostics applications. GSAP was designed to make prognostics more accessible and enable faster adoption and implementation by industry, by reducing the effort and investment required to develop, test, and deploy prognostics. This paper describes the requirements, design, and testing of GSAP. Additionally, a detailed case study involving battery prognostics demonstrates its use.

  18. Online molecular image repository and analysis system: A multicenter collaborative open-source infrastructure for molecular imaging research and application.

    PubMed

    Rahman, Mahabubur; Watabe, Hiroshi

    2018-05-01

    Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are either used individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the requirement of reliable and robust image analysis algorithms, effective quality control of imaging facilities, and those related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to address these challenges related to molecular imaging, we develop a flexible, transparent, and secure infrastructure, named MIRA, which stands for Molecular Imaging Repository and Analysis, primarily using the Python programming language, and a MySQL relational database system deployed on a Linux server. MIRA is designed with a centralized image archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built. The capability of dealing with metadata, image file format normalization, and storing and viewing different types of documents and multimedia files makes MIRA considerably flexible. With features like logging, auditing, commenting, sharing, and searching, MIRA is useful as an Electronic Laboratory Notebook for effective knowledge management. In addition, the centralized approach for MIRA facilitates on-the-fly access to all its features remotely through any web browser. Furthermore, the open-source approach provides the opportunity for sustainable continued development. MIRA offers an infrastructure that can be used as a cross-boundary collaborative MI research platform for rapid advances in cancer diagnosis and therapeutics.

  19. Building a Billion Spatio-Temporal Object Search and Visualization Platform

    NASA Astrophysics Data System (ADS)

    Kakkar, D.; Lewis, B.

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.
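
    As a hedged illustration of consuming an enriched geo-tweet stream like the one described, the sketch below uses the kafka-python client; the topic name, broker address, and message fields are assumptions for the example, not the BOP's actual configuration.

```python
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "geo-tweets",                                 # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    tweet = message.value
    print(tweet.get("lat"), tweet.get("lon"), tweet.get("sentiment"))
```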

  20. An Open Source Model for Open Access Journal Publication

    PubMed Central

    Blesius, Carl R.; Williams, Michael A.; Holzbach, Ana; Huntley, Arthur C.; Chueh, Henry

    2005-01-01

    We describe an electronic journal publication infrastructure that allows a flexible publication workflow, academic exchange around different forms of user submissions, and the exchange of articles between publishers and archives using a common XML based standard. This web-based application is implemented on a freely available open source software stack. This publication demonstrates the Dermatology Online Journal's use of the platform for non-biased independent open access publication. PMID:16779183

  1. Open-source mobile digital platform for clinical trial data collection in low-resource settings.

    PubMed

    van Dam, Joris; Omondi Onyango, Kevin; Midamba, Brian; Groosman, Nele; Hooper, Norman; Spector, Jonathan; Pillai, Goonaseelan Colin; Ogutu, Bernhards

    2017-02-01

    Governments, universities and pan-African research networks are building durable infrastructure and capabilities for biomedical research in Africa. This offers the opportunity to adopt from the outset innovative approaches and technologies that would be challenging to retrofit into fully established research infrastructures such as those regularly found in high-income countries. In this context we piloted the use of a novel mobile digital health platform, designed specifically for low-resource environments, to support high-quality data collection in a clinical research study. Our primary aim was to assess the feasibility of using a mobile digital platform for clinical trial data collection in a low-resource setting. Secondarily, we sought to explore the potential benefits of such an approach. The investigative site was a research institute in Nairobi, Kenya. We integrated an open-source platform for mobile data collection commonly used in the developing world with an open-source, standard platform for electronic data capture in clinical trials. The integration was developed using common data standards (Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model), maximising the potential to extend the approach to other platforms. The system was deployed in a pharmacokinetic study involving healthy human volunteers. The electronic data collection platform successfully supported conduct of the study. Multidisciplinary users reported high levels of satisfaction with the mobile application and highlighted substantial advantages when compared with traditional paper record systems. The new system also demonstrated a potential for expediting data quality review. This pilot study demonstrated the feasibility of using a mobile digital platform for clinical research data collection in low-resource settings. Sustainable scientific capabilities and infrastructure are essential to attract and support clinical research studies. Since many research structures in Africa are being developed anew, stakeholders should consider implementing innovative technologies and approaches.

  2. AirShow 1.0 CFD Software Users' Guide

    NASA Technical Reports Server (NTRS)

    Mohler, Stanley R., Jr.

    2005-01-01

    AirShow is visualization post-processing software for Computational Fluid Dynamics (CFD). Upon reading binary PLOT3D grid and solution files into AirShow, the engineer can quickly see how hundreds of complex 3-D structured blocks are arranged and numbered. Additionally, chosen grid planes can be displayed and colored according to various aerodynamic flow quantities such as Mach number and pressure. The user may interactively rotate and translate the graphical objects using the mouse. The software source code was written in cross-platform Java, C++, and OpenGL, and runs on Unix, Linux, and Windows. The graphical user interface (GUI) was written using Java Swing. Java also provides multiple synchronized threads. The Java Native Interface (JNI) provides a bridge between the Java code and the C++ code where the PLOT3D files are read, the OpenGL graphics are rendered, and numerical calculations are performed. AirShow is easy to learn and simple to use. The source code is available for free from the NASA Technology Transfer and Partnership Office.

  3. A new tool for converting food frequency questionnaire data into nutrient and food group values: FETA research methods and availability

    PubMed Central

    Mulligan, Angela A; Luben, Robert N; Bhaniani, Amit; Parry-Smith, David J; O'Connor, Laura; Khawaja, Anthony P; Forouhi, Nita G; Khaw, Kay-Tee

    2014-01-01

    Objectives To describe the research methods for the development of a new open source, cross-platform tool which processes data from the European Prospective Investigation into Cancer and Nutrition Norfolk Food Frequency Questionnaire (EPIC-Norfolk FFQ). A further aim was to compare nutrient and food group values derived from the current tool (FETA, FFQ EPIC Tool for Analysis) with the previously validated but less accessible tool, CAFÉ (Compositional Analyses from Frequency Estimates). The effect of text matching on intake data was also investigated. Design Cross-sectional analysis of a prospective cohort study—EPIC-Norfolk. Setting East England population (city of Norwich and its surrounding small towns and rural areas). Participants Complete FFQ data from 11 250 men and 13 602 women (mean age 59 years; range 40–79 years). Outcome measures Nutrient and food group intakes derived from FETA and CAFÉ analyses of EPIC-Norfolk FFQ data. Results Nutrient outputs from FETA and CAFÉ were similar; mean (SD) energy intake from FETA was 9222 kJ (2633) in men, 8113 kJ (2296) in women, compared with CAFÉ intakes of 9175 kJ (2630) in men, 8091 kJ (2298) in women. The majority of differences resulted in one or less quintile change (98.7%). Only mean daily fruit and vegetable food group intakes were higher in women than in men (278 vs 212 and 284 vs 255 g, respectively). Quintile changes were evident for all nutrients, with the exception of alcohol, when text matching was not executed; however, only the cereals food group was affected. Conclusions FETA produces similar nutrient and food group values to the previously validated CAFÉ but has the advantages of being open source, cross-platform and complete with a data-entry form directly compatible with the software. The tool will facilitate research using the EPIC-Norfolk FFQ, and can be customised for different study populations. PMID:24674997
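
    FETA is a finished tool with its own food composition tables; the toy calculation below only illustrates the underlying arithmetic of converting FFQ responses into nutrient intakes (daily frequency × portion weight × nutrient content per gram), using made-up numbers.

```python
foods = {
    # food: (portions per day, grams per portion, kJ per gram); made-up values
    "wholemeal bread": (2.0, 36.0, 9.2),
    "apple":           (1.0, 112.0, 2.0),
}

energy_kj = sum(freq * grams * kj_per_g
                for freq, grams, kj_per_g in foods.values())
print(f"estimated daily energy intake: {energy_kj:.0f} kJ")
```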

  4. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with the development of an information system to support lunar exploration activities. It provides lunar explorers with a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry - the LMMP Portal - by employing a number of open standards and technologies. The Portal exposes a set of services to users to allow search, visualization, subset, and download of lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services made use of open standards including: Representational State Transfer (REST); and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT) - open source data catalog, archive, file management, data grid framework; openSSO - open source access management and federation platform; solr - open source enterprise search platform; redmine - open source project collaboration and management framework; GDAL - open source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet, via mobile devices such as Apple's iPad.
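
    As an example of the kind of OGC WMS request such a portal exposes, the sketch below builds a standard GetMap call with the requests library; the endpoint URL and layer name are placeholders rather than actual LMMP services.

```python
import requests

params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "LRO_WAC_Mosaic",                 # hypothetical layer name
    "CRS": "CRS:84", "BBOX": "-10,-10,10,10",   # lon/lat window
    "WIDTH": 512, "HEIGHT": 512, "FORMAT": "image/png",
}
resp = requests.get("https://example.org/lmmp/wms", params=params, timeout=30)
with open("moon_tile.png", "wb") as f:
    f.write(resp.content)
```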

  5. Optical sedimentation recorder

    DOEpatents

    Bishop, James K.B.

    2014-05-06

    A robotic optical sedimentation recorder is described for the recording of carbon flux in the oceans, wherein both POC and PIC particles are captured at the open end of a submersible sampling platform and the captured particles are allowed to drift down onto a collection plate where they can be imaged over time. The particles are imaged using three separate light sources, activated in sequence: one source being a back light, a second source being a side light to provide dark field illumination, and a third source comprising a cross polarized light source to illuminate birefringent particles. The recorder in one embodiment is attached to a buoyancy unit which is capable, upon command, of bringing the sedimentation recorder to a programmed depth below the ocean surface during recording mode, and on command returning the unit to the ocean surface for transmission of recorded data and receipt of new instructions. The combined unit is provided with its own power source and is designed to operate autonomously in the ocean for extended periods of time.

  6. OpenQuake, a platform for collaborative seismic hazard and risk assessment

    NASA Astrophysics Data System (ADS)

    Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben

    2013-04-01

    Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide; from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical trainings were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue to happen over the coming years. Other tools that are being developed of direct interest to the hazard community are: • OpenQuake Modeller; fundamental instruments for the creation of seismogenic input models for seismic hazard assessment, a critical input to the OpenQuake Engine. OpenQuake Modeller will consist of a suite of tools (Hazard Modellers Toolkit) for characterizing the seismogenic sources of earthquakes and their models of earthquakes recurrence. An earthquake catalogue homogenization tool, for integration, statistical comparison and user-defined harmonization of multiple catalogues of earthquakes is also included in the OpenQuake modeling tools. • A data capture tool for active faults; a tool that allows geologists to draw (new) fault discoveries on a map in an intuitive GIS-environment and add details on the fault through the tool. This data, once quality checked, can then be integrated with the global active faults database, which will increase in value with every new fault insertion. Building on many ongoing efforts and the knowledge of scientists worldwide, GEM will for the first time integrate state-of-the-art data, models, results and open-source tools into a single platform. The platform will continue to increase in value, in particular for use in local contexts, through contributions from and collaborations with scientists and organisations worldwide. This presentation will showcase the OpenQuake Platform, focusing on the IT solutions that have been adopted as well as the added value that the platform will bring to scientists worldwide.

  7. A Platform for Scalable Satellite and Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Beneke, C. M.; Skillman, S.; Warren, M. S.; Kelton, T.; Brumby, S. P.; Chartrand, R.; Mathis, M.

    2017-12-01

    At Descartes Labs, we use the commercial cloud to run global-scale machine learning applications over satellite imagery. We have processed over 5 Petabytes of public and commercial satellite imagery, including the full Landsat and Sentinel archives. By combining open-source tools with a FUSE-based filesystem for cloud storage, we have enabled a scalable compute platform that has demonstrated reading over 200 GB/s of satellite imagery into cloud compute nodes. In one application, we generated global 15m Landsat-8, 20m Sentinel-1, and 10m Sentinel-2 composites from 15 trillion pixels, using over 10,000 CPUs. We recently created a public open-source Python client library that can be used to query and access preprocessed public satellite imagery from within our platform, and made this platform available to researchers for non-commercial projects. In this session, we will describe how you can use the Descartes Labs Platform for rapid prototyping and scaling of geospatial analyses and demonstrate examples in land cover classification.
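
    As a purely illustrative sketch of the query-then-composite workflow described above (the client class and its methods below are hypothetical stand-ins, not the actual Descartes Labs client API), the pattern might look like this in Python:

```python
# Hypothetical sketch of a query-then-composite workflow. The ImageryClient
# class and its methods are illustrative stand-ins, not the real platform API.
import numpy as np

class ImageryClient:
    """Stand-in for a platform client that returns scenes as numpy arrays."""

    def search(self, aoi_geojson, products, start, end):
        # A real client would query a catalog service; here we simply
        # return a few random "scenes" of shape (rows, cols, bands).
        return [np.random.rand(256, 256, 4) for _ in range(6)]

def median_composite(scenes):
    """Reduce a stack of scenes (H x W x bands) to a per-pixel median composite."""
    return np.median(np.stack(scenes, axis=0), axis=0)

if __name__ == "__main__":
    client = ImageryClient()
    scenes = client.search(
        aoi_geojson={"type": "Polygon", "coordinates": []},
        products=["landsat-8"],
        start="2017-01-01",
        end="2017-12-31",
    )
    composite = median_composite(scenes)
    print(composite.shape)  # (256, 256, 4)
```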

  8. An ontology based information system for the management of institutional repository's collections

    NASA Astrophysics Data System (ADS)

    Tsolakidis, A.; Kakoulidis, P.; Skourlas, C.

    2015-02-01

    In this paper we discuss a simple methodological approach to create and customize institutional repositories for the domain of technological education. The use of the open source software platform DSpace is proposed to build the repository application and provide access to digital resources including research papers, dissertations, administrative documents, educational material, etc. The use of OWL ontologies is also proposed for indexing and accessing the various heterogeneous items stored in the repository. Customization and operation of a platform for the selection and use of terms, or parts of similar existing OWL ontologies, is also described. This platform could be based on the open source software Protégé, which supports OWL, is widely used, and also provides visualization, SPARQL querying, etc. The combined use of the OWL platform and the DSpace repository forms a basis for creating customized ontologies, accommodating the semantic metadata of items and facilitating searching.
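
    As a minimal sketch of how an OWL ontology could feed indexing terms into such a repository, the rdflib library can load an ontology and run a SPARQL query over it; the file name below is a hypothetical example:

```python
# Minimal sketch: query an OWL ontology with rdflib to obtain class labels
# that could serve as indexing terms for repository items.
# "education_ontology.owl" is a hypothetical file name.
from rdflib import Graph

g = Graph()
g.parse("education_ontology.owl")  # rdflib guesses RDF/XML from the .owl extension

query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
SELECT ?cls ?label WHERE {
    ?cls a owl:Class ;
         rdfs:label ?label .
}
"""

for cls, label in g.query(query):
    print(cls, label)  # candidate subject terms for DSpace metadata
```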

  9. An open-source platform to study uniaxial stress effects on nanoscale devices

    NASA Astrophysics Data System (ADS)

    Signorello, G.; Schraff, M.; Zellekens, P.; Drechsler, U.; Bürge, M.; Steinauer, H. R.; Heller, R.; Tschudy, M.; Riel, H.

    2017-05-01

    We present an automatic measurement platform that enables the characterization of nanodevices by electrical transport and optical spectroscopy as a function of the uniaxial stress. We provide insights into and detailed descriptions of the mechanical device, the substrate design and fabrication, and the instrument control software, which is provided under open-source license. The capability of the platform is demonstrated by characterizing the piezo-resistance of an InAs nanowire device using a combination of electrical transport and Raman spectroscopy. The advantages of this measurement platform are highlighted by comparison with state-of-the-art piezo-resistance measurements in InAs nanowires. We envision that the systematic application of this methodology will provide new insights into the physics of nanoscale devices and novel materials for electronics, and thus contribute to the assessment of the potential of strain as a technology booster for nanoscale electronics.

  10. 20170917 - A Cross-platform Format to Associate NMR-extracted data (NMReDATA) to Chemical Structures (SMASHNMR)

    EPA Science Inventory

    An open initiative involving cross-disciplinary contributors of computer-assisted structure elucidation (CASE), including methodology specialists, software and database developers and the editorial board of Magnetic Resonance in Chemistry, is addressing the old problem of reporti...

  11. Stories from OpenAQ, a Global and Grassroots Open Air Quality Community

    NASA Astrophysics Data System (ADS)

    Hasenkopf, C. A.; Flasher, J. C.; Veerman, O.; Scalamogna, A.; Silva, D.; Salmon, M.; Buuralda, D.; DeWitt, L. H.

    2016-12-01

    Air pollution, responsible for more deaths each year than HIV/AIDS and malaria combined, is a global public health crisis. Yet many scientific questions, including those directly relevant for policy, remain unanswered when it comes to the impact of air pollution on health in highly polluted environments. Often, specific solutions to improving air quality are local and sustained through public engagement, policy and monitoring. Both the overarching science of air quality and health and local solutions rely on access to reliable, timely air quality data. Over the past year, the OpenAQ community has opened up existing disparate air quality data in 24 countries through an open source platform (openaq.org) so that communities around the world can use it to advance science, public engagement, and policy. We will share stories of communities, from Delhi to Ulaanbaatar and from scientists to journalists, using open air quality data from our platform to advance their fight against air inequality. We will share recent research we have conducted on best practices for engaging different communities and building tools that enable the public to fully unleash the power of open air quality data to fight air inequality. The subsequent open-source tools (github.com/openaq) we have developed from this research and our entire data-sharing platform may be of interest to other open data communities.

  12. An open-source software platform for data management, visualisation, model building and model sharing in water, energy and other resource modelling domains.

    NASA Astrophysics Data System (ADS)

    Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.

    2015-12-01

    Capacity expansion on resource networks is essential to adapting to economic and population growth and to pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling, we have created a system that facilitates infrastructure modeling across various domains. We introduce Hydra Platform, free and open source software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer through which remote applications, called Apps, connect. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This client-server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardised description of the data structure required for storing network data so that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain; it is the Apps that provide domain-specific functionality. Using Apps, researchers from different domains can incorporate different models within the same network, enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based apps in other languages for remote functionality. Partner CH2M is developing a commercial user interface for Hydra Platform; however, custom interfaces and visualization tools can also be built. Hydra Platform is available on GitHub, while Apps will be shared on a central repository.
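
    To illustrate the node/link abstraction described above, the sketch below stores a domain-agnostic network as plain data and exports it to JSON for a domain-specific App to consume; this is not Hydra Platform's actual API, and the class and field names are hypothetical:

```python
# Illustrative sketch of a domain-agnostic node/link network exported to JSON.
# Not Hydra Platform's actual API; names are hypothetical.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Node:
    name: str
    attributes: dict = field(default_factory=dict)

@dataclass
class Link:
    source: str
    target: str
    attributes: dict = field(default_factory=dict)

@dataclass
class Network:
    name: str
    nodes: list
    links: list

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)

# A tiny water-supply example; an energy App could attach different attributes
net = Network(
    name="demo_water_network",
    nodes=[Node("reservoir", {"capacity_Mm3": 120}), Node("city", {"demand_Mm3": 80})],
    links=[Link("reservoir", "city", {"max_flow_Mm3": 100})],
)
print(net.to_json())
```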

  13. Software Tools for Development on the Peregrine System | High-Performance

    Science.gov Websites

    Computing | NREL. Software tools for development on the Peregrine system help users build and manage software at the source code level. Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.

  14. Real-time implementation of logo detection on open source BeagleBoard

    NASA Astrophysics Data System (ADS)

    George, M.; Kehtarnavaz, N.; Estevez, L.

    2011-03-01

    This paper presents the real-time implementation of our previously developed logo detection and tracking algorithm on the open source BeagleBoard mobile platform. This platform has an OMAP processor that incorporates an ARM Cortex processor. The algorithm combines Scale Invariant Feature Transform (SIFT) with k-means clustering, online color calibration and moment invariants to robustly detect and track logos in video. Various optimization steps that are carried out to allow the real-time execution of the algorithm on BeagleBoard are discussed. The results obtained are compared to the PC real-time implementation results.
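
    As a rough sketch of the SIFT-plus-k-means combination described above (not the authors' BeagleBoard implementation), OpenCV can extract descriptors and cluster them; OpenCV 4.4 or later is assumed, and "logo.png" is a placeholder path:

```python
# Sketch of SIFT feature extraction followed by k-means clustering of the
# descriptors, using OpenCV. Assumes OpenCV >= 4.4; "logo.png" is a placeholder.
import cv2
import numpy as np

img = cv2.imread("logo.png", cv2.IMREAD_GRAYSCALE)

# 1. Extract SIFT keypoints and descriptors
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)

# 2. Cluster the descriptors with k-means to form a compact signature
k = 8
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(
    np.float32(descriptors), k, None, criteria, 10, cv2.KMEANS_RANDOM_CENTERS
)

print(f"{len(keypoints)} keypoints clustered into {k} descriptor centers")
```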

  15. Open chemistry registry and mapping platform based on open source cheminformatics toolkits (ACS Fall meeting)

    EPA Science Inventory

    The Open PHACTS project (openphacts.org) is a European initiative, constituting a public–private partnership to enable easier, cheaper and faster drug discovery [1]. The project is supported by the Open PHACTS Foundation (www.openphactsfoundation.org) and funded by contributions f...

  16. OpenHealth Platform for Interactive Contextualization of Population Health Open Data.

    PubMed

    Almeida, Jonas S; Hajagos, Janos; Crnosija, Ivan; Kurc, Tahsin; Saltz, Mary; Saltz, Joel

    The financial incentives for data science applications leading to improved health outcomes, such as DSRIP (bit.ly/dsrip), are well-aligned with the broad adoption of Open Data by State and Federal agencies. This creates entirely novel opportunities for analytical applications that make exclusive use of the pervasive Web computing platform. The framework described here explores this new avenue to contextualize health data in a manner that relies exclusively on the native JavaScript interpreter and data processing resources of the ubiquitous Web browser. The OpenHealth platform is publicly available, with open source code and version control, at https://github.com/mathbiol/openHealth. The different data/analytics workflow architectures explored are accompanied by live applications ranging from DSRIP, such as Hospital Inpatient Prevention Quality Indicators at http://bit.ly/pqiSuffolk, to The Cancer Genome Atlas (TCGA), as illustrated by http://bit.ly/tcgascopeGBM.

  17. Open-WiSe: a solar powered wireless sensor network platform.

    PubMed

    González, Apolinar; Aquino, Raúl; Mata, Walter; Ochoa, Alberto; Saldaña, Pedro; Edwards, Arthur

    2012-01-01

    Because battery-powered nodes are required in wireless sensor networks and energy consumption represents an important design consideration, alternate energy sources are needed to provide more effective and optimal function. The main goal of this work is to present an energy harvesting wireless sensor network platform, the Open Wireless Sensor node (WiSe). The design and implementation of the solar powered wireless platform is described, including the hardware architecture, firmware, and a POSIX Real-Time Kernel. A sleep and wake-up strategy was implemented to prolong the lifetime of the wireless sensor network. This platform was developed as a tool for researchers investigating wireless sensor networks and for system integrators.

  18. LK Scripting Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The LK scripting language is a simple and fast computer programming language designed for easy integration with existing software to enable automation of tasks. The LK language is used by NREL’s System Advisor Model (SAM), the SAM Software Development Kit (SDK), and SolTrace products. LK is easily extensible and adaptable to new software due to its small footprint and is designed to be statically linked into other software. It is written in standard C++, is cross-platform (Windows, Linux, and OSX), and includes optional portions that enable direct integration with graphical user interfaces written in the open source C++ wxWidgets Version 3.0+ toolkit.

  19. The Unidata Integrated Data Viewer

    NASA Astrophysics Data System (ADS)

    Weber, W. J.; Ho, Y.

    2016-12-01

    The Unidata Integrated Data Viewer (IDV) is a free and open source virtual-globe software application that enables three-dimensional viewing of earth science data. The Unidata IDV is data agnostic and can display and analyze disparate data in a single view. This capability facilitates cross-discipline research and allows multiple observation platforms to be displayed simultaneously for any given event. The Unidata IDV is a mature application, written in Java, that has been serving the earth science community for over 15 years. This demonstration will focus on near-real-time global satellite observations, the integration of the COSMIC radio occultation data set that profiles the atmosphere, and high-resolution numerical weather prediction.

  20. OSIRIX: open source multimodality image navigation software

    NASA Astrophysics Data System (ADS)

    Rosset, Antoine; Pysher, Lance; Spadola, Luca; Ratib, Osman

    2005-04-01

    The goal of our project is to develop a completely new software platform that will allow users to efficiently and conveniently navigate through large sets of multidimensional data without the need for high-end, expensive hardware or software. We also elected to develop our system on new open source software libraries, allowing other institutions and developers to contribute to this project. OsiriX is a free and open-source imaging software application designed to manipulate and visualize large sets of medical images: http://homepage.mac.com/rossetantoine/osirix/

  1. Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology.

    PubMed

    Siegle, Joshua H; López, Aarón Cuevas; Patel, Yogi A; Abramov, Kirill; Ohayon, Shay; Voigts, Jakob

    2017-08-01

    Closed-loop experiments, in which causal interventions are conditioned on the state of the system under investigation, have become increasingly common in neuroscience. Such experiments can have a high degree of explanatory power, but they require a precise implementation that can be difficult to replicate across laboratories. We sought to overcome this limitation by building open-source software that makes it easier to develop and share algorithms for closed-loop control. We created the Open Ephys GUI, an open-source platform for multichannel electrophysiology experiments. In addition to the standard 'open-loop' visualization and recording functionality, the GUI also includes modules for delivering feedback in response to events detected in the incoming data stream. Importantly, these modules can be built and shared as plugins, which makes it possible for users to extend the functionality of the GUI through a simple API, without having to understand the inner workings of the entire application. In combination with low-cost, open-source hardware for amplifying and digitizing neural signals, the GUI has been used for closed-loop experiments that perturb the hippocampal theta rhythm in a phase-specific manner. The Open Ephys GUI is the first widely used application for multichannel electrophysiology that leverages a plugin-based workflow. We hope that it will lower the barrier to entry for electrophysiologists who wish to incorporate real-time feedback into their research.

  2. Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology

    NASA Astrophysics Data System (ADS)

    Siegle, Joshua H.; Cuevas López, Aarón; Patel, Yogi A.; Abramov, Kirill; Ohayon, Shay; Voigts, Jakob

    2017-08-01

    Objective. Closed-loop experiments, in which causal interventions are conditioned on the state of the system under investigation, have become increasingly common in neuroscience. Such experiments can have a high degree of explanatory power, but they require a precise implementation that can be difficult to replicate across laboratories. We sought to overcome this limitation by building open-source software that makes it easier to develop and share algorithms for closed-loop control. Approach. We created the Open Ephys GUI, an open-source platform for multichannel electrophysiology experiments. In addition to the standard ‘open-loop’ visualization and recording functionality, the GUI also includes modules for delivering feedback in response to events detected in the incoming data stream. Importantly, these modules can be built and shared as plugins, which makes it possible for users to extend the functionality of the GUI through a simple API, without having to understand the inner workings of the entire application. Main results. In combination with low-cost, open-source hardware for amplifying and digitizing neural signals, the GUI has been used for closed-loop experiments that perturb the hippocampal theta rhythm in a phase-specific manner. Significance. The Open Ephys GUI is the first widely used application for multichannel electrophysiology that leverages a plugin-based workflow. We hope that it will lower the barrier to entry for electrophysiologists who wish to incorporate real-time feedback into their research.

  3. Increasing the impact of medical image computing using community-based open-access hackathons: The NA-MIC and 3D Slicer experience.

    PubMed

    Kapur, Tina; Pieper, Steve; Fedorov, Andriy; Fillion-Robin, J-C; Halle, Michael; O'Donnell, Lauren; Lasso, Andras; Ungi, Tamas; Pinter, Csaba; Finet, Julien; Pujol, Sonia; Jagadeesan, Jayender; Tokuda, Junichi; Norton, Isaiah; Estepar, Raul San Jose; Gering, David; Aerts, Hugo J W L; Jakab, Marianna; Hata, Nobuhiko; Ibanez, Luiz; Blezek, Daniel; Miller, Jim; Aylward, Stephen; Grimson, W Eric L; Fichtinger, Gabor; Wells, William M; Lorensen, William E; Schroeder, Will; Kikinis, Ron

    2016-10-01

    The National Alliance for Medical Image Computing (NA-MIC) was launched in 2004 with the goal of investigating and developing an open source software infrastructure for the extraction of information and knowledge from medical images using computational methods. Several leading research and engineering groups participated in this effort that was funded by the US National Institutes of Health through a variety of infrastructure grants. This effort transformed 3D Slicer from an internal, Boston-based, academic research software application into a professionally maintained, robust, open source platform with an international leadership and developer and user communities. Critical improvements to the widely used underlying open source libraries and tools (VTK, ITK, CMake, CDash, DCMTK) were an additional consequence of this effort. This project has contributed to close to a thousand peer-reviewed publications and a growing portfolio of US and international funded efforts expanding the use of these tools in new medical computing applications every year. In this editorial, we discuss what we believe are gaps in the way medical image computing is pursued today; how a well-executed research platform can enable discovery, innovation and reproducible science ("Open Science"); and how our quest to build such a software platform has evolved into a productive and rewarding social engineering exercise in building an open-access community with a shared vision. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Design, implementation and practice of JBEI-ICE: an open source biological part registry platform and tools.

    PubMed

    Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D

    2012-10-01

    The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICEs) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allows automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. As a web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.

  5. Mousetrap: An integrated, open-source mouse-tracking package.

    PubMed

    Kieslich, Pascal J; Henninger, Felix

    2017-10-01

    Mouse-tracking - the analysis of mouse movements in computerized experiments - is becoming increasingly popular in the cognitive sciences. Mouse movements are taken as an indicator of commitment to or conflict between choice options during the decision process. Using mouse-tracking, researchers have gained insight into the temporal development of cognitive processes across a growing number of psychological domains. In the current article, we present software that offers easy and convenient means of recording and analyzing mouse movements in computerized laboratory experiments. In particular, we introduce and demonstrate the mousetrap plugin that adds mouse-tracking to OpenSesame, a popular general-purpose graphical experiment builder. By integrating with this existing experimental software, mousetrap allows for the creation of mouse-tracking studies through a graphical interface, without requiring programming skills. Thus, researchers can benefit from the core features of a validated software package and the many extensions available for it (e.g., the integration with auxiliary hardware such as eye-tracking, or the support of interactive experiments). In addition, the recorded data can be imported directly into the statistical programming language R using the mousetrap package, which greatly facilitates analysis. Mousetrap is cross-platform, open-source and available free of charge from https://github.com/pascalkieslich/mousetrap-os .

  6. OpenAQ: A Platform to Aggregate and Freely Share Global Air Quality Data

    NASA Astrophysics Data System (ADS)

    Hasenkopf, C. A.; Flasher, J. C.; Veerman, O.; DeWitt, H. L.

    2015-12-01

    Thousands of ground-based air quality monitors around the world publicly publish real-time air quality data; however, researchers and the public do not have access to this information in the ways most useful to them. Often, air quality data are posted on obscure websites showing only current values, are programmatically inaccessible, and/or are in inconsistent data formats across sites. Yet, historical and programmatic access to such a global dataset would be transformative to several scientific fields, from epidemiology to low-cost sensor technologies to estimates of ground-level aerosol by satellite retrievals. To increase accessibility and standardize this disparate dataset, we have built OpenAQ, an innovative, open platform created by a group of scientists and open data programmers. The source code for the platform is viewable at github.com/openaq. Currently, we are aggregating, storing, and making publicly available real-time air quality data (PM2.5, PM10, SO2, NO2, and O3) via an Application Program Interface (API). We will present the OpenAQ platform, which currently has the following specific capabilities: (1) a continuous ingest mechanism for some of the most polluted cities, generalizable to more sources; (2) an API providing data querying, including the ability to filter by location, measurement type, value, and date, as well as custom sort options; and (3) a generalized, chart-based visualization tool to explore data accessible via the API. At this stage, we are seeking wider participation and input from multiple research communities in expanding our data retrieval sites, standardizing our protocols, receiving feedback on quality issues, and creating tools that can be built on top of this open platform.
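
    As a minimal sketch of querying the API with the filters listed above, the request below uses the platform's public v1 endpoint and parameter names as we understand them; they are assumptions and may differ in later API versions:

```python
# Minimal sketch of an OpenAQ measurements query. Endpoint and parameter
# names are assumptions based on the public v1 API and may have changed.
import requests

resp = requests.get(
    "https://api.openaq.org/v1/measurements",
    params={
        "city": "Delhi",
        "parameter": "pm25",
        "date_from": "2015-10-01",
        "date_to": "2015-10-31",
        "limit": 100,
    },
    timeout=30,
)
resp.raise_for_status()

for m in resp.json().get("results", []):
    print(m["date"]["utc"], m["parameter"], m["value"], m["unit"])
```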

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apte, A; Veeraraghavan, H; Oh, J

    Purpose: To present an open source and free platform to facilitate radiomics research — the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various image modalities such as CT, PET, MR, SPECT and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features such as first-order statistics, gray-scale co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and the data management are performed with Matlab for ease of development and readability of code and features for a wide audience. Open-source software developed with other programming languages is integrated to enhance various components of this toolbox, for example Java-based DCM4CHE for import of DICOM and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open source, GNU-copyrighted software. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC. The analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.

  8. Research on Visualization of Ground Laser Radar Data Based on Osg

    NASA Astrophysics Data System (ADS)

    Huang, H.; Hu, C.; Zhang, F.; Xue, H.

    2018-04-01

    Three-dimensional (3D) laser scanning is a new technology integrating optics, mechanics, electronics, and computing. It can scan the complete shape of spatial objects with high precision. With this technology, point cloud data of a ground object can be collected directly and its structure reconstructed for rendering. A capable 3D rendering engine is required to optimize and display such models in order to meet the demands of real-time, realistic rendering of complex scenes. OpenSceneGraph (OSG) is an open source 3D graphics engine. Compared with current mainstream 3D rendering engines, OSG is practical, economical, and easy to extend. Therefore, OSG is widely used in the fields of virtual simulation, virtual reality, and science and engineering visualization. In this paper, a dynamic and interactive ground LiDAR data visualization platform is constructed based on OSG and the cross-platform C++ application development framework Qt. For point cloud data in .txt format and triangulation network data in .obj format, the platform implements display of both 3D laser point clouds and triangulation networks. Experiments show that the platform is easy to operate, provides good interaction, and is of strong practical value.

  9. Open Targets: a platform for therapeutic target identification and validation

    PubMed Central

    Koscielny, Gautier; An, Peter; Carvalho-Silva, Denise; Cham, Jennifer A.; Fumis, Luca; Gasparyan, Rippa; Hasan, Samiul; Karamanis, Nikiforos; Maguire, Michael; Papa, Eliseo; Pierleoni, Andrea; Pignatelli, Miguel; Platt, Theo; Rowland, Francis; Wankar, Priyanka; Bento, A. Patrícia; Burdett, Tony; Fabregat, Antonio; Forbes, Simon; Gaulton, Anna; Gonzalez, Cristina Yenyxe; Hermjakob, Henning; Hersey, Anne; Jupe, Steven; Kafkas, Şenay; Keays, Maria; Leroy, Catherine; Lopez, Francisco-Javier; Magarinos, Maria Paula; Malone, James; McEntyre, Johanna; Munoz-Pomer Fuentes, Alfonso; O'Donovan, Claire; Papatheodorou, Irene; Parkinson, Helen; Palka, Barbara; Paschall, Justin; Petryszak, Robert; Pratanwanich, Naruemon; Sarntivijal, Sirarat; Saunders, Gary; Sidiropoulos, Konstantinos; Smith, Thomas; Sondka, Zbyslaw; Stegle, Oliver; Tang, Y. Amy; Turner, Edward; Vaughan, Brendan; Vrousgou, Olga; Watkins, Xavier; Martin, Maria-Jesus; Sanseau, Philippe; Vamathevan, Jessica; Birney, Ewan; Barrett, Jeffrey; Dunham, Ian

    2017-01-01

    We have designed and developed a data integration and visualization platform that provides evidence about the association of known and potential drug targets with diseases. The platform is designed to support identification and prioritization of biological targets for follow-up. Each drug target is linked to a disease using integrated genome-wide data from a broad range of data sources. The platform provides either a target-centric workflow to identify diseases that may be associated with a specific target, or a disease-centric workflow to identify targets that may be associated with a specific disease. Users can easily transition between these target- and disease-centric workflows. The Open Targets Validation Platform is accessible at https://www.targetvalidation.org. PMID:27899665

  10. A New Parallel Approach for Accelerating the GPU-Based Execution of Edge Detection Algorithms

    PubMed Central

    Emrani, Zahra; Bateni, Soroosh; Rabbani, Hossein

    2017-01-01

    Real-time image processing is used in a wide variety of applications, such as those in medical care and industrial processes. In medical care, this technique makes it possible to display important patient information graphically, which can supplement and help the treatment process. Medical decisions made based on real-time images are more accurate and reliable. According to recent research, graphics processing unit (GPU) programming is a useful method for improving the speed and quality of medical image processing and is one way of achieving real-time image processing. Edge detection is an early stage in most image processing methods for the extraction of features and object segments from a raw image. The Canny method, Sobel and Prewitt filters, and the Roberts’ Cross technique are some examples of edge detection algorithms that are widely used in image processing and machine vision. In this work, these algorithms are implemented using the Compute Unified Device Architecture (CUDA), Open Source Computer Vision (OpenCV), and Matrix Laboratory (MATLAB) platforms. An existing parallel method for the Canny approach has been further modified to run in a fully parallel manner. This has been achieved by replacing the breadth-first search procedure with a parallel method. These algorithms have been compared by testing them on a database of optical coherence tomography images. The comparison of results shows that the proposed implementation of the Canny method on GPU using the CUDA platform improves the speed of execution by 2–100× compared to the central processing unit-based implementation using the OpenCV and MATLAB platforms. PMID:28487831

  11. A New Parallel Approach for Accelerating the GPU-Based Execution of Edge Detection Algorithms.

    PubMed

    Emrani, Zahra; Bateni, Soroosh; Rabbani, Hossein

    2017-01-01

    Real-time image processing is used in a wide variety of applications, such as those in medical care and industrial processes. In medical care, this technique makes it possible to display important patient information graphically, which can supplement and help the treatment process. Medical decisions made based on real-time images are more accurate and reliable. According to recent research, graphics processing unit (GPU) programming is a useful method for improving the speed and quality of medical image processing and is one way of achieving real-time image processing. Edge detection is an early stage in most image processing methods for the extraction of features and object segments from a raw image. The Canny method, Sobel and Prewitt filters, and the Roberts' Cross technique are some examples of edge detection algorithms that are widely used in image processing and machine vision. In this work, these algorithms are implemented using the Compute Unified Device Architecture (CUDA), Open Source Computer Vision (OpenCV), and Matrix Laboratory (MATLAB) platforms. An existing parallel method for the Canny approach has been further modified to run in a fully parallel manner. This has been achieved by replacing the breadth-first search procedure with a parallel method. These algorithms have been compared by testing them on a database of optical coherence tomography images. The comparison of results shows that the proposed implementation of the Canny method on GPU using the CUDA platform improves the speed of execution by 2-100× compared to the central processing unit-based implementation using the OpenCV and MATLAB platforms.
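
    For readers who want a CPU-side baseline of the edge detectors discussed above, the OpenCV calls below apply the Canny and Sobel operators to a single image; "oct_scan.png" is a placeholder path and the thresholds are illustrative, not the parameters used in the paper:

```python
# CPU-side Canny and Sobel edge detection with OpenCV; the GPU/CUDA variant
# from the paper is not reproduced here. Thresholds are illustrative.
import cv2

img = cv2.imread("oct_scan.png", cv2.IMREAD_GRAYSCALE)

# Canny: lower/upper hysteresis thresholds chosen arbitrarily for illustration
edges_canny = cv2.Canny(img, 50, 150)

# Sobel: horizontal and vertical gradients combined by magnitude
gx = cv2.Sobel(img, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(img, cv2.CV_64F, 0, 1, ksize=3)
edges_sobel = cv2.convertScaleAbs(cv2.magnitude(gx, gy))

cv2.imwrite("edges_canny.png", edges_canny)
cv2.imwrite("edges_sobel.png", edges_sobel)
```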

  12. KNIME for reproducible cross-domain analysis of life science data.

    PubMed

    Fillbrunn, Alexander; Dietz, Christian; Pfeuffer, Julianus; Rahn, René; Landrum, Gregory A; Berthold, Michael R

    2017-11-10

    Experiments in the life sciences often involve tools from a variety of domains such as mass spectrometry, next-generation sequencing, or image processing. Passing data between those tools often involves complex scripts for controlling data flow, data transformation, and statistical analysis. Such scripts are not only prone to be platform dependent, they also tend to grow as the experiment progresses and are seldom well documented, a fact that hinders the reproducibility of the experiment. Workflow systems such as KNIME Analytics Platform aim to solve these problems by providing a platform for connecting tools graphically and guaranteeing the same results on different operating systems. As open source software, KNIME allows scientists and programmers to provide their own extensions to the scientific community. In this review paper we present selected extensions from the life sciences that simplify data exploration, analysis, and visualization and are interoperable due to KNIME's unified data model. Additionally, we name other workflow systems that are commonly used in the life sciences and highlight their similarities and differences to KNIME. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Development and validation of an open source quantification tool for DSC-MRI studies.

    PubMed

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms to allow external developers to implement their own quantification methods easily and without the need of paying for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold-standard was obtained (R² > 0.8 and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated using a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peffer, Therese; Blumstein, Carl; Culler, David

    The Project uses state-of-the-art computer science to extend the benefits of Building Automation Systems (BAS), typically found in large buildings (>100,000 square feet), to medium-sized commercial buildings (<50,000 sq ft). The BAS developed in this project, termed OpenBAS, uses an open-source and open software architecture platform, user interface, and plug-and-play control devices to facilitate adoption of energy efficiency strategies in the commercial building sector throughout the United States. At the heart of this “turn key” BAS is the platform with three types of controllers—thermostat, lighting controller, and general controller—that are easily “discovered” by the platform in a plug-and-play fashion. The user interface showcases the platform and provides the control system set-up, system status display, and a means of automatically mapping the control points in the system.

  15. PRGdb: a bioinformatics platform for plant resistance gene analysis

    PubMed Central

    Sanseverino, Walter; Roma, Guglielmo; De Simone, Marco; Faino, Luigi; Melito, Sara; Stupka, Elia; Frusciante, Luigi; Ercolano, Maria Raffaella

    2010-01-01

    PRGdb is a web-accessible open-source (http://www.prgdb.org) database that represents the first bioinformatic resource providing a comprehensive overview of resistance genes (R-genes) in plants. PRGdb holds more than 16 000 known and putative R-genes belonging to 192 plant species challenged by 115 different pathogens and linked with useful biological information. The complete database includes a set of 73 manually curated reference R-genes, 6308 putative R-genes collected from NCBI and 10463 computationally predicted putative R-genes. Thanks to a user-friendly interface, data can be examined using different query tools. A home-made prediction pipeline called Disease Resistance Analysis and Gene Orthology (DRAGO), based on reference R-gene sequence data, was developed to search for plant resistance genes in public datasets such as Unigene and GenBank. New putative R-gene classes containing unknown domain combinations were discovered and characterized. The development of the PRG platform represents an important starting point to conduct various experimental tasks. The inferred cross-link between genomic and phenotypic information allows access to a large body of information to find answers to several biological questions. The database structure also permits easy integration with other data types and opens up prospects for future implementations. PMID:19906694

  16. Open-WiSe: A Solar Powered Wireless Sensor Network Platform

    PubMed Central

    González, Apolinar; Aquino, Raúl; Mata, Walter; Ochoa, Alberto; Saldaña, Pedro; Edwards, Arthur

    2012-01-01

    Because battery-powered nodes are required in wireless sensor networks and energy consumption represents an important design consideration, alternate energy sources are needed to provide more effective and optimal function. The main goal of this work is to present an energy harvesting wireless sensor network platform, the Open Wireless Sensor node (WiSe). The design and implementation of the solar powered wireless platform is described, including the hardware architecture, firmware, and a POSIX Real-Time Kernel. A sleep and wake-up strategy was implemented to prolong the lifetime of the wireless sensor network. This platform was developed as a tool for researchers investigating wireless sensor networks and for system integrators. PMID:22969396

  17. The Role of Semantics in Open-World, Integrative, Collaborative Science Data Platforms

    NASA Astrophysics Data System (ADS)

    Fox, Peter; Chen, Yanning; Wang, Han; West, Patrick; Erickson, John; Ma, Marshall

    2014-05-01

    As collaborative science spreads into more and more Earth and space science fields, both participants and funders are expressing stronger needs for highly functional data and information capabilities. Characteristics include a) easy to use, b) highly integrated, c) leverage investments, d) accommodate rapid technical change, and e) do not incur undue expense or time to build or maintain - this is not a small set of requirements. Based on our accumulated experience over the last ~decade and several key technical approaches, we adapt, extend, and integrate several open source applications and frameworks to handle major portions of functionality for these platforms. This includes an object-type repository, collaboration tools, and identity management, all within a portal managing diverse content and applications. In this contribution, we present our methods and results of information models, adaptation, integration and evolution of a networked data science architecture based on several open source technologies (Drupal, VIVO, the Comprehensive Knowledge Archive Network (CKAN), and the Global Handle System (GHS)). In particular we present the Deep Carbon Observatory - a platform for international science collaboration. We present and discuss key functional and non-functional attributes, and discuss the general applicability of the platform.

  18. OpenDrop: An Integrated Do-It-Yourself Platform for Personal Use of Biochips

    PubMed Central

    Alistar, Mirela; Gaudenz, Urs

    2017-01-01

    Biochips, or digital labs-on-chip, are developed with the purpose of being used by laboratory technicians or biologists in laboratories or clinics. In this article, we expand this vision with the goal of enabling everyone, regardless of their expertise, to use biochips for their own personal purposes. We developed OpenDrop, an integrated electromicrofluidic platform that allows users to develop and program their own bio-applications. We address the main challenges that users may encounter: accessibility, bio-protocol design and interaction with microfluidics. OpenDrop consists of a do-it-yourself biochip, an automated software tool with visual interface and a detailed technique for at-home operations of microfluidics. We report on two years of use of OpenDrop, released as an open-source platform. Our platform attracted a highly diverse user base with participants originating from maker communities, academia and industry. Our findings show that 47% of attempts to replicate OpenDrop were successful, the main challenge remaining the assembly of the device. In terms of usability, the users managed to operate their platforms at home and are working on designing their own bio-applications. Our work provides a step towards a future in which everyone will be able to create microfluidic devices for their personal applications, thereby democratizing parts of health care. PMID:28952524

  19. tranSMART: An Open Source and Community-Driven Informatics and Data Sharing Platform for Clinical and Translational Research.

    PubMed

    Athey, Brian D; Braxenthaler, Michael; Haas, Magali; Guo, Yike

    2013-01-01

    tranSMART is an emerging global open source public-private partnership community developing a comprehensive informatics-based analysis and data-sharing cloud platform for clinical and translational research. The tranSMART consortium includes pharmaceutical and other companies, not-for-profits, academic entities, patient advocacy groups, and government stakeholders. The tranSMART value proposition relies on the concept that the global community of users, developers, and stakeholders is the best source of innovation for applications and for useful data. Continued development and use of the tranSMART platform will create a means to enable "pre-competitive" data sharing broadly, saving money and potentially accelerating the translation of research into cures. Significant transformative effects of tranSMART include (1) allowing its entire user community to benefit from experts globally, (2) capturing the best of innovation in analytic tools, (3) a growing 'big data' resource, (4) convergent standards, and (5) new informatics-enabled translational science in the pharma, academic, and not-for-profit sectors.

  20. Bioclipse: an open source workbench for chemo- and bioinformatics.

    PubMed

    Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S

    2007-02-22

    There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is totally open for both open source plugins as well as commercial ones. Bioclipse is freely available at http://www.bioclipse.net.

  1. A Platform for Innovation and Standards Evaluation: a Case Study from the OpenMRS Open-Source Radiology Information System.

    PubMed

    Gichoya, Judy W; Kohli, Marc; Ivange, Larry; Schmidt, Teri S; Purkayastha, Saptarshi

    2018-05-10

    Open-source development can provide a platform for innovation by seeking feedback from community members as well as providing tools and infrastructure to test new standards. Vendors of proprietary systems may delay adoption of new standards until there are sufficient incentives, such as legal mandates or financial rewards, to encourage or mandate adoption. Moreover, open-source systems in healthcare have been widely adopted in low- and middle-income countries and can be used to bridge gaps that exist in global health radiology. Since 2011, the authors, along with a community of open-source contributors, have worked on developing an open-source radiology information system (RIS) across two communities, OpenMRS and LibreHealth. The main purpose of the RIS is to implement core radiology workflows, on which others can build and test new radiology standards. This work has resulted in three major releases of the system, with current architectural changes driven by changing technology, development of new standards in health and imaging informatics, and changing user needs. At their core, both these communities are focused on building general-purpose EHR systems, but based on user contributions from the fringes, we have been able to create an innovative system that has been used by hospitals and clinics in four different countries. We provide an overview of the history of the LibreHealth RIS and the architecture of the system, summarize standards integration, describe the challenges of developing an open-source product, and outline future directions. Our goal is to attract more participation and involvement to further develop the LibreHealth RIS into an Enterprise Imaging System that can be used in other clinical imaging including pathology and dermatology.

  2. OpenICE medical device interoperability platform overview and requirement analysis.

    PubMed

    Arney, David; Plourde, Jeffrey; Goldman, Julian M

    2018-02-23

    We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.

  3. OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark

    NASA Astrophysics Data System (ADS)

    Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.

    2015-02-01

    We developed a practical method to accelerate execution of the Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler on the Ubuntu 12.04 LTS Linux platform and newly named iOMP-SWAT in this study. GNU utilities of make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and check the identicalness of parallel and serial simulations. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof, and later modified using the Open Multi-Processing (OpenMP) library on an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets with a large number of hydrological response units. As we specifically focus on acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.

  4. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.

  5. An open-source wireless sensor stack: from Arduino to SDI-12 to Water One Flow

    NASA Astrophysics Data System (ADS)

    Hicks, S.; Damiano, S. G.; Smith, K. M.; Olexy, J.; Horsburgh, J. S.; Mayorga, E.; Aufdenkampe, A. K.

    2013-12-01

    Implementing a large-scale streaming environmental sensor network has previously been limited by the high cost of the datalogging and data communication infrastructure. The Christina River Basin Critical Zone Observatory (CRB-CZO) is overcoming the obstacles to large near-real-time data collection networks by using Arduino, an open source electronics platform, in combination with XBee ZigBee wireless radio modules. These extremely low-cost and easy-to-use open source electronics are at the heart of the new DIY movement and have provided solutions to countless projects by over half a million users worldwide. However, their use in environmental sensing is in its infancy. At present a primary limitation to widespread deployment of open-source electronics for environmental sensing is the lack of a simple, open-source software stack to manage streaming data from heterogeneous sensor networks. Here we present a functioning prototype software stack that receives sensor data over a self-meshing ZigBee wireless network from over a hundred sensors, stores the data locally and serves it on demand as a CUAHSI Water One Flow (WOF) web service. We highlight a few new, innovative components, including: (1) a versatile open data logger design based on the Arduino electronics platform and ZigBee radios; (2) a software library implementing the SDI-12 communication protocol between any Arduino platform and SDI-12-enabled sensors without the need for additional hardware (https://github.com/StroudCenter/Arduino-SDI-12); and (3) 'midStream', a light-weight set of Python code that receives streaming sensor data, appends it with metadata on the fly by querying a relational database structured on an early version of the Observations Data Model version 2.0 (ODM2), and uses the WOFpy library to serve the data as WaterML via SOAP and REST web services.
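
    As a purely illustrative sketch of the "append metadata on the fly" step described above, an incoming reading (sensor id, timestamp, value) can be joined with site and variable metadata before being handed to the web service layer; the message format and metadata table here are hypothetical, not midStream's actual schema:

```python
# Illustrative metadata-append step for streaming sensor readings.
# Message format and metadata table are hypothetical, not midStream's schema.
from datetime import datetime, timezone

# Stand-in for the relational metadata store (ODM2-like)
SENSOR_METADATA = {
    "node07-ctd": {"site": "White Clay Creek", "variable": "Specific conductance",
                   "units": "uS/cm", "method": "SDI-12"},
}

def annotate(raw_line: str) -> dict:
    """Parse 'sensor_id,unix_time,value' and attach metadata."""
    sensor_id, ts, value = raw_line.strip().split(",")
    meta = SENSOR_METADATA.get(sensor_id, {})
    return {
        "sensor_id": sensor_id,
        "timestamp_utc": datetime.fromtimestamp(int(ts), tz=timezone.utc).isoformat(),
        "value": float(value),
        **meta,
    }

print(annotate("node07-ctd,1380600000,412.5"))
```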

  6. Sensing Slow Mobility and Interesting Locations for Lombardy Region (italy): a Case Study Using Pointwise Geolocated Open Data

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Oxoli, D.; Zurbarán, M. A.

    2016-06-01

    During the past years, Web 2.0 technologies have led to the emergence of platforms where users can share data related to their activities, which in some cases are then publicly released with open licenses. Popular examples include community platforms where users can upload GPS tracks collected during slow travel activities (e.g. hiking, biking and horse riding) and platforms where users share their geolocated photos. However, due to the high heterogeneity of the information available on the Web, using this user-generated content alone makes it an ambitious challenge to understand slow mobility flows as well as to detect the most visited locations in a region. Exploiting the data available on community sharing websites makes it possible to collect near real-time open data streams and enables rigorous spatial-temporal analysis. This work presents an approach for collecting, unifying and analysing pointwise geolocated open data available from different sources with the aim of identifying the main locations and destinations of slow mobility activities. For this purpose, we collected pointwise open data from the Wikiloc platform, Twitter, Flickr and Foursquare. The analysis was confined to data uploaded in the Lombardy Region (Northern Italy), corresponding to millions of pointwise data. The collected data were processed with Free and Open Source Software (FOSS) and organized into a suitable database. This allowed us to run statistical analyses on the data distribution in both time and space, enabling the detection of users' slow mobility preferences as well as places of interest at a regional scale.
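
    A minimal sketch of the kind of spatio-temporal aggregation described here, assuming the heterogeneous sources have already been unified into records with latitude, longitude, and timestamp fields (the field names and grid size are illustrative, not those used in the study):

        from collections import Counter
        from datetime import datetime

        def grid_cell(lat, lon, cell_deg=0.05):
            """Snap a point to a coarse latitude/longitude grid cell."""
            return (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)

        def hotspots(records, top=3):
            """Count points per (grid cell, month) and return the busiest combinations."""
            counts = Counter()
            for r in records:
                month = datetime.fromisoformat(r["time"]).strftime("%Y-%m")
                counts[(grid_cell(r["lat"], r["lon"]), month)] += 1
            return counts.most_common(top)

        # Hypothetical unified records from Wikiloc, Flickr, Twitter and Foursquare
        sample = [
            {"lat": 45.852, "lon": 9.392, "time": "2015-07-12T10:05:00"},
            {"lat": 45.851, "lon": 9.391, "time": "2015-07-12T11:40:00"},
            {"lat": 45.464, "lon": 9.190, "time": "2015-07-13T09:00:00"},
        ]
        print(hotspots(sample))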

  7. Developing and Benchmarking Native Linux Applications on Android

    NASA Astrophysics Data System (ADS)

    Batyuk, Leonid; Schmidt, Aubrey-Derrick; Schmidt, Hans-Gunther; Camtepe, Ahmet; Albayrak, Sahin

    Smartphones are becoming increasingly popular as more and more smartphone platforms emerge. Special attention has been gained by the open source platform Android, presented by the Open Handset Alliance (OHA), whose members include Google, Motorola, and HTC. Android uses a Linux kernel and a stripped-down userland with a custom Java VM set on top. The resulting system joins the advantages of both environments, although third parties are currently expected to develop only Java applications.

  8. Crowd Sourcing for Challenging Technical Problems and Business Model

    NASA Technical Reports Server (NTRS)

    Davis, Jeffrey R.; Richard, Elizabeth

    2011-01-01

    Crowd sourcing may be defined as the act of outsourcing tasks that are traditionally performed by an employee or contractor to an undefined, generally large group of people or community (a crowd) in the form of an open call. The open call may be issued by an organization wishing to find a solution to a particular problem or complete a task, or by an open innovation service provider on behalf of that organization. In 2008, the Space Life Sciences Directorate (SLSD), with the support of Wyle Integrated Science and Engineering, established and implemented pilot projects in open innovation (crowd sourcing) to determine if these new internet-based platforms could indeed find solutions to difficult technical challenges. These unsolved technical problems were converted to problem statements, also called "Challenges" or "Technical Needs" by the various open innovation service providers, and were then posted externally to seek solutions. In addition, an open call was issued internally to NASA employees Agency-wide (10 Field Centers and NASA HQ) using an open innovation service provider crowd sourcing platform to post NASA challenges from each Center for the others to propose solutions. From 2008 to 2010, the SLSD issued 34 challenges, 14 externally and 20 internally. The 14 external problems or challenges were posted through three different vendors: InnoCentive, Yet2.com and TopCoder. The 20 internal challenges were conducted using the InnoCentive crowd sourcing platform designed for internal use by an organization. This platform was customized for NASA use and promoted as NASA@Work. The results were significant. Of the seven InnoCentive external challenges, two full and five partial awards were made in complex technical areas such as predicting solar flares and long-duration food packaging. Similarly, the TopCoder challenge yielded an optimization algorithm for designing a lunar medical kit. The Yet2.com challenges yielded many new industry and academic contacts in bone imaging, microbial detection and even the use of pharmaceuticals for radiation protection. The internal challenges through NASA@Work drew over 6000 participants across all NASA centers. Challenges conducted by each NASA center elicited ideas and solutions from several other NASA centers and demonstrated rapid and efficient participation from employees at multiple centers to contribute to problem solving. Finally, on January 19, 2011, the SLSD conducted a workshop on open collaboration and innovation strategies and best practices through the newly established NASA Human Health and Performance Center (NHHPC). Initial projects will be described leading to a new business model for SLSD.

  9. Viewpoints: A New Computer Program for Interactive Exploration of Large Multivariate Space Science and Astrophysics Data.

    NASA Astrophysics Data System (ADS)

    Levit, Creon; Gazis, P.

    2006-06-01

    The graphics processing units (GPUs) built into all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform (Windows, Linux, Apple OS X) application which leverages some of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application area is the interactive analysis of complex, multivariate space science and astrophysics data sets, with dimensionalities that may surpass 100 and sample sizes that may range from 10^6 to 10^8.

  10. FreeTure: A Free software to capTure meteors for FRIPON

    NASA Astrophysics Data System (ADS)

    Audureau, Yoan; Marmo, Chiara; Bouley, Sylvain; Kwon, Min-Kyung; Colas, François; Vaubaillon, Jérémie; Birlan, Mirel; Zanda, Brigitte; Vernazza, Pierre; Caminade, Stephane; Gattecceca, Jérôme

    2014-02-01

    The Fireball Recovery and Interplanetary Observation Network (FRIPON) is a French project started in 2014 which will monitor the sky, using 100 all-sky cameras, to detect meteors and to retrieve the related meteorites on the ground. Several meteor detection software packages already exist; some are proprietary, and some are hardware dependent. We present here the open source software for meteor detection to be installed on the FRIPON network's stations. The software will run on Linux with gigabit Ethernet cameras, and we plan to make it cross-platform. This paper focuses on the meteor detection method used for the pipeline development and on the present capabilities.
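
    The abstract does not spell out the detection algorithm, but a common starting point for this class of software is temporal frame differencing with a brightness threshold. The sketch below is a generic illustration of that idea using NumPy and is not the FRIPON/FreeTure implementation.

        import numpy as np

        def changed_pixels(prev_frame, frame, threshold=25):
            """Return a boolean mask of pixels whose brightness jumped between frames."""
            diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
            return diff > threshold

        def looks_like_event(prev_frame, frame, min_pixels=40):
            """Flag a frame pair as a candidate transient (meteor-like) event."""
            return int(changed_pixels(prev_frame, frame).sum()) >= min_pixels

        # Hypothetical 8-bit frames: a dark sky, then a bright streak of 60 pixels
        prev_frame = np.zeros((480, 640), dtype=np.uint8)
        frame = prev_frame.copy()
        frame[200, 100:160] = 200
        print(looks_like_event(prev_frame, frame))   # True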

  11. Plenario: An Open Data Discovery and Exploration Platform for Urban Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catlett, Charlie; Malik, Tanu; Goldstein, Brett J.

    2014-12-01

    The past decade has seen the widespread release of open data concerning city services, conditions, and activities by government bodies and public institutions of all sizes. Hundreds of open data portals now host thousands of datasets of many different types. These new data sources represent enormous potential for improved understanding of urban dynamics and processes—and, ultimately, for more livable, efficient, and prosperous communities. However, those who seek to realize this potential quickly discover that discovering and applying the data relevant to any particular question can be extraordinarily difficult, due to decentralized storage, heterogeneous formats, and poor documentation. In this context, we introduce Plenario, a platform designed to automate time-consuming tasks associated with the discovery, exploration, and application of open city data—and, in so doing, reduce barriers to data use for researchers, policymakers, service providers, journalists, and members of the general public. Key innovations include a geospatial data warehouse that allows data from many sources to be registered into a common spatial and temporal frame; simple and intuitive interfaces that permit rapid discovery and exploration of data subsets pertaining to a particular area and time, regardless of type and source; easy export of such data subsets for further analysis; a user-configurable data ingest framework for automated importing and periodic updating of new datasets into the data warehouse; cloud hosting for elastic scaling and rapid creation of new Plenario instances; and an open source implementation to enable community contributions. We describe here the architecture and implementation of the Plenario platform, discuss lessons learned from its use by several communities, and outline plans for future work.
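
    For readers unfamiliar with the query style such a platform exposes, the sketch below shows how a spatio-temporal subset request might look from Python. The endpoint URL and parameter names are placeholders for illustration only and are not guaranteed to match the deployed Plenario API.

        import json
        import urllib.parse
        import urllib.request

        def fetch_subset(base_url, dataset, start, end, bbox):
            """Request one dataset restricted to a time window and a bounding box."""
            params = urllib.parse.urlencode({
                "dataset_name": dataset,          # placeholder parameter names
                "obs_date__ge": start,
                "obs_date__le": end,
                "bbox": ",".join(str(v) for v in bbox),
            })
            with urllib.request.urlopen(f"{base_url}?{params}") as resp:
                return json.load(resp)

        # Hypothetical usage against a hypothetical endpoint
        # rows = fetch_subset("https://example.org/v1/api/detail",
        #                     "crimes_2001_to_present",
        #                     "2014-01-01", "2014-02-01",
        #                     (-87.70, 41.85, -87.60, 41.90))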

  12. Open source Modeling and optimization tools for Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, S.

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state's planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  13. Incomplete Intelligence: Is the Information Sharing Environment an Effective Platform?

    DTIC Science & Technology

    2012-09-01

    ...Open Source Intelligence (OSINT), from public websites, media sources, and other unclassified events and reports. Although some of these sources do not have a direct...

  14. ProXL (Protein Cross-Linking Database): A Platform for Analysis, Visualization, and Sharing of Protein Cross-Linking Mass Spectrometry Data

    PubMed Central

    2016-01-01

    ProXL is a Web application and accompanying database designed for sharing, visualizing, and analyzing bottom-up protein cross-linking mass spectrometry data with an emphasis on structural analysis and quality control. ProXL is designed to be independent of any particular software pipeline. The import process is simplified by the use of the ProXL XML data format, which shields developers of data importers from the relative complexity of the relational database schema. The database and Web interfaces function equally well for any software pipeline and allow data from disparate pipelines to be merged and contrasted. ProXL includes robust public and private data sharing capabilities, including a project-based interface designed to ensure security and facilitate collaboration among multiple researchers. ProXL provides multiple interactive and highly dynamic data visualizations that facilitate structural-based analysis of the observed cross-links as well as quality control. ProXL is open-source, well-documented, and freely available at https://github.com/yeastrc/proxl-web-app. PMID:27302480

  15. ProXL (Protein Cross-Linking Database): A Platform for Analysis, Visualization, and Sharing of Protein Cross-Linking Mass Spectrometry Data.

    PubMed

    Riffle, Michael; Jaschob, Daniel; Zelter, Alex; Davis, Trisha N

    2016-08-05

    ProXL is a Web application and accompanying database designed for sharing, visualizing, and analyzing bottom-up protein cross-linking mass spectrometry data with an emphasis on structural analysis and quality control. ProXL is designed to be independent of any particular software pipeline. The import process is simplified by the use of the ProXL XML data format, which shields developers of data importers from the relative complexity of the relational database schema. The database and Web interfaces function equally well for any software pipeline and allow data from disparate pipelines to be merged and contrasted. ProXL includes robust public and private data sharing capabilities, including a project-based interface designed to ensure security and facilitate collaboration among multiple researchers. ProXL provides multiple interactive and highly dynamic data visualizations that facilitate structural-based analysis of the observed cross-links as well as quality control. ProXL is open-source, well-documented, and freely available at https://github.com/yeastrc/proxl-web-app .

  16. Towards Networked Knowledge: The Learning Registry, an Infrastructure for Sharing Online Learning Resources

    ERIC Educational Resources Information Center

    Lee, Ashley; Hobson, Joe; Bienkowski, Marie; Midgley, Steve; Currier, Sarah; Campbell, Lorna M.; Novoselova, Tatiana

    2012-01-01

    In this article, the authors describe an open-source, open-data digital infrastructure for sharing information about open educational resources (OERs) across disparate systems and platforms. The Learning Registry, which began as a project funded by the U.S. Departments of Education and Defense, currently has an active international community…

  17. CosmoQuest: A Cyber-Infrastructure for Crowdsourcing Planetary Surface Mapping and More

    NASA Astrophysics Data System (ADS)

    Gay, P.; Lehan, C.; Moore, J.; Bracey, G.; Gugliucci, N.

    2014-04-01

    The design and implementation of programs to crowdsource science presents a unique set of challenges to system architects, programmers, and designers. The CosmoQuest Citizen Science Builder (CSB) is an open source platform designed to take advantage of crowd computing and open source platforms to solve crowdsourcing problems in Planetary Science. CSB combines a clean user interface with a powerful back end to allow the quick design and deployment of citizen science sites that meet the needs of both the random Joe Public and the detail-driven Albert Professional. In this talk, we give an overview of the software and discuss the results of usability testing and accuracy testing with both citizen and professional scientists.

  18. The EDRN knowledge environment: an open source, scalable informatics platform for biological sciences research

    NASA Astrophysics Data System (ADS)

    Crichton, Daniel; Mahabal, Ashish; Anton, Kristen; Cinquini, Luca; Colbert, Maureen; Djorgovski, S. George; Kincaid, Heather; Kelly, Sean; Liu, David

    2017-05-01

    We describe here the Early Detection Research Network (EDRN) for Cancer's knowledge environment. It is an open source platform built by NASA's Jet Propulsion Laboratory with contributions from the California Institute of Technology and the Geisel School of Medicine at Dartmouth. It uses tools like Apache OODT, Plone, and Solr, and borrows heavily from the ontological infrastructure of JPL's Planetary Data System. It has accumulated data on hundreds of thousands of biospecimens and serves over 1300 registered users across the National Cancer Institute (NCI). The scalable computing infrastructure is built so that we can reach out to other agencies, provide homogeneous access, and deliver seamless analytics support and bioinformatics tools through community engagement.

  19. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    NASA Technical Reports Server (NTRS)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen; hide

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  20. Efficient Comparison between Windows and Linux Platform Applicable in a Virtual Architectural Walkthrough Application

    NASA Astrophysics Data System (ADS)

    Thubaasini, P.; Rusnida, R.; Rohani, S. M.

    This paper describes Linux, an open source platform used to develop and run a virtual architectural walkthrough application. It proposes some qualitative reflections and observations on the nature of Linux in the context of Virtual Reality (VR) and on the most popular and important claims associated with the open source approach. The ultimate goal of this paper is to measure and evaluate the performance of Linux when used to build the virtual architectural walkthrough and to develop a proof of concept based on the results obtained through this project. In addition, this study reveals the benefits of using Linux in the field of virtual reality and presents a basic comparison and evaluation between Windows- and Linux-based operating systems. The Windows platform is used as a baseline to evaluate the performance of Linux, which is measured against three main criteria: frame rate, image quality, and mouse motion.

  1. ArrayNinja: An Open Source Platform for Unified Planning and Analysis of Microarray Experiments.

    PubMed

    Dickson, B M; Cornett, E M; Ramjan, Z; Rothbart, S B

    2016-01-01

    Microarray-based proteomic platforms have emerged as valuable tools for studying various aspects of protein function, particularly in the field of chromatin biochemistry. Microarray technology itself is largely unrestricted in regard to printable material and platform design, and efficient multidimensional optimization of assay parameters requires fluidity in the design and analysis of custom print layouts. This motivates the need for streamlined software infrastructure that facilitates the combined planning and analysis of custom microarray experiments. To this end, we have developed ArrayNinja as a portable, open source, and interactive application that unifies the planning and visualization of microarray experiments and provides maximum flexibility to end users. Array experiments can be planned, stored to a private database, and merged with the imaged results for a level of data interaction and centralization that is not currently attainable with available microarray informatics tools.

  2. Increasing Flight Software Reuse with OpenSatKit

    NASA Technical Reports Server (NTRS)

    McComas, David C.

    2018-01-01

    In January 2015 the NASA Goddard Space Flight Center (GSFC) released the Core Flight System (cFS) as open source under the NASA Open Source Agreement (NOSA) license. The cFS is based on flight software (FSW) developed for 12 spacecraft spanning nearly two decades of effort and can provide about a third of the FSW functionality for a low-earth orbiting scientific spacecraft. The cFS is an FSW framework that is portable, configurable, and extendable using a product line deployment model. However, the components are maintained separately, so the user must configure, integrate, and deploy them as a cohesive functional system. This can be very challenging, especially for organizations such as universities building cubesats that have minimal experience developing FSW. Supporting universities was one of the primary motivators for releasing the cFS under NOSA. This paper describes the OpenSatKit that was developed to address the cFS deployment challenges and to serve as a cFS training platform for new users. It provides a fully functional out-of-the-box software system that includes NASA's cFS, Ball Aerospace's command and control system COSMOS, and a NASA dynamic simulator called 42. The kit is freely available since all of the components have been released as open source. The kit runs on a Linux platform, includes 8 cFS applications, several kit-specific applications, and built-in demos illustrating how to use key application features. It also includes the software necessary to port the cFS to a Raspberry Pi and instructions for configuring COSMOS to communicate with the target. All of the demos and test scripts can be rerun unchanged with the cFS running on the Raspberry Pi. The cFS uses a 3-tiered layered architecture including a platform abstraction layer, a Core Flight Executive (cFE) middle layer, and an application layer. Similar to smart phones, the cFS application layer is the key architectural feature for users to extend the FSW functionality to meet their mission-specific requirements. The platform abstraction layer and the cFE layers go a step further than smart phones by providing a platform-agnostic Application Programmer Interface (API) that allows applications to run unchanged on different platforms. OpenSatKit can serve two significant architectural roles that will further help the adoption of the cFS and help create a community of users that can share assets. First, the kit is being enhanced to automate the integration of applications with the goal of creating a virtual cFS "App Store." Second, a platform certification test suite can be developed that would allow users to verify the port of the cFS to a new platform. This paper will describe the current state of these efforts and future plans.

  3. OpenSHS: Open Smart Home Simulator.

    PubMed

    Alshammari, Nasser; Alshammari, Talal; Sedky, Mohamed; Champion, Justin; Bauer, Carolin

    2017-05-02

    This paper develops a new hybrid, open-source, cross-platform 3D smart home simulator, OpenSHS, for dataset generation. OpenSHS offers an opportunity for researchers in the field of the Internet of Things (IoT) and machine learning to test and evaluate their models. Following a hybrid approach, OpenSHS combines advantages from both interactive and model-based approaches. This approach reduces the time and effort required to generate simulated smart home datasets. We have designed a replication algorithm for extending and expanding a dataset. A small sample dataset produced by OpenSHS can be extended without affecting the logical order of the events. The replication provides a solution for generating large representative smart home datasets. We have built an extensible library of smart devices that facilitates the simulation of current and future smart home environments. Our tool divides the dataset generation process into three distinct phases: first, design, in which the researcher designs the initial virtual environment by building the home, importing smart devices and creating contexts; second, simulation, in which the participant simulates his/her context-specific events; and third, aggregation, in which the researcher applies the replication algorithm to generate the final dataset. We conducted a study to assess the ease of use of our tool on the System Usability Scale (SUS).

  4. OpenSHS: Open Smart Home Simulator

    PubMed Central

    Alshammari, Nasser; Alshammari, Talal; Sedky, Mohamed; Champion, Justin; Bauer, Carolin

    2017-01-01

    This paper develops a new hybrid, open-source, cross-platform 3D smart home simulator, OpenSHS, for dataset generation. OpenSHS offers an opportunity for researchers in the field of the Internet of Things (IoT) and machine learning to test and evaluate their models. Following a hybrid approach, OpenSHS combines advantages from both interactive and model-based approaches. This approach reduces the time and effort required to generate simulated smart home datasets. We have designed a replication algorithm for extending and expanding a dataset. A small sample dataset produced by OpenSHS can be extended without affecting the logical order of the events. The replication provides a solution for generating large representative smart home datasets. We have built an extensible library of smart devices that facilitates the simulation of current and future smart home environments. Our tool divides the dataset generation process into three distinct phases: first, design, in which the researcher designs the initial virtual environment by building the home, importing smart devices and creating contexts; second, simulation, in which the participant simulates his/her context-specific events; and third, aggregation, in which the researcher applies the replication algorithm to generate the final dataset. We conducted a study to assess the ease of use of our tool on the System Usability Scale (SUS). PMID:28468330

  5. T-Check in Technologies for Interoperability: Web Services and Security--Single Sign-On

    DTIC Science & Technology

    2007-12-01

    The prototype used the following tools: Apache Tomcat 6.0, a Java Servlet container to host the Web services and a simple Web client application [Apache 2007a]; Apache Axis; and Eclipse, an open development platform (http://www.eclipse.org/). SAML implementations referenced include Ping Identity's SourceID SAML 1.1 Java Toolkit [SourceID 2006] and OpenSAML, an open source implementation of SAML 1.1.

  6. The successes and challenges of open-source biopharmaceutical innovation.

    PubMed

    Allarakhia, Minna

    2014-05-01

    Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human capacity development challenges associated with skill development around big data usage; and the data-material access challenge associated with data and material access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. In the author's opinion, assessing when and how value creation will occur through open-source biopharmaceutical innovation is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of what are now big-data-based open-source initiatives. A continued focus on early-stage value creation alone is not advisable; instead, it would be preferable for stakeholders to transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.

  7. Implementation and performance test of cloud platform based on Hadoop

    NASA Astrophysics Data System (ADS)

    Xu, Jingxian; Guo, Jianhong; Ren, Chunlan

    2018-01-01

    Hadoop, an open source project of the Apache Foundation, is a distributed computing framework for processing large amounts of data that has been widely used in the Internet industry. It is therefore worthwhile to study how to build a Hadoop platform and how to test its performance. This paper presents a method for implementing a Hadoop platform together with a method for testing the platform's performance. Experimental results show that the proposed performance-testing method is effective and can assess the performance of a Hadoop platform.
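
    As an example of the kind of workload commonly used to exercise such a platform, the sketch below is a word-count job written for Hadoop Streaming, which lets the mapper and reducer be plain Python scripts reading stdin and writing stdout; submitted through the streaming jar with -mapper and -reducer pointing at this file, the same script serves both stages. It is an illustrative benchmark, not the test method proposed by the paper.

        #!/usr/bin/env python3
        # Hadoop Streaming word count: run with "mapper" or "reducer" as the only argument.
        import sys
        from itertools import groupby

        def mapper(lines):
            for line in lines:
                for word in line.split():
                    print(f"{word}\t1")

        def reducer(lines):
            # Streaming sorts mapper output by key, so equal words arrive adjacently.
            pairs = (line.rstrip("\n").split("\t") for line in lines)
            for word, group in groupby(pairs, key=lambda kv: kv[0]):
                print(f"{word}\t{sum(int(count) for _, count in group)}")

        if __name__ == "__main__":
            (mapper if sys.argv[1] == "mapper" else reducer)(sys.stdin)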

  8. Fast and Efficient XML Data Access for Next-Generation Mass Spectrometry.

    PubMed

    Röst, Hannes L; Schmitt, Uwe; Aebersold, Ruedi; Malmström, Lars

    2015-01-01

    In mass spectrometry-based proteomics, XML formats such as mzML and mzXML provide an open and standardized way to store and exchange the raw data (spectra and chromatograms) of mass spectrometric experiments. These file formats are being used by a multitude of open-source and cross-platform tools which allow the proteomics community to access algorithms in a vendor-independent fashion and perform transparent and reproducible data analysis. Recent improvements in mass spectrometry instrumentation have increased the data size produced in a single LC-MS/MS measurement and put substantial strain on open-source tools, particularly those that are not equipped to deal with XML data files that reach dozens of gigabytes in size. Here we present a fast and versatile parsing library for mass spectrometric XML formats available in C++ and Python, based on the mature OpenMS software framework. Our library implements an API for obtaining spectra and chromatograms under memory constraints using random access or sequential access functions, allowing users to process datasets that are much larger than system memory. For fast access to the raw data structures, small XML files can also be completely loaded into memory. In addition, we have improved the parsing speed of the core mzML module by over 4-fold (compared to OpenMS 1.11), making our library suitable for a wide variety of algorithms that need fast access to dozens of gigabytes of raw mass spectrometric data. Our C++ and Python implementations are available for the Linux, Mac, and Windows operating systems. All proposed modifications to the OpenMS code have been merged into the OpenMS mainline codebase and are available to the community at https://github.com/OpenMS/OpenMS.
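
    A brief sketch of how the Python bindings can be used is given below. It assumes the pyopenms package is installed and that mzML files are available; the on-disc access class shown (OnDiscMSExperiment) reflects the authors' description of random access under memory constraints, but the exact class and method names should be checked against the pyopenms documentation for the installed version.

        import pyopenms  # Python bindings for the OpenMS framework

        # In-memory access: loads the whole (small) file at once.
        exp = pyopenms.MSExperiment()
        pyopenms.MzMLFile().load("small_run.mzML", exp)
        print("spectra in memory:", exp.getNrSpectra())

        # Indexed, on-disc access for files larger than system memory
        # (class and method names assumed from the OpenMS documentation).
        od_exp = pyopenms.OnDiscMSExperiment()
        od_exp.openFile("large_run.mzML")
        first = od_exp.getSpectrum(0)          # random access to a single spectrum
        mz, intensity = first.get_peaks()
        print("peaks in first spectrum:", len(mz))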

  9. Fast and Efficient XML Data Access for Next-Generation Mass Spectrometry

    PubMed Central

    Röst, Hannes L.; Schmitt, Uwe; Aebersold, Ruedi; Malmström, Lars

    2015-01-01

    Motivation In mass spectrometry-based proteomics, XML formats such as mzML and mzXML provide an open and standardized way to store and exchange the raw data (spectra and chromatograms) of mass spectrometric experiments. These file formats are being used by a multitude of open-source and cross-platform tools which allow the proteomics community to access algorithms in a vendor-independent fashion and perform transparent and reproducible data analysis. Recent improvements in mass spectrometry instrumentation have increased the data size produced in a single LC-MS/MS measurement and put substantial strain on open-source tools, particularly those that are not equipped to deal with XML data files that reach dozens of gigabytes in size. Results Here we present a fast and versatile parsing library for mass spectrometric XML formats available in C++ and Python, based on the mature OpenMS software framework. Our library implements an API for obtaining spectra and chromatograms under memory constraints using random access or sequential access functions, allowing users to process datasets that are much larger than system memory. For fast access to the raw data structures, small XML files can also be completely loaded into memory. In addition, we have improved the parsing speed of the core mzML module by over 4-fold (compared to OpenMS 1.11), making our library suitable for a wide variety of algorithms that need fast access to dozens of gigabytes of raw mass spectrometric data. Availability Our C++ and Python implementations are available for the Linux, Mac, and Windows operating systems. All proposed modifications to the OpenMS code have been merged into the OpenMS mainline codebase and are available to the community at https://github.com/OpenMS/OpenMS. PMID:25927999

  10. A multi-purpose open-source triggering platform for magnetic resonance

    NASA Astrophysics Data System (ADS)

    Ruytenberg, T.; Webb, A. G.; Beenakker, J. W. M.

    2014-10-01

    Many MR scans need to be synchronised with external events such as the cardiac or respiratory cycles. For common physiological functions commercial trigger equipment exists, but for more experimental inputs these are not available. This paper describes the design of a multi-purpose open-source trigger platform for MR systems. The heart of the system is an open-source Arduino Due microcontroller. This microcontroller samples an analogue input and digitally processes these data to determine the trigger. The output of the microcontroller is programmed to mimic a physiological signal which is fed into the electrocardiogram (ECG) or pulse oximeter port of MR scanner. The microcontroller is connected to a Bluetooth dongle that allows wireless monitoring and control outside the scanner room. This device can be programmed to generate a trigger based on various types of input. As one example, this paper describes how it can be used as an acoustic cardiac triggering unit. For this, a plastic stethoscope is connected to a microphone which is used as an input for the system. This test setup was used to acquire retrospectively-triggered cardiac scans in ten volunteers. Analysis showed that this platform produces a reliable trigger (>99% triggers are correct) with a small average 8 ms variation between the exact trigger points.
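
    The core of such a triggering unit is a simple decision rule applied to the sampled input. The Python sketch below simulates that rule offline on an already-digitised signal: it fires a trigger when the rectified signal crosses a threshold, with a refractory period so one heart sound does not produce multiple triggers. It is an illustration of the concept, not the firmware running on the Arduino Due.

        import numpy as np

        def detect_triggers(signal, fs, threshold, refractory_s=0.3):
            """Return sample indices where the rectified signal first crosses the threshold."""
            envelope = np.abs(signal)
            refractory = int(refractory_s * fs)
            triggers, last = [], -refractory
            for i, value in enumerate(envelope):
                if value > threshold and (i - last) >= refractory:
                    triggers.append(i)
                    last = i
            return triggers

        # Hypothetical 4-second recording at 1 kHz with two loud heart sounds
        fs = 1000
        t = np.arange(0, 4, 1 / fs)
        signal = 0.05 * np.random.randn(t.size)
        signal[800:820] += 1.0     # first sound
        signal[1800:1820] += 1.0   # second sound
        print(detect_triggers(signal, fs, threshold=0.5))   # approximately [800, 1800]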

  11. A multi-purpose open-source triggering platform for magnetic resonance.

    PubMed

    Ruytenberg, T; Webb, A G; Beenakker, J W M

    2014-10-01

    Many MR scans need to be synchronised with external events such as the cardiac or respiratory cycles. For common physiological functions commercial trigger equipment exists, but for more experimental inputs these are not available. This paper describes the design of a multi-purpose open-source trigger platform for MR systems. The heart of the system is an open-source Arduino Due microcontroller. This microcontroller samples an analogue input and digitally processes these data to determine the trigger. The output of the microcontroller is programmed to mimic a physiological signal which is fed into the electrocardiogram (ECG) or pulse oximeter port of MR scanner. The microcontroller is connected to a Bluetooth dongle that allows wireless monitoring and control outside the scanner room. This device can be programmed to generate a trigger based on various types of input. As one example, this paper describes how it can be used as an acoustic cardiac triggering unit. For this, a plastic stethoscope is connected to a microphone which is used as an input for the system. This test setup was used to acquire retrospectively-triggered cardiac scans in ten volunteers. Analysis showed that this platform produces a reliable trigger (>99% triggers are correct) with a small average 8 ms variation between the exact trigger points.

  12. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    PubMed

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
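
    In practice, a Jenkins-CI build step for such a pipeline often reduces to a thin wrapper that launches the image-analysis tool in headless mode on a batch of images. The sketch below is a generic illustration of that pattern; the CellProfiler command-line flags shown are assumptions and should be checked against the CellProfiler version in use.

        import subprocess
        from pathlib import Path

        def run_headless_pipeline(pipeline, image_dir, output_dir):
            """Launch an image-processing pipeline in headless mode as one build step."""
            Path(output_dir).mkdir(parents=True, exist_ok=True)
            cmd = [
                "cellprofiler",        # assumed executable name on the cluster PATH
                "-c", "-r",            # assumed flags: no GUI, run the pipeline
                "-p", pipeline,
                "-i", str(image_dir),
                "-o", str(output_dir),
            ]
            return subprocess.run(cmd, check=True)

        # Hypothetical invocation from a Jenkins job workspace
        # run_headless_pipeline("pipelines/nuclei_count.cppipe", "images/plate_01", "results/plate_01")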

  13. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    PubMed Central

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  14. Rhamnogalacturonan-I Based Microcapsules for Targeted Drug Release

    PubMed Central

    Kusic, Anja; De Gobba, Cristian; Larsen, Flemming H.; Sassene, Philip; Zhou, Qi; van de Weert, Marco; Mullertz, Anette; Jørgensen, Bodil; Ulvskov, Peter

    2016-01-01

    Drug targeting to the colon via the oral administration route for local treatment of e.g. inflammatory bowel disease and colonic cancer has several advantages such as needle-free administration and low infection risk. A new option for delivery is plant-polysaccharide-based delivery platforms such as Rhamnogalacturonan-I (RG-I). In the gastro-intestinal tract, RG-I is degraded only by the action of the colonic microflora. For assessment of potential drug delivery properties, RG-I based microcapsules (~1 μm in diameter) were prepared by an interfacial poly-addition reaction. The cross-linked capsules were loaded with a fluorescent dye (model drug). The capsules showed negligible and very little in vitro release when subjected to media simulating gastric and intestinal fluids, respectively. However, upon exposure to a cocktail of commercial RG-I cleaving enzymes, ~9 times higher release was observed, demonstrating that the capsules can be opened by enzymatic degradation. The combined results suggest a potential platform for targeted drug delivery in the terminal gastro-intestinal tract. PMID:27992455

  15. BEARS: a multi-mission anomaly response system

    NASA Astrophysics Data System (ADS)

    Roberts, Bryce A.

    2009-05-01

    The Mission Operations Group at UC Berkeley's Space Sciences Laboratory operates a highly automated ground station and presently a fleet of seven satellites, each with its own associated command and control console. However, the requirement for prompt anomaly detection and resolution is common to the ground segment and all spacecraft. The efficient, low-cost operation and "lights-out" staffing of the Mission Operations Group require that controllers and engineers be notified of spacecraft and ground system problems around the clock. The Berkeley Emergency Anomaly and Response System (BEARS) is an in-house developed web- and paging-based software system that meets this need. BEARS was developed as a replacement for an existing emergency reporting software system that was too closed-source, platform-specific, expensive, and antiquated to expand or maintain. To avoid these limitations, the new system design leverages cross-platform, open-source software products such as MySQL, PHP, and Qt. Anomaly notifications and responses make use of the two-way paging capabilities of modern smart phones.

  16. The Prodiguer Messaging Platform

    NASA Astrophysics Data System (ADS)

    Greenslade, Mark; Denvil, Sebastien; Raciazek, Jerome; Carenton, Nicolas; Levavasseur, Guillame

    2014-05-01

    CONVERGENCE is a French multi-partner national project designed to gather HPC and informatics expertise to innovate in the context of running French climate models with differing grids and at differing resolutions. Efficient and reliable execution of these models and the management and dissemination of model output (data and meta-data) are just some of the complexities that CONVERGENCE aims to resolve. The Institut Pierre Simon Laplace (IPSL) is responsible for running climate simulations on a set of heterogeneous HPC environments within France. With heterogeneity comes added complexity in terms of simulation instrumentation and control. Obtaining a global perspective on the state of all simulations running across all HPC environments has hitherto been problematic. In this presentation we detail how, within the context of CONVERGENCE, the implementation of the Prodiguer messaging platform resolves this complexity and permits the development of real-time applications such as: 1. a simulation monitoring dashboard; 2. a simulation metrics visualizer; 3. an automated simulation runtime notifier; 4. an automated output data & meta-data publishing pipeline. The Prodiguer messaging platform leverages widely used open source message broker software called RabbitMQ. RabbitMQ itself implements the Advanced Message Queuing Protocol (AMQP). Hence it will be demonstrated that the Prodiguer messaging platform is built upon both open source and open standards.
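
    To give a concrete feel for the messaging layer, the sketch below publishes a small simulation-status message to a RabbitMQ broker using the widely used pika client library. The exchange and routing-key names are invented for illustration and are not the Prodiguer conventions.

        import json
        import pika  # pure-Python RabbitMQ (AMQP 0-9-1) client

        def publish_status(host, simulation_id, state):
            """Publish one simulation-status message to a topic exchange."""
            connection = pika.BlockingConnection(pika.ConnectionParameters(host=host))
            channel = connection.channel()
            channel.exchange_declare(exchange="simulation.status", exchange_type="topic")
            channel.basic_publish(
                exchange="simulation.status",
                routing_key=f"simulation.{state}",        # e.g. simulation.running
                body=json.dumps({"simulation_id": simulation_id, "state": state}),
            )
            connection.close()

        # Hypothetical usage on a compute node
        # publish_status("broker.example.org", "ipsl-cm-hist-r1i1p1", "running")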

  17. Applying open source data visualization tools to standard based medical data.

    PubMed

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

    Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The differing backgrounds of patients, especially elderly people, call for simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The use of standard-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.

  18. The dilemma of gaps in carbonate stratigraphic sequences: A case history from the Jurassic of the Venetian Alps, Italy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zempolich, W.G.

    The Lower Ammonitico Rosso (L.A.R.) of the western Venetian Alps is a thin, red nodular limestone that overlies a thick Late Triassic to Middle Jurassic (Aalenian) shallow-water platform sequence. It is thought to represent a Middle Jurassic (Bajocian) drowning event whereby the Trento Platform became a submerged plateau. The L.A.R. is problematic in that it (1) is rich in ammonites and other pelagic fauna; (2) contains stromatolites, oncolites, and shallow-water sedimentary structures; and (3) directly overlies platform sediments that contain complex brecciated fabrics filled by cement and Bajocian-age sediment. These seemingly contradictory features have generated much debate as to the sequence stratigraphy of the Trento Platform. New evidence suggests that the L.A.R. was deposited in a shallow-water environment following a period of subaerial exposure. In an east-west transect from the platform edge to platform interior, a clear transition is exhibited from high- and low-energy open-marine facies to restricted lagoonal facies. High-energy open-marine facies include well-sorted skeletal- and peloidal-rich sands possessing low-angle to planar cross stratification and thin, fairly sorted ammonite- and belemnite-rich gravels. Sands include both shallow-water and pelagic fauna; gravels commonly contain ripup clasts from underlying sediments. Low-energy open-marine facies consist of buff colored wackestones and packstones that contain ammonites, belemnites, pelagic bivalves, and peloids. With respect to paleogeography, the abrupt transition from open-marine facies at the platform edge to lagoonal facies in the platform interior indicates that thrombolites, stromatolites, and oncolites accumulated in a shallow restricted environment.

  19. The digital code driven autonomous synthesis of ibuprofen automated in a 3D-printer-based robot.

    PubMed

    Kitson, Philip J; Glatzel, Stefan; Cronin, Leroy

    2016-01-01

    An automated synthesis robot was constructed by modifying an open source 3D printing platform. The resulting automated system was used to 3D print reaction vessels (reactionware) of differing internal volumes using polypropylene feedstock via a fused deposition modeling 3D printing approach and subsequently make use of these fabricated vessels to synthesize the nonsteroidal anti-inflammatory drug ibuprofen via a consecutive one-pot three-step approach. The synthesis of ibuprofen could be achieved on different scales simply by adjusting the parameters in the robot control software. The software for controlling the synthesis robot was written in the Python programming language and hard-coded for the synthesis of ibuprofen by the method described, opening possibilities for the sharing of validated synthetic 'programs' which can run on similar low-cost, user-constructed robotic platforms towards an 'open-source' regime in the area of chemical synthesis.
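
    Because the robot is built on a fused-deposition 3D-printer platform, its motions and dispensing steps are ultimately driven by printer-style G-code commands. The sketch below shows, purely as an illustration and not the authors' control software, how a Python script might stream a few such commands to printer firmware over a serial port using the pyserial package; the port name, baud rate, and commands are assumptions.

        import time
        import serial  # pyserial

        def send_gcode(port, commands, baud=115200):
            """Send a list of G-code lines to printer-style firmware, one per line."""
            with serial.Serial(port, baud, timeout=2) as link:
                time.sleep(2)                       # give the controller time to reset
                for command in commands:
                    link.write((command + "\n").encode("ascii"))
                    reply = link.readline().decode("ascii", errors="replace").strip()
                    print(f"{command} -> {reply}")  # firmware typically answers "ok"

        # Hypothetical home, move, and pause sequence over a hypothetical port
        # send_gcode("/dev/ttyUSB0", ["G28", "G1 X40 Y40 F3000", "G4 P5000"])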

  20. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    PubMed

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
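
    Conceptually, the core operation is a set intersection between the user's gene list and each published dataset. A minimal sketch of that idea in Python is shown below; the dataset names and gene symbols are made up for illustration.

        def cross_reference(user_genes, published_datasets):
            """Return, per dataset, the user's genes that also appear in that dataset."""
            user_set = {g.upper() for g in user_genes}          # normalise symbol case
            return {
                name: sorted(user_set & {g.upper() for g in genes})
                for name, genes in published_datasets.items()
            }

        # Hypothetical screen hit list and two hypothetical published datasets
        hits = ["RIPK1", "TP53", "MAP3K7"]
        datasets = {
            "RNAi_screen_A": ["TP53", "MYC", "RIPK1"],
            "phosphoproteome_B": ["MAP3K7", "AKT1"],
        }
        for name, overlap in cross_reference(hits, datasets).items():
            print(name, overlap)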

  1. Tatool: a Java-based open-source programming framework for psychological studies.

    PubMed

    von Bastian, Claudia C; Locher, André; Ruflin, Michael

    2013-03-01

    Tatool (Training and Testing Tool) was developed to assist researchers with programming training software, experiments, and questionnaires. Tatool is Java-based, and thus is a platform-independent and object-oriented framework. The architecture was designed to meet the requirements of experimental designs and provides a large number of predefined functions that are useful in psychological studies. Tatool comprises features crucial for training studies (e.g., configurable training schedules, adaptive training algorithms, and individual training statistics) and allows for running studies online via Java Web Start. The accompanying "Tatool Online" platform provides the possibility to manage studies and participants' data easily with a Web-based interface. Tatool is published open source under the GNU Lesser General Public License, and is available at www.tatool.ch.

  2. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequately documenting astronomical software for adaptive optics simulators can complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions of what is implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation capabilities, these are often insufficient for describing a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LATEX, Mercurial, Doxygen, and Perl. Using a Perl script that translates the MATLAB comments in M-files into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
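
    The key trick, translating MATLAB '%' comments into a C-like form that Doxygen can parse, can be illustrated in a few lines. The original framework uses a Perl script; the version below re-expresses the same idea in Python with simplified rules and is only a sketch of the approach, not the authors' script.

        import re

        def matlab_comments_to_c(source):
            """Rewrite MATLAB '%' comment lines as '///' lines so Doxygen can read them.

            Simplified: only whole-line comments are handled; '%%' cell markers and
            trailing comments after code are left untouched.
            """
            out = []
            for line in source.splitlines():
                m = re.match(r"^(\s*)%(?!%)\s?(.*)$", line)
                out.append(f"{m.group(1)}/// {m.group(2)}" if m else line)
            return "\n".join(out)

        demo = "\n".join([
            "% FOO  Compute the Strehl ratio of a wavefront.",
            "%   r = FOO(phase) returns a scalar between 0 and 1.",
            "function r = foo(phase)",
            "r = exp(-var(phase(:)));  % Marechal approximation",
            "end",
        ])
        print(matlab_comments_to_c(demo))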

  3. The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science

    PubMed Central

    Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo

    2008-01-01

    The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570

  4. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by a high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment among the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multi-centre clinical trial. Using Drupal, open source software distributed under the terms of the General Public License, we developed a web-based, multi-centre clinical trial management system following the design science research approach. The system was evaluated through user testing, has supported several completed and ongoing clinical trials well, and is available for free download. Open source clinical trial management systems are capable of supporting multi-centre clinical trials by enhancing efficiency, the quality of data management, and collaboration.

  5. Low-cost, efficient wireless intelligent sensors (LEWIS) measuring real-time reference-free dynamic displacements

    NASA Astrophysics Data System (ADS)

    Ozdagli, A. I.; Liu, B.; Moreu, F.

    2018-07-01

    According to railroad managers, the displacement of railroad bridges under service loads is an important parameter in condition assessment and performance evaluation. However, measuring bridge responses in the field is often costly and labor-intensive. This paper proposes a low-cost, efficient wireless intelligent sensor (LEWIS) platform that can compute, in real time, the dynamic transverse displacements of railroad bridges under service loads. The sensing platform is built on the open-source Arduino ecosystem and combines low-cost microcontrollers with affordable accelerometers and wireless transmission modules. The proposed LEWIS system is designed to reconstruct dynamic displacements from acceleration measurements onboard, eliminating the need for offline post-processing, and to transmit the data in real time to a base station, where an inspector at the bridge can see the displacements while a train is crossing, or to a remote office over the internet if so desired. Researchers validated the effectiveness of the new LEWIS by conducting a series of laboratory experiments. A shake table setup simulated transverse bridge displacements measured in the field and excited the proposed platform, an expensive commercially available wired accelerometer, and a reference LVDT displacement sensor. The responses obtained from the wireless system were compared to the displacements reconstructed from the commercial accelerometer readings and to the reference LVDT. The results of the laboratory experiments demonstrate that the proposed system is capable of accurately reconstructing the transverse displacements of railroad bridges under revenue service traffic and of transmitting the data wirelessly in real time. In conclusion, the platform presented in this paper can be used for cost-effective and accurate performance assessment of the railroad bridge network. Future work includes collecting real-time reference-free displacements of a railroad bridge in Colorado under train crossings to further prove LEWIS' suitability for engineering applications.
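
    A common way to reconstruct dynamic displacement from acceleration, and a reasonable mental model for what such a platform computes onboard, is to high-pass filter the acceleration and integrate it twice, filtering again to suppress drift. The sketch below illustrates this generic procedure with SciPy; it is not the specific algorithm implemented in LEWIS.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def accel_to_displacement(accel, fs, cutoff_hz=0.5):
            """Double-integrate acceleration (m/s^2) into displacement (m) with drift removal."""
            b, a = butter(2, cutoff_hz / (fs / 2), btype="highpass")
            dt = 1.0 / fs
            velocity = np.cumsum(filtfilt(b, a, accel)) * dt
            displacement = np.cumsum(filtfilt(b, a, velocity)) * dt
            return filtfilt(b, a, displacement)

        # Hypothetical 1.5 Hz transverse vibration sampled at 200 Hz
        fs = 200
        t = np.arange(0, 10, 1 / fs)
        true_disp = 0.002 * np.sin(2 * np.pi * 1.5 * t)                 # 2 mm amplitude
        accel = -(2 * np.pi * 1.5) ** 2 * true_disp                      # analytic second derivative
        estimate = accel_to_displacement(accel, fs)
        print(f"peak displacement ~ {np.max(np.abs(estimate)) * 1000:.2f} mm")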

  6. Development of an Open Source, Air-Deployable Weather Station

    NASA Astrophysics Data System (ADS)

    Krejci, A.; Lopez Alcala, J. M.; Nelke, M.; Wagner, J.; Udell, C.; Higgins, C. W.; Selker, J. S.

    2017-12-01

    We created a packaged weather station intended to be deployed in the air on tethered systems. The device incorporates lightweight sensors and parts and runs for up to 24 hours on lithium polymer batteries, allowing the entire package to be supported by a thin fiber. Because the fiber does not provide a stable platform, attitude data (pitch and roll) are recorded alongside typical weather parameters (e.g. temperature, pressure, humidity, wind speed, and wind direction) using an embedded inertial measurement unit. All designs are open sourced, including electronics, CAD drawings, and descriptions of assembly, and can be found on the OPEnS lab website at http://www.open-sensing.org/lowcost-weather-station/. The Openly Published Environmental Sensing Lab (OPEnS: Open-Sensing.org) expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. New OPEnS labs are now being established in India, France, Switzerland, the Netherlands, and Ghana.

  7. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility.

    PubMed

    Jaschob, Daniel; Riffle, Michael

    2012-07-30

    Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. JobCenter is a client-server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or "in the cloud") and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.
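
    JobCenter itself is written in Java, but the client-driven communication pattern described above (workers poll the server, so they can run behind firewalls and balance load implicitly) can be sketched independently of the implementation. A minimal Python polling worker; the endpoint paths, JSON fields and job format below are hypothetical illustrations, not JobCenter's actual API:

```python
import subprocess
import time

import requests  # third-party HTTP client

SERVER = "https://jobs.example.org/api"  # hypothetical job-queue endpoint

def run_job(job):
    """Execute one job; a multistep workflow would iterate over job['steps']."""
    return subprocess.run(job["command"], shell=True, capture_output=True, text=True)

def worker_loop(poll_interval=10):
    while True:
        # Client-driven: the worker initiates every request, so it can sit
        # behind a firewall or in the cloud with no inbound connectivity.
        resp = requests.get(f"{SERVER}/next-job", params={"types": "blast,align"})
        if resp.status_code == 204:  # no work available right now
            time.sleep(poll_interval)
            continue
        job = resp.json()
        result = run_job(job)
        requests.post(f"{SERVER}/jobs/{job['id']}/result",
                      json={"exit_code": result.returncode, "stdout": result.stdout})

if __name__ == "__main__":
    worker_loop()
```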

  8. Breaking Away

    ERIC Educational Resources Information Center

    Panettieri, Joseph C.

    2007-01-01

    This article discusses open source projects which may free universities from expensive, rigid commercial software. But will the rewards outweigh the potential risks? The Kuali Project involves multiple universities writing and sharing code for their financial and operational systems. Another, the Sakai Project, is a community source platform for…

  9. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    NASA Astrophysics Data System (ADS)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.
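
    One consequence worth illustrating is that the rewrite kept most of the public pipeline API, so existing user code continues to run on the new OpenGL backend. The classic source-mapper-actor-renderer pipeline in VTK's Python bindings looks like this (a generic usage sketch, not code from the paper):

```python
import vtk

# Classic VTK pipeline: source -> mapper -> actor -> renderer -> render window
sphere = vtk.vtkSphereSource()
sphere.SetThetaResolution(64)
sphere.SetPhiResolution(64)

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(sphere.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
renderer.SetBackground(0.1, 0.1, 0.2)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)

interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)

window.Render()
interactor.Start()
```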

  10. BioSmalltalk: a pure object system and library for bioinformatics.

    PubMed

    Morales, Hernán F; Giovambattista, Guillermo

    2013-09-15

    We have developed BioSmalltalk, a new environment system for pure object-oriented bioinformatics programming. Adaptive end-user programming systems tend to become more important for discovering biological knowledge, as is demonstrated by the emergence of open-source programming toolkits for bioinformatics in the past years. Our software is intended to bridge the gap between bioscientists and rapid software prototyping while preserving the possibility of scaling to whole-system biology applications. BioSmalltalk performs better in terms of execution time and memory usage than Biopython and BioPerl for some classical situations. Availability: BioSmalltalk is cross-platform and freely available (MIT license) through Google Project Hosting at http://code.google.com/p/biosmalltalk. Contact: hernan.morales@gmail.com. Supplementary data are available at Bioinformatics online.

  11. The Bio-Community Perl toolkit for microbial ecology.

    PubMed

    Angly, Florent E; Fields, Christopher J; Tyson, Gene W

    2014-07-01

    The development of bioinformatic solutions for microbial ecology in Perl is limited by the lack of modules to represent and manipulate microbial community profiles from amplicon and meta-omics studies. Here we introduce Bio-Community, an open-source, collaborative toolkit that extends BioPerl. Bio-Community interfaces with commonly used programs using various file formats, including BIOM, and provides operations such as rarefaction and taxonomic summaries. Bio-Community will help bioinformaticians to quickly piece together custom analysis pipelines and develop novel software. Availability and implementation: Bio-Community is cross-platform Perl code available from http://search.cpan.org/dist/Bio-Community under the Perl license. A readme file describes software installation and how to contribute. © The Author 2014. Published by Oxford University Press.

  12. InDEx: Open Source iOS and Android Software for Self-Reporting and Monitoring of Alcohol Consumption.

    PubMed

    Leightley, Daniel; Puddephatt, Jo-Anne; Goodwin, Laura; Rona, Roberto; Fear, Nicola T

    2018-03-23

    InDEx is a software package for reporting and monitoring alcohol consumption via a smartphone application. Consumption of alcohol is self-reported by the user, and the app provides a visual representation of drinking behaviour and offers feedback on consumption levels compared to the general population. InDEx is intended as an exemplar app, operating as a standalone smartphone application, and is highly customisable for a variety of research domains. InDEx is written in JavaScript using the Ionic framework, is cross-platform, and is available under the liberal GNU General Public License (v3). The software is available from GitHub (https://github.com/DrDanL/index-app-public).

  13. InDEx: Open Source iOS and Android Software for Self-Reporting and Monitoring of Alcohol Consumption

    PubMed Central

    Leightley, Daniel; Puddephatt, Jo-Anne; Goodwin, Laura; Rona, Roberto; Fear, Nicola T.

    2018-01-01

    InDEx is a software package for reporting and monitoring alcohol consumption via a smartphone application. Consumption of alcohol is self-reported by the user, and the app provides a visual representation of drinking behaviour and offers feedback on consumption levels compared to the general population. InDEx is intended as an exemplar app, operating as a standalone smartphone application, and is highly customisable for a variety of research domains. InDEx is written in JavaScript using the Ionic framework, is cross-platform, and is available under the liberal GNU General Public License (v3). The software is available from GitHub (https://github.com/DrDanL/index-app-public). PMID:29795769

  14. Using CamiTK for rapid prototyping of interactive computer assisted medical intervention applications.

    PubMed

    Promayon, Emmanuel; Fouard, Céline; Bailet, Mathieu; Deram, Aurélien; Fiard, Gaëlle; Hungr, Nikolai; Luboz, Vincent; Payan, Yohan; Sarrazin, Johan; Saubat, Nicolas; Selmi, Sonia Yuki; Voros, Sandrine; Cinquin, Philippe; Troccaz, Jocelyne

    2013-01-01

    Computer Assisted Medical Intervention (CAMI hereafter) is a complex multi-disciplinary field. CAMI research requires the collaboration of experts in several fields as diverse as medicine, computer science, mathematics, instrumentation, signal processing, mechanics, modeling, automatics, optics, etc. CamiTK is a modular framework that helps researchers and clinicians collaborate in order to prototype CAMI applications by bringing together the knowledge and expertise of each discipline. It is an open-source, cross-platform, generic and modular tool written in C++ which can handle medical images, surgical navigation, biomedical simulations and robot control. This paper presents the Computer Assisted Medical Intervention ToolKit (CamiTK) and how it is used in various applications in our research team.

  15. Plenario: A Spatio-Temporal Platform for Discovery and Exploration of Urban Science Data

    NASA Astrophysics Data System (ADS)

    Engler, W. H.; Malik, T.; Catlett, C.; Foster, I.; Goldstein, B.

    2015-12-01

    The past decade has seen the widespread release of open data concerning city services, conditions, and activities by government bodies and public institutions of all sizes. Hundreds of open data portals now host thousands of datasets of many different types. These new data sources represent enormous potential for improved understanding of urban dynamics and processes and, ultimately, for more livable, efficient, and prosperous communities. However, those who seek to realize this potential quickly find that discovering and applying the data relevant to any particular question can be extraordinarily difficult, due to decentralized storage, heterogeneous formats, and poor documentation. In this context, we introduce Plenario, a platform designed to automate time-consuming tasks associated with the discovery, exploration, and application of open city data and, in so doing, to reduce barriers to data use for researchers, policymakers, service providers, journalists, and members of the general public. Key innovations include a geospatial data warehouse that allows data from many sources to be registered into a common spatial and temporal frame; simple and intuitive interfaces that permit rapid discovery and exploration of data subsets pertaining to a particular area and time, regardless of type and source; easy export of such data subsets for further analysis; a user-configurable data ingest framework for automated importing and periodic updating of new datasets into the data warehouse; cloud hosting for elastic scaling and rapid creation of new Plenario instances; and an open source implementation to enable community contributions. We describe here the architecture and implementation of the Plenario platform, discuss lessons learned from its use by several communities, and outline plans for future work.
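
    Plenario exposes its spatio-temporal warehouse through a web API, so a typical discovery-and-export interaction reduces to an HTTP query filtered by area and time window. The Python sketch below is illustrative only: the endpoint path and parameter names are assumptions made for the example, not documented Plenario calls.

```python
import requests

BASE = "http://plenar.io/v1/api"  # assumed base URL, for illustration only

# Hypothetical query: records from one dataset within a bounding polygon and date range
params = {
    "dataset_name": "crimes_2001_to_present",  # assumed dataset identifier
    "obs_date__ge": "2015-01-01",
    "obs_date__le": "2015-03-31",
    "location_geom__within": (
        '{"type":"Polygon","coordinates":[[[-87.7,41.8],[-87.7,41.9],'
        '[-87.6,41.9],[-87.6,41.8],[-87.7,41.8]]]}'
    ),
}
resp = requests.get(f"{BASE}/detail", params=params, timeout=30)
resp.raise_for_status()
records = resp.json().get("objects", [])
print(f"retrieved {len(records)} records for further analysis")
```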

  16. VideoHacking: Automated Tracking and Quantification of Locomotor Behavior with Open Source Software and Off-the-Shelf Video Equipment.

    PubMed

    Conklin, Emily E; Lee, Kathyann L; Schlabach, Sadie A; Woods, Ian G

    2015-01-01

    Differences in nervous system function can result in differences in behavioral output. Measurements of animal locomotion enable the quantification of these differences. Automated tracking of animal movement is less labor-intensive and bias-prone than direct observation, and allows for simultaneous analysis of multiple animals, high spatial and temporal resolution, and data collection over extended periods of time. Here, we present a new video-tracking system built on Python-based software that is free, open source, and cross-platform, and that can analyze video input from widely available video capture devices such as smartphone cameras and webcams. We validated this software through four tests on a variety of animal species, including larval and adult zebrafish (Danio rerio), Siberian dwarf hamsters (Phodopus sungorus), and wild birds. These tests highlight the capacity of our software for long-term data acquisition, parallel analysis of multiple animals, and application to animal species of different sizes and movement patterns. We applied the software to an analysis of the effects of ethanol on thigmotaxis (wall-hugging) behavior in adult zebrafish, and found that acute ethanol treatment decreased thigmotaxis behaviors without affecting overall amounts of motion. The open source nature of our software enables flexibility, customization, and scalability in behavioral analyses. Moreover, our system presents a free alternative to commercial video-tracking systems and is thus broadly applicable to a wide variety of educational settings and research programs.
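
    The abstract does not name the underlying tracking algorithm, but Python video-tracking pipelines of this kind commonly combine background subtraction with centroid extraction. A minimal sketch using OpenCV, which is an assumption here; the published software may rely on different libraries and parameters:

```python
import cv2  # OpenCV >= 4 assumed (findContours returns two values)

def track_centroids(video_path, min_area=50):
    """Yield (frame_index, x, y) centroids of moving objects (illustrative only)."""
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)   # foreground mask = moving animal(s)
        mask = cv2.medianBlur(mask, 5)   # suppress single-pixel noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) < min_area:
                continue
            m = cv2.moments(c)
            yield frame_idx, m["m10"] / m["m00"], m["m01"] / m["m00"]
        frame_idx += 1
    cap.release()

for frame, x, y in track_centroids("zebrafish.mp4"):  # placeholder file name
    print(frame, round(x, 1), round(y, 1))
```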

  17. PaaS for web applications with OpenShift Origin

    NASA Astrophysics Data System (ADS)

    Lossent, A.; Rodriguez Peon, A.; Wagner, A.

    2017-10-01

    The CERN Web Frameworks team has deployed OpenShift Origin to facilitate the deployment of web applications and to improve efficiency in terms of computing resource usage. OpenShift leverages Docker containers and Kubernetes orchestration to provide a Platform-as-a-Service solution oriented toward web applications. We will review use cases and how OpenShift was integrated with other services such as source control, web site management and authentication services.

  18. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison.

    PubMed

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. To optimize this critical, yet observer-driven, process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and has proven flexible, feasible, and useful in several distinct analytical and research applications.
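
    Quantitative interobserver ROI comparison of this kind typically rests on overlap metrics such as the Dice similarity coefficient. A short NumPy illustration of DSC on binary contour masks (a generic metric sketch, not TaCTICS source code):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary ROI masks.

    DSC = 2|A intersect B| / (|A| + |B|); 1.0 means identical contours, 0.0 no overlap.
    """
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy example: two observers' target contours on a 100 x 100 slice
obs1 = np.zeros((100, 100), dtype=bool); obs1[20:60, 20:60] = True
obs2 = np.zeros((100, 100), dtype=bool); obs2[25:65, 25:65] = True
print(f"interobserver DSC = {dice_coefficient(obs1, obs2):.3f}")
```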

  19. Increasing Flight Software Reuse with OpenSatKit

    NASA Technical Reports Server (NTRS)

    McComas, David

    2018-01-01

    In January 2015 the NASA Goddard Space Flight Center (GSFC) released the Core Flight System (cFS) as open source under the NASA Open Source Agreement (NOSA) license. The cFS is based on flight software (FSW) developed for 12 spacecraft spanning nearly two decades of effort, and it can provide about a third of the FSW functionality for a low-Earth-orbiting scientific spacecraft. The cFS is an FSW framework that is portable, configurable, and extendable using a product line deployment model. However, the components are maintained separately, so the user must configure, integrate, and deploy them as a cohesive functional system. This can be very challenging, especially for organizations such as universities building CubeSats that have minimal experience developing FSW. Supporting universities was one of the primary motivators for releasing the cFS under NOSA. This paper describes OpenSatKit, which was developed to address the cFS deployment challenges and to serve as a cFS training platform for new users. It provides a fully functional out-of-the-box software system that includes NASA's cFS, Ball Aerospace's command and control system COSMOS, and a NASA dynamic simulator called 42. The kit is freely available since all of the components have been released as open source. The kit runs on a Linux platform, includes 8 cFS applications, several kit-specific applications, and built-in demos illustrating how to use key application features. It also includes the software necessary to port the cFS to a Raspberry Pi and instructions for configuring COSMOS to communicate with the target. All of the demos and test scripts can be rerun unchanged with the cFS running on the Raspberry Pi. The cFS uses a three-tiered layered architecture comprising a platform abstraction layer, a Core Flight Executive (cFE) middle layer, and an application layer. Similar to smartphones, the cFS application layer is the key architectural feature for users to extend the FSW functionality to meet their mission-specific requirements. The platform abstraction layer and the cFE layer go a step further than smartphones by providing a platform-agnostic Application Programmer Interface (API) that allows applications to run unchanged on different platforms. OpenSatKit can serve two significant architectural roles that will further help the adoption of the cFS and help create a community of users that can share assets. First, the kit is being enhanced to automate the integration of applications with the goal of creating a virtual cFS 'App Store'. Second, a platform certification test suite can be developed that would allow users to verify the port of the cFS to a new platform. This paper will describe the current state of these efforts and future plans.

  20. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to work on a single computer, which is a major limitation in many ways, from limited processing power and storage to restricted accessibility and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards and prototype code. It demonstrates a framework for developing specialized cloud geospatial applications that require only a web browser to be used. The cloud application is the ultimate geospatial collaboration platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The application is available at all times, accessible from anywhere, scalable, runs in a distributed computing environment, provides a real-time multi-user collaboration platform, uses interoperable code and components, and is flexible in incorporating additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can serve as a foundation for developing other specialized cloud geospatial applications. Further research will focus on distributing the cloud application across additional VMs and testing the scalability and availability of the services.

  1. 78 FR 45565 - Notice Pursuant to the National Cooperative Research and Production Act of 1993 -- tranSMART...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ... activities are to enable effective sharing, integration, standardization, and analysis of heterogeneous data from collaborative translational research by mobilizing the tranSMART open- source and open-data...: (a) Establish and sustain tranSMART as the preferred data sharing and analytics platform for...

  2. Web Platform for Sharing Modeling Software in the Field of Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Dubenskaya, Julia; Kryukov, Alexander; Demichev, Andrey

    2018-02-01

    We describe the prototype of a Web platform intended for sharing software programs for computer modeling in the rapidly developing field of nonlinear optics. The suggested platform is built on top of the HUBZero open-source middleware. In addition to the basic HUBZero installation, we added to our platform the capability to run Docker containers via an external application server and to send calculation programs to those containers for execution. The presented web platform provides a wide range of features and might be of benefit to nonlinear optics researchers.
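
    The pattern described, dispatching a calculation program to a Docker container via an application server, can be sketched with the Docker SDK for Python. The image name, job directory and script name below are placeholders; the abstract does not describe the platform's actual container configuration.

```python
import docker  # Docker SDK for Python (pip install docker)

def run_calculation(script_name: str) -> str:
    """Run a user-submitted modeling script inside an isolated container (sketch)."""
    client = docker.from_env()
    output = client.containers.run(
        image="python:3.11-slim",                                    # placeholder runtime
        command=f"python /work/{script_name}",
        volumes={"/srv/hub/jobs": {"bind": "/work", "mode": "ro"}},  # assumed job directory
        remove=True,                                                 # clean up afterwards
    )
    return output.decode()

if __name__ == "__main__":
    print(run_calculation("nonlinear_optics_model.py"))              # hypothetical script
```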

  3. RADIOMICS.io | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    RADIOMICS.io is an open source platform for informatics development in radiographic phenotyping using automated algorithms, such as engineered features or deep learning technologies. With this platform, we aim to establish a reference standard for radiomic analyses, provide a tested and maintained resource, and grow the community of radiomic developers addressing critical needs in cancer research.
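
    As an example of the engineered-feature side of such analyses, the open-source pyradiomics package extracts standardized radiomic features from an image and a segmentation mask; whether it is bundled with RADIOMICS.io is not stated here, and the file paths below are placeholders.

```python
from radiomics import featureextractor  # pip install pyradiomics

# Default settings extract shape, first-order and texture feature classes
extractor = featureextractor.RadiomicsFeatureExtractor()

# Placeholder paths to a CT volume and its tumour segmentation (e.g. NRRD or NIfTI)
features = extractor.execute("patient001_ct.nrrd", "patient001_mask.nrrd")

for name, value in features.items():
    if name.startswith("original_"):  # skip the diagnostics_* metadata entries
        print(name, value)
```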

  4. WaveformECG: A Platform for Visualizing, Annotating, and Analyzing ECG Data

    PubMed Central

    Winslow, Raimond L.; Granite, Stephen; Jurado, Christian

    2017-01-01

    The electrocardiogram (ECG) is the most commonly collected data in cardiovascular research because of the ease with which it can be measured and because changes in ECG waveforms reflect underlying aspects of heart disease. Accessed through a browser, WaveformECG is an open source platform supporting interactive analysis, visualization, and annotation of ECGs. PMID:28642673

  5. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    PubMed

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.
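
    ACQ4 builds its interface and scientific graphics on PyQt and PyQtGraph, and a few lines of the same stack convey the kind of live trace display the platform relies on. This is a generic PyQtGraph sketch with simulated data, not ACQ4 code:

```python
import numpy as np
import pyqtgraph as pg

# Simulated patch-clamp sweep: exponential capacitive transient plus noise
t = np.linspace(0.0, 0.5, 5000)
current = 50e-12 * np.exp(-t / 0.02) + 5e-12 * np.random.randn(t.size)

plot = pg.plot(t, current, title="Simulated membrane current", pen="y")
plot.setLabel("bottom", "Time", units="s")
plot.setLabel("left", "Current", units="A")

if __name__ == "__main__":
    pg.exec()  # start the Qt event loop (pyqtgraph >= 0.12)
```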

  6. Pulseq-Graphical Programming Interface: Open source visual environment for prototyping pulse sequences and integrated magnetic resonance imaging algorithm development.

    PubMed

    Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam

    2018-03-11

    To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor-supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all three implementations were fast (a few seconds). The software is capable of user-interface-based development and/or command line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Rate and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Platform for Quantitative Evaluation of Spatial Intratumoral Heterogeneity in Multiplexed Fluorescence Images.

    PubMed

    Spagnolo, Daniel M; Al-Kofahi, Yousef; Zhu, Peihong; Lezon, Timothy R; Gough, Albert; Stern, Andrew M; Lee, Adrian V; Ginty, Fiona; Sarachan, Brion; Taylor, D Lansing; Chennubhotla, S Chakra

    2017-11-01

    We introduce THRIVE (Tumor Heterogeneity Research Interactive Visualization Environment), an open-source tool developed to assist cancer researchers in interactive hypothesis testing. The focus of this tool is to quantify spatial intratumoral heterogeneity (ITH), and the interactions between different cell phenotypes and noncellular constituents. Specifically, we foresee applications in phenotyping cells within tumor microenvironments, recognizing tumor boundaries, identifying degrees of immune infiltration and epithelial/stromal separation, and identification of heterotypic signaling networks underlying microdomains. The THRIVE platform provides an integrated workflow for analyzing whole-slide immunofluorescence images and tissue microarrays, including algorithms for segmentation, quantification, and heterogeneity analysis. THRIVE promotes flexible deployment, a maintainable code base using open-source libraries, and an extensible framework for customizing algorithms with ease. THRIVE was designed with highly multiplexed immunofluorescence images in mind, and, by providing a platform to efficiently analyze high-dimensional immunofluorescence signals, we hope to advance these data toward mainstream adoption in cancer research. Cancer Res; 77(21); e71-74. ©2017 American Association for Cancer Research (AACR).

  8. radR: an open-source platform for acquiring and analysing data on biological targets observed by surveillance radar.

    PubMed

    Taylor, Philip D; Brzustowski, John M; Matkovich, Carolyn; Peckford, Michael L; Wilson, Dave

    2010-10-26

    Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets.

  9. radR: an open-source platform for acquiring and analysing data on biological targets observed by surveillance radar

    PubMed Central

    2010-01-01

    Background Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Results Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Conclusions Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets. PMID:20977735

  10. GIS-Based Noise Simulation Open Source Software: N-GNOIS

    NASA Astrophysics Data System (ADS)

    Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh

    2015-12-01

    Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate noise scenarios due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These are addressed in the software through attenuation modules for atmosphere, vegetation and barriers. N-GNOIS is user friendly, platform independent and Open Geospatial Consortium (OGC) compliant. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features such as the cumulative impact of point and mobile sources, building structures and honking due to traffic. Honking is a very common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise and acts as a decision-making tool for planning and management of the noise component in environmental impact assessment (EIA) studies.
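
    The abstract lists the attenuation modules but no equations; the core of any such point-source module is a distance-decay relation. A simplified Python illustration of spherical spreading with a linear atmospheric absorption term (generic acoustics, not N-GNOIS's actual formulation; the absorption coefficient is a placeholder):

```python
import math

def point_source_level(lw_db, distance_m, atm_absorption_db_per_m=0.005):
    """Sound pressure level at a receiver from a point source (illustrative).

    Spherical spreading: Lp = Lw - 20*log10(r) - 11, minus a linear
    atmospheric absorption term; barrier and vegetation attenuation
    would be subtracted in the same way.
    """
    spreading = 20.0 * math.log10(distance_m) + 11.0
    atmosphere = atm_absorption_db_per_m * distance_m
    return lw_db - spreading - atmosphere

# Example: a source with 100 dB sound power level heard at 50 m and 200 m
for r in (50, 200):
    print(f"Lp at {r} m ~ {point_source_level(100.0, r):.1f} dB")
```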

  11. PACS for Bhutan: a cost effective open source architecture for emerging countries.

    PubMed

    Ratib, Osman; Roduit, Nicolas; Nidup, Dechen; De Geer, Gerard; Rosset, Antoine; Geissbuhler, Antoine

    2016-10-01

    This paper reports the design and implementation of an innovative and cost-effective imaging management infrastructure suitable for radiology centres in emerging countries. It was implemented in the main referral hospital of Bhutan, which is equipped with a CT, an MRI, digital radiology, and a suite of several ultrasound units. The hospital lacked the necessary informatics infrastructure for image archiving and interpretation and needed a system for distributing images to clinical wards. The solution developed for this project combines several open source software platforms into a robust and versatile archiving and communication system connected to analysis workstations equipped with an FDA-certified version of a highly popular open-source viewer. The whole system was implemented on standard off-the-shelf hardware. The system was installed in three days, and training of the radiologists as well as the technical and IT staff was provided onsite to ensure full ownership of the system by the local team. Radiologists were rapidly able to read and interpret studies on the diagnostic workstations, which significantly benefited their workflow and their ability to perform diagnostic tasks efficiently. Furthermore, images were also made available to several clinical units on standard desktop computers through a web-based viewer. • Open source imaging informatics platforms can provide cost-effective alternatives to PACS • A robust and cost-effective open architecture can provide adequate solutions for emerging countries • Imaging informatics is often lacking in hospitals equipped with digital modalities.

  12. Language Geography from Microblogging Platforms

    NASA Astrophysics Data System (ADS)

    Mocanu, Delia; Baronchelli, Andrea; Perra, Nicola; Gonçalves, Bruno; Vespignani, Alessandro

    2013-03-01

    Microblogging platforms have now become major open source indicators for complex social interactions. With the advent of smartphones, the ever-increasing mobile Internet traffic gives us the unprecedented opportunity to complement studies of complex social phenomena with real-time location information. In this work, we show that the data now accessible allows for detailed studies at different scales, ranging from country-level aggregate analysis to the analysis of linguistic communities within specific neighborhoods. The high resolution and coverage of these data permit us to investigate such issues as the linguistic homogeneity of different countries, touristic seasonal patterns within countries, and the geographical distribution of different languages in bilingual regions. This work highlights the potential of geolocalized studies of open data sources that can provide an extremely detailed picture of language geography.

  13. Open-source colorimeter.

    PubMed

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-04-19

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.
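
    Colorimetric COD determination of this kind reduces to converting a measured light intensity into absorbance (Beer-Lambert) and then into concentration via a calibration curve. A Python sketch of that arithmetic; all numbers are made up for illustration, and the device's actual Arduino firmware is not reproduced here.

```python
import math
import numpy as np

def absorbance(sample_intensity, blank_intensity):
    """Beer-Lambert absorbance A = -log10(I / I0) from raw detector readings."""
    return -math.log10(sample_intensity / blank_intensity)

# Calibration standards: known COD (mg/L) vs. measured absorbance (illustrative values)
cod_standards = np.array([0.0, 100.0, 250.0, 500.0, 1000.0])
abs_standards = np.array([0.000, 0.036, 0.090, 0.180, 0.360])
slope, intercept = np.polyfit(abs_standards, cod_standards, 1)  # linear calibration fit

# Unknown sample: blank reads 52000 counts, digested sample reads 43500 counts
a = absorbance(43500, 52000)
print(f"absorbance = {a:.3f}, estimated COD ~ {slope * a + intercept:.0f} mg/L")
```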

  14. Open-Source Colorimeter

    PubMed Central

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032

  15. SPIM-fluid: open source light-sheet based platform for high-throughput imaging

    PubMed Central

    Gualda, Emilio J.; Pereira, Hugo; Vale, Tiago; Estrada, Marta Falcão; Brito, Catarina; Moreno, Nuno

    2015-01-01

    Light sheet fluorescence microscopy has recently emerged as the technique of choice for obtaining high quality 3D images of whole organisms/embryos with low photodamage and fast acquisition rates. Here we present an open source, unified implementation based on Arduino and Micromanager that is capable of operating light sheet microscopes for automated 3D high-throughput imaging of three-dimensional cell cultures and model organisms such as zebrafish, oriented toward massive drug screening. PMID:26601007

  16. Web-Based Portfolio Assessment: Validation of an Open Source Platform

    ERIC Educational Resources Information Center

    Collins, Regina; Elliot, Norbert; Klobucar, Andrew; Deek, Fadi P.

    2013-01-01

    Assessment of educational outcomes through purchased tests is commonplace in the evaluation of individual student ability and of educational programs. Focusing on the assessment of writing performance in a longitudinal study of first-time, full-time students (n = 598), this research describes the design, use, and assessment of an open-source…

  17. Biosignal PI, an Affordable Open-Source ECG and Respiration Measurement System

    PubMed Central

    Abtahi, Farhad; Snäll, Jonatan; Aslamy, Benjamin; Abtahi, Shirin; Seoane, Fernando; Lindecrantz, Kaj

    2015-01-01

    Biomedical pilot projects, e.g., telemedicine, homecare, and animal and human trials, usually involve several physiological measurements. Technical development for these projects is time-consuming and, in particular, costly. A versatile but affordable biosignal measurement platform can help to reduce time and risk while keeping the focus on the important goals and making efficient use of resources. In this work, an affordable and open source platform for the development of physiological measurement systems is proposed. As a first step, an 8–12-lead electrocardiogram (ECG) and respiration monitoring system has been developed. Chips based on iCoupler technology have been used to achieve the electrical isolation required by IEC 60601 for patient safety. The result shows the potential of this platform as a base for prototyping compact, affordable, and medically safe measurement systems. Further work involves both hardware and software development of additional modules. These modules may require new front-ends for other biosignals, or may simply collect data wirelessly, e.g., through Bluetooth, from different devices measuring, for example, blood pressure, weight, bioimpedance spectra, or blood glucose. All design and development documents, files and source code will be available for non-commercial use through the project website, BiosignalPI.org. PMID:25545268
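
    A typical early software step on such a platform is cleaning the raw ECG channel before any feature extraction. A short SciPy sketch of a zero-phase band-pass filter; the passband, filter order and sampling rate are illustrative choices, not values from the article.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_ecg(signal, fs, low_hz=0.5, high_hz=40.0):
    """Zero-phase Butterworth band-pass to remove baseline wander and mains noise."""
    b, a = butter(2, [low_hz, high_hz], btype="bandpass", fs=fs)
    return filtfilt(b, a, signal)

# Synthetic 500 Hz trace: a slow "beat" component plus drift and 50 Hz interference
fs = 500.0
t = np.arange(0, 10, 1 / fs)
raw = (np.sin(2 * np.pi * 1.2 * t)            # crude beat-rate component
       + 0.5 * np.sin(2 * np.pi * 0.1 * t)    # baseline wander
       + 0.2 * np.sin(2 * np.pi * 50.0 * t))  # mains interference
clean = bandpass_ecg(raw, fs)
print(f"RMS of removed interference ~ {np.std(raw - clean):.2f} (arbitrary units)")
```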

  18. Macedonian journal of chemistry and chemical engineering: open journal systems--editor's perspective.

    PubMed

    Zdravkovski, Zoran

    2014-01-01

    The development and availability of personal computers, software and printing techniques over the last twenty years have profoundly changed the publication of scientific journals. Additionally, the Internet has, in the last decade, revolutionized the publication process to the point of changing the basic paradigm of printed journals. The Macedonian Journal of Chemistry and Chemical Engineering, in its 40-year history, has adopted and adapted to all these transformations. In order to keep up with the inevitable changes, as editor-in-chief I felt it was my responsibility to introduce electronic editorial management of the journal. The choice was between commercial and open source platforms, and because of the journal's limited funding we chose the latter. We decided on Open Journal Systems, which provides online submission and management of all content, flexible configuration (requirements, sections, review process, etc.), options for comprehensive indexing, various reading tools, email notification and commenting for readers, and an option for thesis abstracts, and which was installed locally. However, since support is limited, it requires moderate computer skills and some effort to set up. Overall, it is an excellent editorial platform and a convenient solution for journals with a low budget, journals that do not want to spend their resources on commercial platforms, or journals that simply support the idea of open source software.

  19. The Use of Distance Learning in the Educational Process Content and Structure of Educational Platforms--Analysis of the Platform Educans

    ERIC Educational Resources Information Center

    Igado, Manuel Fandos

    2010-01-01

    This work offers some considerations that address the scarcity of research in the field of e-learning as it relates specifically to secondary education. Distance training programmes (both open source and not) are becoming increasingly popular, especially in higher level education. However, there are very few cases of…

  20. A Real-Time Linux for Multicore Platforms

    DTIC Science & Technology

    2013-12-20

    under ARO support) to obtain a fully-functional OS for supporting real-time workloads on multicore platforms. This system, called LITMUS-RT (LInux Testbed for MUltiprocessor Scheduling in Real-Time systems), allows different multiprocessor real-time scheduling and ... to be specified as plugin components. LITMUS-RT is open-source software.

  1. Open-source telemedicine platform for wireless medical video communication.

    PubMed

    Panayides, A; Eleftheriou, I; Pantziaris, M

    2013-01-01

    An m-health system for real-time wireless communication of medical video based on open-source software is presented. The objective is to deliver a low-cost telemedicine platform which will allow for reliable remote diagnosis in m-health applications such as emergency incidents, mass population screening, and medical education. The performance of the proposed system is demonstrated using five atherosclerotic plaque ultrasound videos. The videos are encoded at the clinically acquired resolution, in addition to lower, QCIF, and CIF resolutions, at different bitrates, and with four different encoding structures. Commercially available wireless local area network (WLAN) and 3.5G high-speed packet access (HSPA) wireless channels are used to validate the developed platform. Objective video quality assessment is based on PSNR ratings, following calibration using the variable frame delay (VFD) algorithm that removes temporal mismatch between the original and received videos. Clinical evaluation is based on an atherosclerotic plaque ultrasound video assessment protocol. Experimental results show that adequate diagnostic quality wireless medical video communications are realized using the designed telemedicine platform. HSPA cellular networks provide for ultrasound video transmission at the acquired resolution, while VFD algorithm utilization bridges objective and subjective ratings.
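
    The objective metric used here, PSNR, is simple to state: for 8-bit frames, PSNR = 10 * log10(255^2 / MSE). A NumPy illustration on a pair of frames follows (generic metric code; the VFD temporal-alignment step mentioned above is not reproduced):

```python
import numpy as np

def psnr(original, received, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two frames of equal shape."""
    mse = np.mean((original.astype(np.float64) - received.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(max_value ** 2 / mse)

# Toy example: an 8-bit CIF-sized frame and a noisy "received" copy
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(288, 352), dtype=np.uint8)
received = np.clip(frame + rng.normal(0, 5, frame.shape), 0, 255).astype(np.uint8)
print(f"PSNR ~ {psnr(frame, received):.1f} dB")
```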

  2. An Open Source Software and Web-GIS Based Platform for Airborne SAR Remote Sensing Data Management, Distribution and Sharing

    NASA Astrophysics Data System (ADS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; Ming, Liu

    2014-03-01

    With more and more Earth observation data available to the community, how to manage and share these valuable remote sensing datasets is becoming an urgent issue. Web-based Geographical Information System (GIS) technology provides a convenient way for users in different locations to share and make use of the same dataset. In order to make efficient use of the airborne Synthetic Aperture Radar (SAR) remote sensing data acquired by the Airborne Remote Sensing Center of the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), a Web-GIS based platform for airborne SAR data management, distribution and sharing was designed and developed. The major features of the system include a map-based navigation search interface, full-resolution imagery shown overlaid on the map, and the exclusive use of Open Source Software (OSS) throughout the platform. The functions of the platform include browsing imagery on the map-based navigation interface, ordering and downloading data online, and image dataset and user management. At present, the system is under testing at RADI and will enter regular operation soon.

  3. Open-Source Telemedicine Platform for Wireless Medical Video Communication

    PubMed Central

    Panayides, A.; Eleftheriou, I.; Pantziaris, M.

    2013-01-01

    An m-health system for real-time wireless communication of medical video based on open-source software is presented. The objective is to deliver a low-cost telemedicine platform which will allow for reliable remote diagnosis in m-health applications such as emergency incidents, mass population screening, and medical education. The performance of the proposed system is demonstrated using five atherosclerotic plaque ultrasound videos. The videos are encoded at the clinically acquired resolution, in addition to lower, QCIF, and CIF resolutions, at different bitrates, and with four different encoding structures. Commercially available wireless local area network (WLAN) and 3.5G high-speed packet access (HSPA) wireless channels are used to validate the developed platform. Objective video quality assessment is based on PSNR ratings, following calibration using the variable frame delay (VFD) algorithm that removes temporal mismatch between the original and received videos. Clinical evaluation is based on an atherosclerotic plaque ultrasound video assessment protocol. Experimental results show that adequate diagnostic quality wireless medical video communications are realized using the designed telemedicine platform. HSPA cellular networks provide for ultrasound video transmission at the acquired resolution, while VFD algorithm utilization bridges objective and subjective ratings. PMID:23573082

  4. Open Source Next Generation Visualization Software for Interplanetary Missions

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere: on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  5. Free and open source software for the manipulation of digital images.

    PubMed

    Solomon, Robert W

    2009-06-01

    Free and open source software is software that is nearly as powerful as commercial software but is freely downloadable. This software can do almost everything that the expensive programs can. GIMP (GNU Image Manipulation Program) is a free program comparable to Photoshop, and versions are available for the Windows, Macintosh, and Linux platforms. This article briefly describes how GIMP can be installed and used to manipulate radiology images. It is no longer necessary to budget large amounts of money for high-quality software to achieve the goals of image processing and document creation, because free and open source software is available for the user to download at will.

  6. Low Cost and Flexible UAV Deployment of Sensors

    PubMed Central

    Sørensen, Lars Yndal; Jacobsen, Lars Toft; Hansen, John Paulin

    2017-01-01

    This paper presents a platform for airborne sensor applications using low-cost, open-source components carried by an easy-to-fly unmanned aerial vehicle (UAV). The system, available as open source, is designed for researchers, students and makers for a broad range of exploration and data-collection needs. The main contribution is the extensible architecture for modularized airborne sensor deployment and real-time data visualisation. Our open-source Android application provides data collection, flight path definition and map tools. The total cost of the system is below 800 dollars. The flexibility of the system is illustrated by mapping the location of Bluetooth beacons (iBeacons) on a ground field and by measuring water temperature in a lake. PMID:28098819

  7. Low Cost and Flexible UAV Deployment of Sensors.

    PubMed

    Sørensen, Lars Yndal; Jacobsen, Lars Toft; Hansen, John Paulin

    2017-01-14

    This paper presents a platform for airborne sensor applications using low-cost, open-source components carried by an easy-to-fly unmanned aerial vehicle (UAV). The system, available as open source, is designed for researchers, students and makers for a broad range of exploration and data-collection needs. The main contribution is the extensible architecture for modularized airborne sensor deployment and real-time data visualisation. Our open-source Android application provides data collection, flight path definition and map tools. The total cost of the system is below 800 dollars. The flexibility of the system is illustrated by mapping the location of Bluetooth beacons (iBeacons) on a ground field and by measuring water temperature in a lake.

  8. Development of an IHE MRRT-compliant open-source web-based reporting platform.

    PubMed

    Pinto Dos Santos, Daniel; Klos, G; Kloeckner, R; Oberle, R; Dueber, C; Mildenberger, P

    2017-01-01

    To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform into clinical routine. The reporting platform uses standard web technologies (HTML / JavaScript and PHP / MySQL) only. Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports were generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. • A platform for structured reporting using IHE MRRT-compliant templates is presented. • Incorporating structured reporting into clinical routine is feasible. • Full source code will be provided upon request under a free license.

  9. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    PubMed

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimens requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  10. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    PubMed Central

    2011-01-01

    Background Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. Results To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimens requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  11. BEASTling: A software tool for linguistic phylogenetics using BEAST 2

    PubMed Central

    Forkel, Robert; Kaiping, Gereon A.; Atkinson, Quentin D.

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts. PMID:28796784

  12. BEASTling: A software tool for linguistic phylogenetics using BEAST 2.

    PubMed

    Maurits, Luke; Forkel, Robert; Kaiping, Gereon A; Atkinson, Quentin D

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts.
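    Because BEASTling is driven by short configuration files, an analysis can be scripted end to end. The sketch below writes a hypothetical BEASTling-style configuration with Python's configparser and then calls the command-line tool; the section and key names (and the `beastling` command itself) are illustrative assumptions, not the documented schema:

      # Sketch: writing a BEASTling-style configuration file and invoking the
      # command-line tool. Section and key names are assumptions for illustration;
      # consult the BEASTling documentation for the real schema.
      import configparser
      import subprocess

      cfg = configparser.ConfigParser()
      cfg["admin"] = {"basename": "my_analysis"}                        # assumed keys
      cfg["languages"] = {"families": "Austronesian"}                   # assumed keys
      cfg["model my_model"] = {"model": "mk", "data": "cognates.csv"}   # assumed keys

      with open("my_analysis.conf", "w") as fh:
          cfg.write(fh)

      # Would produce the BEAST 2 XML (command name assumed).
      subprocess.run(["beastling", "my_analysis.conf"], check=True)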

  13. Technical Communications in OSS Content Management Systems: An Academic Institutional Case Study

    ERIC Educational Resources Information Center

    Cripps, Michael J.

    2011-01-01

    Single sourcing through a content management system (CMS) is altering technical communication practices in many organizations, including institutions of higher education. Open source software (OSS) solutions are currently among the most popular content management platforms adopted by colleges and universities in the United States and abroad. The…

  14. Adopting Open-Source Software Applications in U. S. Higher Education: A Cross-Disciplinary Review of the Literature

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2009-01-01

    Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…

  15. Clinician accessible tools for GUI computational models of transcranial electrical stimulation: BONSAI and SPHERES.

    PubMed

    Truong, Dennis Q; Hüber, Mathias; Xie, Xihe; Datta, Abhishek; Rahman, Asif; Parra, Lucas C; Dmochowski, Jacek P; Bikson, Marom

    2014-01-01

    Computational models of brain current flow during transcranial electrical stimulation (tES), including transcranial direct current stimulation (tDCS) and transcranial alternating current stimulation (tACS), are increasingly used to understand and optimize clinical trials. We propose that broad dissemination requires a simple graphical user interface (GUI) software that allows users to explore and design montages in real-time, based on their own clinical/experimental experience and objectives. We introduce two complementary open-source platforms for this purpose: BONSAI and SPHERES. BONSAI is a web (cloud) based application (available at neuralengr.com/bonsai) that can be accessed through any flash-supported browser interface. SPHERES (available at neuralengr.com/spheres) is a stand-alone GUI application that allows consideration of arbitrary montages on a concentric sphere model by leveraging an analytical solution. These open-source tES modeling platforms are designed to be upgraded and enhanced. Trade-offs between open-access approaches that balance ease of access, speed, and flexibility are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Knowledge Discovery for Smart Grid Operation, Control, and Situation Awareness -- A Big Data Visualization Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gu, Yi; Jiang, Huaiguang; Zhang, Yingchen

    In this paper, a big data visualization platform is designed to discover the hidden useful knowledge for smart grid (SG) operation, control and situation awareness. The spawn of smart sensors at both grid side and customer side can provide large volumes of heterogeneous data that collect information across all time scales. Extracting useful knowledge from this big-data pool is still challenging. In this paper, Apache Spark, an open source cluster computing framework, is used to process the big data to effectively discover the hidden knowledge. A high-speed communication architecture utilizing the Open System Interconnection (OSI) model is designed to transmit the data to a visualization platform. This visualization platform uses Google Earth, a global geographic information system (GIS), to link the geological information with the SG knowledge and visualize the information in a user-defined fashion. The University of Denver's campus grid is used as a SG test bench and several demonstrations are presented for the proposed platform.
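    The Spark-based processing stage described above can be approximated in a few lines of PySpark. In the sketch below, the HDFS path and the column names of the smart-meter readings are assumptions made for illustration:

      # Sketch: aggregating hypothetical smart-meter readings with Apache Spark
      # (pyspark). The CSV path and column names are assumptions for illustration.
      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("sg-visualization-prep").getOrCreate()

      readings = spark.read.csv("hdfs:///sg/meter_readings.csv",
                                header=True, inferSchema=True)

      # Hourly average voltage per feeder, ready to be exported (for example as
      # KML) to a GIS front end such as Google Earth.
      hourly = (readings
                .withColumn("hour", F.date_trunc("hour", F.col("timestamp")))
                .groupBy("feeder_id", "hour")
                .agg(F.avg("voltage").alias("avg_voltage")))

      hourly.write.mode("overwrite").parquet("hdfs:///sg/hourly_voltage")
      spark.stop()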

  17. The Orthanc Ecosystem for Medical Imaging.

    PubMed

    Jodogne, Sébastien

    2018-05-03

    This paper reviews the components of Orthanc, a free and open-source, highly versatile ecosystem for medical imaging. At the core of the Orthanc ecosystem, the Orthanc server is a lightweight vendor neutral archive that provides PACS managers with a powerful environment to automate and optimize the imaging flows that are very specific to each hospital. The Orthanc server can be extended with plugins that provide solutions for teleradiology, digital pathology, or enterprise-ready databases. It is shown how software developers and research engineers can easily develop external software or Web portals dealing with medical images, with minimal knowledge of the DICOM standard, thanks to the advanced programming interface of the Orthanc server. The paper concludes by introducing the Stone of Orthanc, an innovative toolkit for the cross-platform rendering of medical images.
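    The programming interface referred to above is a REST API. The sketch below lists studies from a locally running server with the `requests` library; the port, credentials and endpoint names reflect a default Orthanc configuration and should be treated as assumptions:

      # Sketch: listing studies through the Orthanc REST API. Host, port,
      # credentials and endpoint names are assumptions matching a default
      # Orthanc setup, not a guaranteed contract.
      import requests

      ORTHANC = "http://localhost:8042"        # default Orthanc HTTP port (assumed)
      auth = ("orthanc", "orthanc")            # default demo credentials (assumed)

      # Each entry returned by /studies is an Orthanc identifier for one DICOM study.
      study_ids = requests.get(f"{ORTHANC}/studies", auth=auth).json()
      for sid in study_ids[:5]:
          study = requests.get(f"{ORTHANC}/studies/{sid}", auth=auth).json()
          print(sid, study.get("MainDicomTags", {}).get("StudyDescription"))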

  18. "Just Another Tool for Online Studies” (JATOS): An Easy Solution for Setup and Management of Web Servers Supporting Online Studies

    PubMed Central

    Lange, Kristian; Kühn, Simone; Filevich, Elisa

    2015-01-01

    We present here “Just Another Tool for Online Studies” (JATOS): an open source, cross-platform web application with a graphical user interface (GUI) that greatly simplifies setting up and communicating with a web server to host online studies that are written in JavaScript. JATOS is easy to install in all three major platforms (Microsoft Windows, Mac OS X, and Linux), and seamlessly pairs with a database for secure data storage. It can be installed on a server or locally, allowing researchers to try the application and feasibility of their studies within a browser environment, before engaging in setting up a server. All communication with the JATOS server takes place via a GUI (with no need to use a command line interface), making JATOS an especially accessible tool for researchers without a strong IT background. We describe JATOS’ main features and implementation and provide a detailed tutorial along with example studies to help interested researchers to set up their online studies. JATOS can be found under the Internet address: www.jatos.org. PMID:26114751

  19. An open-source java platform for automated reaction mapping.

    PubMed

    Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M

    2010-09-27

    This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.

  20. CFD and Neutron codes coupling on a computational platform

    NASA Astrophysics Data System (ADS)

    Cerroni, D.; Da Vià, R.; Manservisi, S.; Menghini, F.; Scardovelli, R.

    2017-01-01

    In this work we investigate the thermal-hydraulics behavior of a PWR nuclear reactor core, evaluating the power generation distribution while taking into account the local temperature field. The temperature field, evaluated using a self-developed CFD module, is exchanged with a neutron code, DONJON-DRAGON, which updates the macroscopic cross sections and evaluates the new neutron flux. From the updated neutron flux the new peak factor is evaluated and the new temperature field is computed. The exchange of data between the two codes is achieved through their inclusion in the computational platform SALOME, an open-source tool developed by the collaborative project NURESAFE. The numerical libraries MEDmem, included in the SALOME platform, are used in this work for the projection of computational fields from one problem to another. The two problems are driven by a common supervisor that can access the computational fields of both systems: in every time step, the temperature field is extracted from the CFD problem and set into the neutron problem. After this iteration the new power peak factor is projected back into the CFD problem and the new time step can be computed. Several computational examples, where both neutron and thermal-hydraulics quantities are parametrized, are finally reported in this work.
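    The supervisor-driven exchange described above amounts to a simple coupling loop. In the conceptual sketch below, `cfd` and `neutronics` are hypothetical module objects standing in for the actual SALOME/MEDmem and DONJON-DRAGON interfaces:

      # Conceptual sketch of the supervisor loop: the CFD module provides a
      # temperature field, the neutronics module returns an updated power peaking
      # factor, and the two are exchanged every time step. The module objects are
      # hypothetical placeholders, not the real SALOME or DONJON-DRAGON APIs.
      def couple(cfd, neutronics, n_steps, dt):
          power_peak = neutronics.initial_power_peak()
          for step in range(n_steps):
              # Advance thermal-hydraulics with the current power distribution.
              temperature_field = cfd.advance(dt, power_peak)
              # Update macroscopic cross sections and recompute the neutron flux.
              neutronics.set_temperature(temperature_field)
              power_peak = neutronics.solve_flux()
          return power_peak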

  1. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    PubMed

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git ), and a live demo is available at http://democlmsvault.tyerslab.com/ .

  2. OPAL: An Open-Source MPI-IO Library over Cray XT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Weikuan; Vetter, Jeffrey S; Canon, Richard Shane

    Parallel IO over Cray XT is supported by a vendor-supplied MPI-IO package. This package contains a proprietary ADIO implementation built on top of the sysio library. While it is reasonable to maintain a stable code base for application scientists' convenience, it is also very important to the system developers and researchers to analyze and assess the effectiveness of parallel IO software, and accordingly, tune and optimize the MPI-IO implementation. A proprietary parallel IO code base relinquishes such flexibilities. On the other hand, a generic UFS-based MPI-IO implementation is typically used on many Linux-based platforms. We have developed an open-source MPI-IO package over Lustre, referred to as OPAL (OPportunistic and Adaptive MPI-IO Library over Lustre). OPAL provides a single source-code base for MPI-IO over Lustre on Cray XT and Linux platforms. Compared to the Cray implementation, OPAL provides a number of good features, including arbitrary specification of striping patterns and Lustre-stripe aligned file domain partitioning. This paper presents the performance comparisons between OPAL and Cray's proprietary implementation. Our evaluation demonstrates that OPAL achieves performance comparable to the Cray implementation. We also exemplify the benefits of an open source package in revealing the underpinning of the parallel IO performance.
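    The striping control highlighted above is normally expressed through MPI-IO hints. The sketch below shows the idea with mpi4py rather than OPAL's own code; the Lustre hint names follow common ROMIO usage and are assumptions:

      # Sketch: specifying a Lustre striping pattern through MPI-IO hints and
      # performing a collective write with mpi4py. This illustrates the kind of
      # control OPAL exposes; it is not OPAL's API. Hint names are assumed.
      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      info = MPI.Info.Create()
      info.Set("striping_factor", "8")      # number of Lustre OSTs (assumed hint)
      info.Set("striping_unit", "1048576")  # 1 MiB stripes (assumed hint)

      buf = np.full(1 << 20, rank % 256, dtype=np.uint8)    # 1 MiB per rank
      fh = MPI.File.Open(comm, "output.dat",
                         MPI.MODE_CREATE | MPI.MODE_WRONLY, info)
      fh.Write_at_all(rank * buf.nbytes, buf)               # stripe-aligned offsets
      fh.Close()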

  3. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment

    PubMed Central

    2011-01-01

    Background Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. Results This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. Conclusions AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements. PMID:21798025

  4. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment.

    PubMed

    Stålring, Jonna C; Carlsson, Lars A; Almeida, Pedro; Boyer, Scott

    2011-07-28

    Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements.
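    The automated workflow described above (descriptors in, hyper-parameter selection, validated model out) can be illustrated generically with scikit-learn; the sketch below uses synthetic descriptor data and is not AZOrange's Orange-based interface:

      # Generic sketch of a QSAR-style workflow: precomputed descriptors,
      # automated hyper-parameter selection, cross-validated accuracy.
      # Uses scikit-learn with synthetic data, not AZOrange itself.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import GridSearchCV, cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 50))          # rows: compounds, columns: descriptors
      y = rng.integers(0, 2, size=200)        # active / inactive labels

      search = GridSearchCV(
          RandomForestClassifier(random_state=0),
          param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
          cv=5)
      search.fit(X, y)

      print("best parameters:", search.best_params_)
      print("cross-validated accuracy:",
            cross_val_score(search.best_estimator_, X, y, cv=5).mean())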

  5. ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment.

    PubMed

    Frank, Tobias; Krieger, Axel; Leonard, Simon; Patel, Niravkumar A; Tokuda, Junichi

    2017-08-01

    With the growing interest in advanced image-guidance for surgical robot systems, rapid integration and testing of robotic devices and medical image computing software are becoming essential in the research and development. Maximizing the use of existing engineering resources built on widely accepted platforms in different fields, such as robot operating system (ROS) in robotics and 3D Slicer in medical image computing could simplify these tasks. We propose a new open network bridge interface integrated in ROS to ensure seamless cross-platform data sharing. A ROS node named ROS-IGTL-Bridge was implemented. It establishes a TCP/IP network connection between the ROS environment and external medical image computing software using the OpenIGTLink protocol. The node exports ROS messages to the external software over the network and vice versa simultaneously, allowing seamless and transparent data sharing between the ROS-based devices and the medical image computing platforms. Performance tests demonstrated that the bridge could stream transforms, strings, points, and images at 30 fps in both directions successfully. The data transfer latency was <1.2 ms for transforms, strings and points, and 25.2 ms for color VGA images. A separate test also demonstrated that the bridge could achieve 900 fps for transforms. Additionally, the bridge was demonstrated in two representative systems: a mock image-guided surgical robot setup consisting of 3D slicer, and Lego Mindstorms with ROS as a prototyping and educational platform for IGT research; and the smart tissue autonomous robot surgical setup with 3D Slicer. The study demonstrated that the bridge enabled cross-platform data sharing between ROS and medical image computing software. This will allow rapid and seamless integration of advanced image-based planning/navigation offered by the medical image computing software such as 3D Slicer into ROS-based surgical robot systems.
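    On the ROS side, feeding such a bridge amounts to publishing standard messages. The sketch below publishes a pose at 30 fps from a rospy node; the topic name and the use of geometry_msgs/TransformStamped are assumptions for illustration, since the real bridge defines its own message types and topics:

      # Sketch: publishing a pose from a ROS node so a bridge can forward it to
      # 3D Slicer over OpenIGTLink. Topic name and message type are assumptions.
      import rospy
      from geometry_msgs.msg import TransformStamped

      rospy.init_node("igtl_transform_sender")
      pub = rospy.Publisher("/IGTL_TRANSFORM_OUT", TransformStamped, queue_size=10)

      rate = rospy.Rate(30)                       # 30 fps, as in the reported tests
      while not rospy.is_shutdown():
          msg = TransformStamped()
          msg.header.stamp = rospy.Time.now()
          msg.header.frame_id = "base_link"
          msg.transform.translation.x = 0.1       # hypothetical tool position (m)
          msg.transform.rotation.w = 1.0          # identity orientation
          pub.publish(msg)
          rate.sleep()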

  6. A Far Cry from School History: Massive Online Open Courses as a Generative Source for Historical Research

    ERIC Educational Resources Information Center

    Gallagher, Silvia; Wallace, Ciaran

    2016-01-01

    Current research into Massive Online Open Courses (MOOCs) has neglected the potential of using learner comments for discipline-specific analysis. This article explores how MOOCs, within the historical discipline, can be used to generate, investigate, and document personal narratives, and argues that they serve as a rich platform for historical…

  7. Tools to Analyze Morphology and Spatially Mapped Molecular Data | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    This project is to develop, deploy, and disseminate a suite of open source tools and integrated informatics platform that will facilitate multi-scale, correlative analyses of high resolution whole slide tissue image data, spatially mapped genetics and molecular data for cancer research. This platform will play an essential role in supporting studies of tumor initiation, development, heterogeneity, invasion, and metastasis.

  8. TH-C-12A-12: Veritas: An Open Source Tool to Facilitate User Interaction with TrueBeam Developer Mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, P; Varian Medical Systems, Palo Alto, CA; Lewis, J

    2014-06-15

    Purpose: To address the challenges of creating delivery trajectories and imaging sequences with TrueBeam Developer Mode, a new open-source graphical XML builder, Veritas, has been developed, tested and made freely available. Veritas eliminates most of the need to understand the underlying schema and write XML scripts, by providing a graphical menu for each control point specifying the state of 30 mechanical/dose axes. All capabilities of Developer Mode are accessible in Veritas. Methods: Veritas was designed using QT Designer, a ‘what-you-see-is-what-you-get’ (WYSIWYG) tool for building graphical user interfaces (GUI). Different components of the GUI are integrated using QT's signals and slots mechanism. Functionalities are added using PySide, an open source, cross platform Python binding for the QT framework. The XML code generated is immediately visible, making it an interactive learning tool. A user starts from an anonymized DICOM file or XML example and introduces delivery modifications, or begins their experiment from scratch, then uses the GUI to modify control points as desired. The software automatically generates XML plans following the appropriate schema. Results: Veritas was tested by generating and delivering two XML plans at Brigham and Women's Hospital. The first example was created to irradiate the letter ‘B’ with a narrow MV beam using dynamic couch movements. The second was created to acquire 4D CBCT projections for four minutes. The delivery of the letter ‘B’ was observed using a 2D array of ionization chambers. Both deliveries were generated quickly in Veritas by non-expert Developer Mode users. Conclusion: We introduced a new open source tool Veritas for generating XML plans (delivery trajectories and imaging sequences). Veritas makes Developer Mode more accessible by reducing the learning curve for quick translation of research ideas into XML plans. Veritas is an open source initiative, creating the possibility for future developments and collaboration with other researchers. I am an employee of Varian Medical Systems.

  9. XPAT: a toolkit to conduct cross-platform association studies with heterogeneous sequencing datasets.

    PubMed

    Yu, Yao; Hu, Hao; Bohlender, Ryan J; Hu, Fulan; Chen, Jiun-Sheng; Holt, Carson; Fowler, Jerry; Guthery, Stephen L; Scheet, Paul; Hildebrandt, Michelle A T; Yandell, Mark; Huff, Chad D

    2018-04-06

    High-throughput sequencing data are increasingly being made available to the research community for secondary analyses, providing new opportunities for large-scale association studies. However, heterogeneity in target capture and sequencing technologies often introduce strong technological stratification biases that overwhelm subtle signals of association in studies of complex traits. Here, we introduce the Cross-Platform Association Toolkit, XPAT, which provides a suite of tools designed to support and conduct large-scale association studies with heterogeneous sequencing datasets. XPAT includes tools to support cross-platform aware variant calling, quality control filtering, gene-based association testing and rare variant effect size estimation. To evaluate the performance of XPAT, we conducted case-control association studies for three diseases, including 783 breast cancer cases, 272 ovarian cancer cases, 205 Crohn disease cases and 3507 shared controls (including 1722 females) using sequencing data from multiple sources. XPAT greatly reduced Type I error inflation in the case-control analyses, while replicating many previously identified disease-gene associations. We also show that association tests conducted with XPAT using cross-platform data have comparable performance to tests using matched platform data. XPAT enables new association studies that combine existing sequencing datasets to identify genetic loci associated with common diseases and other complex traits.
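    As a generic illustration of the gene-based association testing mentioned above (and not of XPAT's own statistical machinery), a burden-style comparison of rare-variant carrier counts between cases and controls can be run with SciPy; the carrier counts below are hypothetical:

      # Generic illustration of a gene-based burden test: compare carrier counts
      # of rare variants in one gene between cases and controls with Fisher's
      # exact test. The carrier counts are hypothetical.
      from scipy.stats import fisher_exact

      cases_with_variant, cases_total = 18, 783          # hypothetical carriers
      controls_with_variant, controls_total = 31, 3507   # hypothetical carriers

      table = [[cases_with_variant, cases_total - cases_with_variant],
               [controls_with_variant, controls_total - controls_with_variant]]

      odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")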

  10. SmartR: an open-source platform for interactive visual analytics for translational research data

    PubMed Central

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-01-01

    Abstract Summary: In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data combined with strong visual analytical capabilities will significantly accelerate the scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART, that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. Availability and Implementation: The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28334291

  11. SmartR: an open-source platform for interactive visual analytics for translational research data.

    PubMed

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data combined with strong visual analytical capabilities will significantly accelerate the scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR , a plugin for tranSMART, that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR . reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  12. Open source platform for collaborative construction of wearable sensor datasets for human motion analysis and an application for gait analysis.

    PubMed

    Llamas, César; González, Manuel A; Hernández, Carmen; Vegas, Jesús

    2016-10-01

    Nearly every practical improvement in modeling human motion is well founded in a properly designed collection of data or datasets. These datasets must be made publicly available so that the community can validate and accept them. It is reasonable to concede that a collective, guided enterprise could serve to devise solid and substantial datasets, as a result of a collaborative effort, in the same sense as the open software community does. In this way datasets could be complemented, extended and expanded in size with, for example, more individuals, samples and human actions. For this to be possible some commitments must be made by the collaborators, one of them being the sharing of the same data acquisition platform. In this paper, we offer an affordable open source hardware and software platform based on inertial wearable sensors so that several groups can cooperate in the construction of datasets through common software suitable for collaboration. Some experimental results on the throughput of the overall system are reported, showing the feasibility of acquiring data from up to 6 sensors with a sampling frequency of no less than 118 Hz. Also, a proof-of-concept dataset is provided comprising sampled data from 12 subjects suitable for gait analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. An inexpensive Arduino-based LED stimulator system for vision research.

    PubMed

    Teikari, Petteri; Najjar, Raymond P; Malkki, Hemi; Knoblauch, Kenneth; Dumortier, Dominique; Gronfier, Claude; Cooper, Howard M

    2012-11-15

    Light emitting diodes (LEDs) are being used increasingly as light sources in life sciences applications such as in vision research, fluorescence microscopy and in brain-computer interfacing. Here we present an inexpensive but effective visual stimulator based on light emitting diodes (LEDs) and the open-source Arduino microcontroller prototyping platform. The main design goal of our system was to use off-the-shelf and open-source components as much as possible, and to reduce design complexity so that end-users without advanced electronics skills can use the system. The core of the system is a USB-connected Arduino microcontroller platform, designed from the outset with a specific emphasis on ease of use when creating interactive physical computing environments. The pulse-width modulation (PWM) output of the Arduino was used to drive the LEDs, allowing linear control of light intensity. The visual stimulator was demonstrated in applications such as murine pupillometry, rodent models for cognitive research, and heterochromatic flicker photometry in human psychophysics. These examples illustrate some of the possible applications that can be easily implemented and that are advantageous for students, educational purposes and universities with limited resources. The LED stimulator system was developed as an open-source project. The software interface was developed using Python, with simplified examples provided for Matlab and LabVIEW. Source code and hardware information are distributed under the GNU General Public Licence (GPL, version 3). Copyright © 2012 Elsevier B.V. All rights reserved.
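    Controlling such a stimulator from Python typically reduces to writing short commands over USB serial. In the sketch below, the serial port and the "channel,duty" command format are hypothetical, since the actual protocol is defined by the Arduino sketch running on the device:

      # Sketch: driving an LED channel over USB serial with pyserial. The port
      # name and the "channel,duty" text command are hypothetical; the real
      # protocol is whatever the Arduino firmware implements.
      import time
      import serial

      ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # assumed port/baud
      time.sleep(2)                      # allow the Arduino to reset after opening

      def set_intensity(channel, fraction):
          """Map a 0-1 intensity to an 8-bit PWM duty cycle and send it."""
          duty = int(round(max(0.0, min(1.0, fraction)) * 255))
          ser.write(f"{channel},{duty}\n".encode())

      # Simple 1 Hz square-wave flicker on channel 0 for ten seconds.
      for _ in range(10):
          set_intensity(0, 1.0); time.sleep(0.5)
          set_intensity(0, 0.0); time.sleep(0.5)
      ser.close()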

  14. PLUS: open-source toolkit for ultrasound-guided intervention systems.

    PubMed

    Lasso, Andras; Heffter, Tamas; Rankin, Adam; Pinter, Csaba; Ungi, Tamas; Fichtinger, Gabor

    2014-10-01

    A variety of advanced image analysis methods have been under development for ultrasound-guided interventions. Unfortunately, the transition from an image analysis algorithm to clinical feasibility trials as part of an intervention system requires integration of many components, such as imaging and tracking devices, data processing algorithms, and visualization software. The objective of our paper is to provide a freely available open-source software platform, PLUS (Public software Library for Ultrasound), to facilitate rapid prototyping of ultrasound-guided intervention systems for translational clinical research. PLUS provides a variety of methods for interventional tool pose and ultrasound image acquisition from a wide range of tracking and imaging devices, spatial and temporal calibration, volume reconstruction, simulated image generation, and recording and live streaming of the acquired data. This paper introduces PLUS, explains its functionality and architecture, and presents typical uses and performance in ultrasound-guided intervention systems. PLUS fulfills the essential requirements for the development of ultrasound-guided intervention systems and it aspires to become a widely used translational research prototyping platform. PLUS is freely available as open source software under a BSD license and can be downloaded from http://www.plustoolkit.org.

  15. Low-cost embedded systems for democratizing ocean sensor technology in the coastal zone

    NASA Astrophysics Data System (ADS)

    Glazer, B. T.; Lio, H. I.

    2017-12-01

    Environmental sciences suffer from undersampling. Enabling sustained and unattended data collection in the coastal zone typically involves expensive instrumentation and infrastructure deployed as cabled observatories or moorings with little flexibility in deployment location following initial installation. High costs of commercially-available or custom instruments have limited the number of sensor sites that can be targeted by academic researchers, and have also limited engagement with the public. We have developed a novel, low-cost, open-source sensor and software platform to enable wireless data transfer of biogeochemical sensors in the coastal zone. The platform is centered upon widely available, low-cost, single board computers and microcontrollers. We have used a blend of on-hand research-grade sensors and low-cost open-source electronics that can be assembled by tech-savvy non-engineers. Robust, open-source code that remains customizable for specific miniNode configurations can match a specific site's measurement needs, depending on the scientific research priorities. We have demonstrated prototype capabilities and versatility through lab testing and field deployments of multiple sensor nodes with multiple sensor inputs, all of which are streaming near-real-time data from Kaneohe Bay over wireless RF links to a shore-based base station.

  16. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    PubMed Central

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692

  17. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility

    PubMed Central

    2012-01-01

    Background Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. Results JobCenter is a client–server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or “in the cloud”) and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. Conclusions JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/. PMID:22846423

  18. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    NASA Astrophysics Data System (ADS)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility of the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library "arc4nix" to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arcpy scales linearly in a distributed environment. Arc4nix is open-source software.
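    The decoupled client-server idea can be sketched with the Python standard library alone. The example below serializes a description of a geoprocessing call, executes it in a separate server process and returns a result; it mimics the pattern only and is not arc4nix's actual implementation or API:

      # Conceptual sketch of the decoupled client/server pattern: the client sends
      # a description of a geoprocessing call, a server process executes it and
      # replies. A real server would dispatch to arcpy; here the call is echoed.
      import time
      from multiprocessing import Process
      from multiprocessing.connection import Client, Listener

      ADDRESS, AUTHKEY = ("localhost", 6000), b"arc4nix-demo"

      def server():
          with Listener(ADDRESS, authkey=AUTHKEY) as listener, listener.accept() as conn:
              func_name, args = conn.recv()       # e.g. ("buffer_analysis", (...))
              conn.send(f"ran {func_name} with {len(args)} argument(s)")

      if __name__ == "__main__":
          Process(target=server).start()
          time.sleep(0.5)                         # give the listener time to bind
          with Client(ADDRESS, authkey=AUTHKEY) as conn:
              conn.send(("buffer_analysis",
                         ("roads.shp", "roads_buf.shp", "500 Meters")))
              print(conn.recv())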

  19. openBIS ELN-LIMS: an open-source database for academic laboratories.

    PubMed

    Barillari, Caterina; Ottoz, Diana S M; Fuentes-Serna, Juan Mariano; Ramakrishnan, Chandrasekhar; Rinn, Bernd; Rudolf, Fabian

    2016-02-15

    The open-source platform openBIS (open Biology Information System) offers an Electronic Laboratory Notebook and a Laboratory Information Management System (ELN-LIMS) solution suitable for the academic life science laboratories. openBIS ELN-LIMS allows researchers to efficiently document their work, to describe materials and methods and to collect raw and analyzed data. The system comes with a user-friendly web interface where data can be added, edited, browsed and searched. The openBIS software, a user guide and a demo instance are available at https://openbis-eln-lims.ethz.ch. The demo instance contains some data from our laboratory as an example to demonstrate the possibilities of the ELN-LIMS (Ottoz et al., 2014). For rapid local testing, a VirtualBox image of the ELN-LIMS is also available. © The Author 2015. Published by Oxford University Press.

  20. PLUME and research software

    NASA Astrophysics Data System (ADS)

    Baudin, Veronique; Gomez-Diaz, Teresa

    2013-04-01

    The PLUME open platform (https://www.projet-plume.org) has as its first goal to share competences and to promote the knowledge of software experts within the French higher education and research communities. The platform offers more than 380 index cards describing software that is useful and economical for this community, with open access for everybody. The second goal of PLUME is to improve the visibility of software produced by research laboratories within the higher education and research communities. The "development-ESR" index cards briefly describe the main features of the software, including references to associated research publications. The platform contains more than 300 cards describing research software, of which 89 have an English version. In this talk we describe the theme classification and the taxonomy of the index cards, as well as their evolution as new themes are added to the project. We also focus on the organisation of PLUME as an open project and its interest in promoting free/open source software from and for research, contributing to the creation of a community of shared knowledge.

  1. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.

    2013-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data is available. Also in validation mode, the Broadband Platform calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband Platform facilitates the addition of new scientific methods, which are written by earth scientists in a number of languages such as C, C++, Fortran, and Python. The Broadband Platform's modular design also supports the reuse of existing software modules as building blocks to create new scientific methods. Additionally, the Platform implements a wrapper around each scientific module, converting input and output files to and from the specific formats required (or produced) by individual scientific codes. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes the addition of 3 new simulation methods and several new data products, such as map and distance-based goodness of fit plots. Finally, as the number and complexity of scenarios simulated using the Broadband Platform increase, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
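    The per-module wrapper described above follows a convert-run-convert pattern. The sketch below is a hypothetical outline of that pattern; the module executable, file names and converter functions are placeholders, not actual Broadband Platform code:

      # Hypothetical outline of a module wrapper: convert platform-format inputs
      # to the module's native format, run the module executable, convert its
      # outputs back. Names and formats are placeholders.
      import subprocess

      def run_module(module_exe, platform_input, platform_output):
          native_input = platform_input + ".native"
          native_output = platform_output + ".native"

          convert_to_native(platform_input, native_input)      # placeholder converter
          subprocess.run([module_exe, native_input, native_output], check=True)
          convert_from_native(native_output, platform_output)  # placeholder converter

      def convert_to_native(src, dst):
          ...   # would rewrite station lists / source descriptions into the module's format

      def convert_from_native(src, dst):
          ...   # would rewrite the module's seismograms into the platform's common format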

  2. SMART Platforms: Building the App Store for Biosurveillance

    PubMed Central

    Mandl, Kenneth D.

    2013-01-01

    Objective To enable public health departments to develop “apps” to run on electronic health records (EHRs) for (1) biosurveillance and case reporting and (2) delivering alerts to the point of care. We describe a novel health information technology platform with substitutable apps constructed around core services enabling EHRs to function as iPhone-like platforms. Introduction Health care information is a fundamental source of data for biosurveillance, yet configuring EHRs to report relevant data to health departments is technically challenging, labor intensive, and often requires custom solutions for each installation. Public health agencies wishing to deliver alerts to clinicians also must engage in an endless array of one-off systems integrations. Despite a $48B investment in HIT, and meaningful use criteria requiring reporting to biosurveillance systems, most vendor electronic health records are architected monolithically, making modification difficult for hospitals and physician practices. An alternative approach is to reimagine EHRs as iPhone-like platforms supporting substitutable apps-based functionality. Substitutability is the capability inherent in a system of replacing one application with another of similar functionality. Methods Substitutability requires that the purchaser of an app can replace one application with another without being technically expert, without requiring re-engineering other applications that they are using, and without having to consult or require assistance of any of the vendors of previously installed or currently installed applications. Apps necessarily compete with each other promoting progress and adaptability. The Substitutable Medical Applications, Reusable Technologies (SMART) Platforms project is funded by a $15M grant from Office of the National Coordinator of Health Information Technology’s Strategic Health IT Advanced Research Projects (SHARP) Program. All SMART standards are open and the core software is open source. The SMART project promotes substitutability through an application programming interface (API) that can be adopted as part of a “container” built around by a wide variety of HIT, providing readonly access to the underlying data model and a software development toolkit to readily create apps. SMART containers are HIT systems, that have implemented the SMART API or a portion of it. Containers marshal data sources and present them consistently across the SMART API. SMART applications consume the API and are substitutable. Results SMART provides a common platform supporting an “app store for biosurveillance” as an approach to enabling one stop shopping for public health departments—to create an app once, and distribute it everywhere. Further, such apps can be readily updated or created—for example, in the case of an emerging infection, an app may be designed to collect additional data at emergency department triage. Or a public health department may widely distribute an app, interoperable with any SMART-enabled EMR, that delivers contextualized alerts when patient electronic records are opened, or through background processes. SMART has sparked an ecosystem of apps developers and attracted existing health information technology platforms to adopt the SMART API—including, traditional, open source, and next generation EHRs, patient-facing platforms and health information exchanges. 
SMART-enabled platforms to date include the Cerner EMR, the WorldVista EHR, the OpenMRS EHR, the i2b2 analytic platform, and the Indivo X personal health record. The SMART team is working with the Mirth Corporation, to SMART-enable the HealthBridge and Redwood MedNet Health Information Exchanges. We have demonstrated that a single SMART app can run, unmodified, in all of these environments, as long as the underlying platform collects the required data types. Major EHR vendors are currently adapting the SMART API for their products. Conclusions The SMART system enables nimble customization of any electronic health record system to create either a reporting function (outgoing communication) or an alerting function (incoming communication) establishing a technology for a robust linkage between public health and clinical environments.
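    Conceptually, a substitutable app is just a client of the container's read-only API. The sketch below is a heavily hedged illustration: the base URL, endpoint path and JSON layout are hypothetical placeholders rather than the actual SMART API or data model:

      # Sketch of a substitutable "app" consuming a container's read-only API.
      # The base URL, endpoint path and JSON layout are hypothetical placeholders.
      import requests

      CONTAINER = "https://container.example.org/smart"   # hypothetical container
      record_id = "12345"                                  # hypothetical record

      resp = requests.get(f"{CONTAINER}/records/{record_id}/problems",
                          headers={"Accept": "application/json"})
      resp.raise_for_status()

      # A biosurveillance app might forward reportable conditions to public health.
      for problem in resp.json():
          if problem.get("reportable"):
              print("would report:", problem.get("name"))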

  3. From OSS CAD to BIM for Cultural Heritage Digital Representation

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Stylianidis, E.

    2017-02-01

    The paper illustrates the use of open source Computer-aided design (CAD) environments in order to develop Building Information Modelling (BIM) tools able to manage 3D models in the field of cultural heritage. Nowadays, the development of Free and Open Source Software (FOSS) has been rapidly growing and their use tends to be consolidated. Although BIM technology is widely known and used, there is a lack of integrated open source platforms able to support all stages of Historic Building Information Modelling (HBIM) processes. The present research aims to use a FOSS CAD environment in order to develop BIM plug-ins which will be able to import and edit digital representations of cultural heritage models derived by photogrammetric methods.

  4. Development of an open-source web-based intervention for Brazilian smokers - Viva sem Tabaco.

    PubMed

    Gomide, H P; Bernardino, H S; Richter, K; Martins, L F; Ronzani, T M

    2016-08-02

    Web-based interventions for smoking cessation available in Portuguese do not adhere to evidence-based treatment guidelines. In addition, all existing web-based interventions are built on proprietary platforms that developing countries often cannot afford. We aimed to describe the development of "Viva sem Tabaco", an open-source web-based intervention. The development of the intervention included the selection of content from evidence-based guidelines for smoking cessation, the design of the first layout, the conduct of two focus groups to identify potential features, refinement of the layout based on the focus groups, and correction of content based on feedback provided by specialists in smoking cessation. Finally, we released the source code and the intervention on the Internet and translated it into Spanish and English. The intervention fills gaps in the information available in Portuguese and addresses the lack of open-source interventions for smoking cessation. The open-source licensing format and its translation system may help researchers from different countries deploy evidence-based interventions for smoking cessation.

  5. Open Source Platform Application to Groundwater Characterization and Monitoring

    NASA Astrophysics Data System (ADS)

    Ntarlagiannis, D.; Day-Lewis, F. D.; Falzone, S.; Lane, J. W., Jr.; Slater, L. D.; Robinson, J.; Hammett, S.

    2017-12-01

    Groundwater characterization and monitoring commonly rely on the use of multiple point sensors and human labor. Due to the number of sensors, labor, and other resources needed, establishing and maintaining an adequate groundwater monitoring network can be both labor intensive and expensive. To improve and optimize the monitoring network design, open source software and hardware components could potentially provide the platform to control robust and efficient sensors, thereby reducing costs and labor. This work presents early attempts to create a groundwater monitoring system incorporating open-source software and hardware that will control the remote operation of multiple sensors along with data management and file transfer functions. The system is built around a Raspberry Pi 3 that controls multiple sensors in order to perform on-demand, continuous or 'smart decision' measurements while providing flexibility to incorporate additional sensors to meet the demands of different projects. The current objective of our technology is to monitor the exchange of ionic tracers between mobile and immobile porosity using a combination of fluid and bulk electrical-conductivity measurements. To meet this objective, our configuration uses four sensors (pH, specific conductance, pressure, temperature) that can monitor the fluid electrical properties of interest and guide the bulk electrical measurement. This system highlights the potential of using open source software and hardware components for earth sciences applications. The versatility of the system makes it ideal for use in a large number of applications, and the low cost allows for high-resolution (spatial and temporal) monitoring.
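    The 'smart decision' mode described above can be captured in a short polling loop. In the sketch below, the sensor-reading and survey-triggering functions are hypothetical placeholders for the drivers running on the Raspberry Pi:

      # Sketch of a 'smart decision' sampling loop: poll the fluid sensors at a
      # fixed interval and trigger a bulk electrical-conductivity survey when the
      # fluid conductivity changes sharply (e.g., tracer arrival). The read_* and
      # trigger_* callables are hypothetical placeholders for the sensor drivers.
      import time

      THRESHOLD = 0.05   # fractional change in fluid EC that triggers a bulk survey

      def monitor(read_fluid_ec, read_ph, read_pressure, read_temperature,
                  trigger_bulk_ec_survey, interval_s=60):
          last_ec = read_fluid_ec()
          while True:
              record = {
                  "fluid_ec": read_fluid_ec(),
                  "pH": read_ph(),
                  "pressure": read_pressure(),
                  "temperature": read_temperature(),
              }
              print(record)                                # or append to a data file
              if abs(record["fluid_ec"] - last_ec) / max(last_ec, 1e-9) > THRESHOLD:
                  trigger_bulk_ec_survey()                 # 'smart decision' measurement
              last_ec = record["fluid_ec"]
              time.sleep(interval_s)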

  6. AIMBAT: A Python/Matplotlib Tool for Measuring Teleseismic Arrival Times

    NASA Astrophysics Data System (ADS)

    Lou, X.; van der Lee, S.; Lloyd, S.

    2013-12-01

    Python is an open-source, platform-independent, and object-oriented scripting language. It has become more popular in the seismology community since the appearance of ObsPy (Beyreuther et al. 2010, Megies et al. 2011), which provides a powerful framework for seismic data access and processing. This study introduces a new Python-based tool named AIMBAT (Automated and Interactive Measurement of Body-wave Arrival Times) for measuring teleseismic body-wave arrival times on large-scale seismic event data (Lou et al. 2013). Compared to ObsPy, AIMBAT is a lighter tool that is more focused on a particular aspect of seismic data processing. It originates from the widely used MCCC (Multi-Channel Cross-Correlation) method developed by VanDecar and Crosson (1990). On top of the original MCCC procedure, AIMBAT is automated in initial phase picking and is interactive in quality control. The core cross-correlation function is implemented in Fortran, in addition to Python, to boost performance. The GUI (graphical user interface) of AIMBAT depends on Matplotlib's GUI-neutral widgets and event-handling API. A number of sorting and (de)selecting options are designed to facilitate the quality control of seismograms. By using AIMBAT, both relative and absolute teleseismic body-wave arrival times are measured. AIMBAT significantly improves the efficiency and quality of the measurements. User interaction is needed only to pick the target phase arrival and to set a time window on the array stack. The package is easy to install and use, open-source, and publicly available. [Figure: Graphical user interface of AIMBAT.]
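    The measurement at the heart of MCCC and AIMBAT is a cross-correlation delay estimate between windowed seismograms. The NumPy sketch below illustrates that calculation on synthetic traces; it is a generic illustration, not AIMBAT's Fortran core:

      # Minimal illustration of a cross-correlation delay measurement: the lag
      # maximizing the cross-correlation of two windowed traces gives their
      # relative arrival time.
      import numpy as np

      dt = 0.025                                  # sample interval in seconds
      t = np.arange(0, 30, dt)
      wavelet = np.exp(-((t - 10.0) ** 2) / 0.5) * np.sin(2 * np.pi * 1.0 * t)

      true_shift = 0.375                          # seconds; trace 2 arrives later
      trace1 = wavelet
      trace2 = np.interp(t - true_shift, t, wavelet)

      xcorr = np.correlate(trace2, trace1, mode="full")
      lag_samples = xcorr.argmax() - (len(trace1) - 1)
      print("estimated delay: %.3f s" % (lag_samples * dt))   # ~0.375 s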

  7. MSiReader: an open-source interface to view and analyze high resolving power MS imaging files on Matlab platform.

    PubMed

    Robichaud, Guillaume; Garrard, Kenneth P; Barry, Jeremy A; Muddiman, David C

    2013-05-01

    During the past decade, the field of mass spectrometry imaging (MSI) has greatly evolved, to a point where it has now been fully integrated by most vendors as an optional or dedicated platform that can be purchased with their instruments. However, the technology is not yet mature, and multiple research groups in both academia and industry are still actively studying the fundamentals of imaging techniques, adapting the technology to new ionization sources, and developing new applications. As a result, a wide variety of data file formats is used to store mass spectrometry imaging data and, concurrent with the development of MSI, collaborative efforts have been undertaken to introduce common imaging data file formats. However, few free software packages to read and analyze files of these different formats are readily available. We introduce here MSiReader, a free open-source application to read and analyze high-resolution MSI data from the most common MSI data formats. The application is built on the Matlab platform (Mathworks, Natick, MA, USA) and includes a large selection of data analysis tools and features. People who are unfamiliar with the Matlab language will have little difficulty navigating the user-friendly interface, and users with Matlab programming experience can adapt and customize MSiReader for their own needs.

  8. MSiReader: An Open-Source Interface to View and Analyze High Resolving Power MS Imaging Files on Matlab Platform

    NASA Astrophysics Data System (ADS)

    Robichaud, Guillaume; Garrard, Kenneth P.; Barry, Jeremy A.; Muddiman, David C.

    2013-05-01

    During the past decade, the field of mass spectrometry imaging (MSI) has greatly evolved, to a point where it has now been fully integrated by most vendors as an optional or dedicated platform that can be purchased with their instruments. However, the technology is not yet mature, and multiple research groups in both academia and industry are still actively studying the fundamentals of imaging techniques, adapting the technology to new ionization sources, and developing new applications. As a result, a wide variety of data file formats is used to store mass spectrometry imaging data and, concurrent with the development of MSI, collaborative efforts have been undertaken to introduce common imaging data file formats. However, few free software packages to read and analyze files of these different formats are readily available. We introduce here MSiReader, a free open-source application to read and analyze high-resolution MSI data from the most common MSI data formats. The application is built on the Matlab platform (Mathworks, Natick, MA, USA) and includes a large selection of data analysis tools and features. People who are unfamiliar with the Matlab language will have little difficulty navigating the user-friendly interface, and users with Matlab programming experience can adapt and customize MSiReader for their own needs.

  9. Adaptation of a web-based, open source electronic medical record system platform to support a large study of tuberculosis epidemiology

    PubMed Central

    2012-01-01

    Background In 2006, we were funded by the US National Institutes of Health to implement a study of tuberculosis epidemiology in Peru. The study required a secure information system to manage data from a target goal of 16,000 subjects who needed to be followed for at least one year. With previous experience in the development and deployment of web-based medical record systems for TB treatment in Peru, we chose to use the OpenMRS open source electronic medical record system platform to develop the study information system. Supported by a core technical and management team and a large and growing worldwide community, OpenMRS is now being used in more than 40 developing countries. We adapted the OpenMRS platform to better support foreign languages. We added a new module to support double data entry, linkage to an existing laboratory information system, automatic upload of GPS data from handheld devices, and better security and auditing of data changes. We added new reports for study managers, and developed data extraction tools for research staff and statisticians. Further adaptation to handle direct entry of laboratory data occurred after the study was launched. Results Data collection in the OpenMRS system began in September 2009. By August 2011 a total of 9,256 participants had been enrolled, 102,274 forms and 13,829 laboratory results had been entered, and there were 208 users. The system is now entirely supported by the Peruvian study staff and programmers. Conclusions The information system served the study objectives well despite requiring some significant adaptations mid-stream. OpenMRS has more tools and capabilities than it did in 2008, and requires fewer adaptations for future projects. OpenMRS can be an effective research data system in resource-poor environments, especially for organizations using or considering it for clinical care as well as research. PMID:23131180

  10. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free, open-source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.

  11. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest was developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  12. Open Source Hbim for Cultural Heritage: a Project Proposal

    NASA Astrophysics Data System (ADS)

    Diara, F.; Rinaudo, F.

    2018-05-01

    Current technologies are changing the ways Cultural Heritage research, analysis, conservation and development are carried out, allowing new and innovative approaches. The possibility of integrating Cultural Heritage data, such as archaeological information, inside a three-dimensional environment system (such as a Building Information Model) brings huge benefits for its management, monitoring and valorisation. Nowadays there are many commercial BIM solutions. However, these tools are designed and developed mostly for architectural design or technical installations. A better solution could be a dynamic and open platform that treats Cultural Heritage needs as a priority. Suitable support for better and more complete data usability and accessibility could be guaranteed by open-source protocols. This choice would allow adapting the software to Cultural Heritage needs and not the opposite, thus avoiding methodological stretches. This work focuses on the analysis and experimentation of specific characteristics of such open-source software (DBMS, CAD, servers) applied to a Cultural Heritage example, in order to verify their flexibility and reliability and then create a dynamic HBIM open-source prototype. Indeed, it could be a starting point for the future creation of a complete HBIM open-source solution that can be adapted to other Cultural Heritage research and analyses.

  13. Mapping urban green open space in Bontang city using QGIS and cloud computing

    NASA Astrophysics Data System (ADS)

    Agus, F.; Ramadiani; Silalahi, W.; Armanda, A.; Kusnandar

    2018-04-01

    Digital mapping techniques are available freely and openly, so map-based application development is easier, faster and cheaper. The rapid development of cloud-computing geographic information systems means such systems can help meet the community's need for the provision of geospatial information online. The presence of urban Green Open Space (GOS) provides great benefits as an oxygen supplier and carbon-binding agent, and can contribute to the comfort and beauty of city life. This study aims to propose a GIS Cloud Computing (CC) platform application for mapping the GOS of Bontang City. The GIS-CC platform uses base maps that are freely available and open source. The research used a survey method to collect GOS data obtained from the Bontang City Government, while application development used Quantum GIS and cloud computing. The results section describes the existing GOS of Bontang City and the design of the GOS mapping application.

  14. IDEPI: rapid prediction of HIV-1 antibody epitopes and other phenotypic features from sequence data using a flexible machine learning platform.

    PubMed

    Hepler, N Lance; Scheffler, Konrad; Weaver, Steven; Murrell, Ben; Richman, Douglas D; Burton, Dennis R; Poignard, Pascal; Smith, Davey M; Kosakovsky Pond, Sergei L

    2014-09-01

    Since its identification in 1983, HIV-1 has been the focus of a research effort unprecedented in scope and difficulty, whose ultimate goals--a cure and a vaccine--remain elusive. One of the fundamental challenges in accomplishing these goals is the tremendous genetic variability of the virus, with some genes differing at as many as 40% of nucleotide positions among circulating strains. Because of this, the genetic bases of many viral phenotypes, most notably the susceptibility to neutralization by a particular antibody, are difficult to identify computationally. Drawing upon open-source general-purpose machine learning algorithms and libraries, we have developed IDEPI (IDentify EPItopes), a software package for learning genotype-to-phenotype predictive models from sequences with known phenotypes. IDEPI can apply learned models to classify sequences of unknown phenotypes, and also identify specific sequence features that contribute to a particular phenotype. We demonstrate that IDEPI achieves performance similar to or better than that of previously published approaches on four well-studied problems: finding the epitopes of broadly neutralizing antibodies (bNab), determining coreceptor tropism of the virus, identifying compartment-specific genetic signatures of the virus, and deducing drug-resistance-associated mutations. The cross-platform Python source code (released under the GPL 3.0 license), documentation, issue tracking, and a pre-configured virtual machine for IDEPI can be found at https://github.com/veg/idepi.
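
    IDEPI's actual feature engineering and model selection are not reproduced here. As a generic, hedged illustration of the genotype-to-phenotype learning approach the abstract describes, one can one-hot encode aligned sequence positions and fit a sparse linear classifier, then read off which positions carry predictive weight (all sequences and labels below are synthetic):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy aligned sequences with a known phenotype (1 = neutralized, 0 = not).
# Position 2 (0-based) carries the signal in this synthetic example.
seqs   = ["ACDEF", "ACDEG", "ACNEF", "ACNEG", "AKDEF", "AKNEF"]
labels = np.array([1, 1, 0, 0, 1, 0])

alphabet = sorted(set("".join(seqs)))

def one_hot(seq):
    """Flatten a sequence into a binary position-by-residue feature vector."""
    vec = np.zeros(len(seq) * len(alphabet))
    for pos, aa in enumerate(seq):
        vec[pos * len(alphabet) + alphabet.index(aa)] = 1.0
    return vec

X = np.array([one_hot(s) for s in seqs])

# The L1 penalty encourages a sparse model, so non-zero weights point at the
# sequence features most associated with the phenotype.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
clf.fit(X, labels)

for idx in np.flatnonzero(clf.coef_[0]):
    pos, aa = divmod(idx, len(alphabet))
    print(f"position {pos}, residue {alphabet[aa]}: weight {clf.coef_[0][idx]:+.2f}")
```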

  15. MAPPI-DAT: data management and analysis for protein-protein interaction data from the high-throughput MAPPIT cell microarray platform.

    PubMed

    Gupta, Surya; De Puysseleyr, Veronic; Van der Heyden, José; Maddelein, Davy; Lemmens, Irma; Lievens, Sam; Degroeve, Sven; Tavernier, Jan; Martens, Lennart

    2017-05-01

    Protein-protein interaction (PPI) studies have dramatically expanded our knowledge about cellular behaviour and development in different conditions. A multitude of high-throughput PPI techniques have been developed to achieve proteome-scale coverage for PPI studies, including the microarray-based Mammalian Protein-Protein Interaction Trap (MAPPIT) system. Because such high-throughput techniques typically report thousands of interactions, managing and analysing the large amounts of acquired data is a challenge. We have therefore built the MAPPIT cell microArray Protein Protein Interaction-Data management & Analysis Tool (MAPPI-DAT) as an automated data management and analysis tool for MAPPIT cell microarray experiments. MAPPI-DAT stores the experimental data and metadata in a systematic and structured way, automates data analysis and interpretation, and enables the meta-analysis of MAPPIT cell microarray data across all stored experiments. MAPPI-DAT is developed in Python, using R for data analysis and MySQL as the data management system. MAPPI-DAT is cross-platform and can be run on Microsoft Windows, Linux and OS X/macOS. The source code and a Microsoft Windows executable are freely available under the permissive Apache2 open source license at https://github.com/compomics/MAPPI-DAT. Contact: jan.tavernier@vib-ugent.be or lennart.martens@vib-ugent.be. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  16. Virtual Labs (Science Gateways) as platforms for Free and Open Source Science

    NASA Astrophysics Data System (ADS)

    Lescinsky, David; Car, Nicholas; Fraser, Ryan; Friedrich, Carsten; Kemp, Carina; Squire, Geoffrey

    2016-04-01

    The Free and Open Source Software (FOSS) movement promotes community engagement in software development, as well as provides access to a range of sophisticated technologies that would be prohibitively expensive if obtained commercially. However, as geoinformatics and eResearch tools and services become more dispersed, it becomes more complicated to identify and interface between the many required components. Virtual Laboratories (VLs, also known as Science Gateways) simplify the management and coordination of these components by providing a platform linking many, if not all, of the steps in particular scientific processes. These enable scientists to focus on their science, rather than the underlying supporting technologies. We describe a modular, open source, VL infrastructure that can be reconfigured to create VLs for a wide range of disciplines. Development of this infrastructure has been led by CSIRO in collaboration with Geoscience Australia and the National Computational Infrastructure (NCI) with support from the National eResearch Collaboration Tools and Resources (NeCTAR) and the Australian National Data Service (ANDS). Initially, the infrastructure was developed to support the Virtual Geophysical Laboratory (VGL), and has subsequently been repurposed to create the Virtual Hazards Impact and Risk Laboratory (VHIRL) and the reconfigured Australian National Virtual Geophysics Laboratory (ANVGL). During each step of development, new capabilities and services have been added and/or enhanced. We plan on continuing to follow this model using a shared, community code base. The VL platform facilitates transparent and reproducible science by providing access to both the data and methodologies used during scientific investigations. This is further enhanced by the ability to set up and run investigations using computational resources accessed through the VL. Data is accessed using registries pointing to catalogues within public data repositories (notably including the NCI National Environmental Research Data Interoperability Platform), or by uploading data directly from user supplied addresses or files. Similarly, scientific software is accessed through registries pointing to software repositories (e.g., GitHub). Runs are configured by using or modifying default templates designed by subject matter experts. After the appropriate computational resources are identified by the user, Virtual Machines (VMs) are spun up and jobs are submitted to service providers (currently the NeCTAR public cloud or Amazon Web Services). Following completion of the jobs the results can be reviewed and downloaded if desired. By providing a unified platform for science, the VL infrastructure enables sophisticated provenance capture and management. The source of input data (including both collection and queries), user information, software information (version and configuration details) and output information are all captured and managed as a VL resource which can be linked to output data sets. This provenance resource provides a mechanism for publication and citation for Free and Open Source Science.

  17. Development of a cloud-based Bioinformatics Training Platform.

    PubMed

    Revote, Jerico; Watson-Haigh, Nathan S; Quenette, Steve; Bethwaite, Blair; McGrath, Annette; Shang, Catherine A

    2017-05-01

    The Bioinformatics Training Platform (BTP) has been developed to provide access to the computational infrastructure required to deliver sophisticated hands-on bioinformatics training courses. The BTP is a cloud-based solution that is in active use for delivering next-generation sequencing training to Australian researchers at geographically dispersed locations. The BTP was built to provide an easy, accessible, consistent and cost-effective approach to delivering workshops at host universities and organizations with a high demand for bioinformatics training but lacking the dedicated bioinformatics training suites required. To support broad uptake of the BTP, the platform has been made compatible with multiple cloud infrastructures. The BTP is an open-source and open-access resource. To date, 20 training workshops have been delivered to over 700 trainees at over 10 venues across Australia using the BTP. © The Author 2016. Published by Oxford University Press.

  18. Development of a cloud-based Bioinformatics Training Platform

    PubMed Central

    Revote, Jerico; Watson-Haigh, Nathan S.; Quenette, Steve; Bethwaite, Blair; McGrath, Annette

    2017-01-01

    Abstract The Bioinformatics Training Platform (BTP) has been developed to provide access to the computational infrastructure required to deliver sophisticated hands-on bioinformatics training courses. The BTP is a cloud-based solution that is in active use for delivering next-generation sequencing training to Australian researchers at geographically dispersed locations. The BTP was built to provide an easy, accessible, consistent and cost-effective approach to delivering workshops at host universities and organizations with a high demand for bioinformatics training but lacking the dedicated bioinformatics training suites required. To support broad uptake of the BTP, the platform has been made compatible with multiple cloud infrastructures. The BTP is an open-source and open-access resource. To date, 20 training workshops have been delivered to over 700 trainees at over 10 venues across Australia using the BTP. PMID:27084333

  19. An open source web interface for linking models to infrastructure system databases

    NASA Astrophysics Data System (ADS)

    Knox, S.; Mohamed, K.; Harou, J. J.; Rheinheimer, D. E.; Medellin-Azuara, J.; Meier, P.; Tilmant, A.; Rosenberg, D. E.

    2016-12-01

    Models of networked engineered resource systems such as water or energy systems are often built collaboratively with developers from different domains working at different locations. These models can be linked to large-scale, real-world databases, and they are constantly being improved and extended. As the development and application of these models becomes more sophisticated, and the computing power required for simulations and/or optimisations increases, so does the need for online services and tools that enable the efficient development and deployment of these models. Hydra Platform is an open-source, web-based data management system that allows modellers to remotely store network topology and associated data in a generalised manner, allowing it to serve multiple disciplines. Hydra Platform exposes a JSON-based web API that allows external programs (referred to as 'Apps') to interact with its stored networks and perform actions such as importing data, running models, or exporting the networks to different formats. Hydra Platform supports multiple users accessing the same network and has a suite of functions for managing users and data. We present ongoing development of Hydra Platform: the Hydra Web User Interface, through which users can collaboratively manage network data and models in a web browser. The web interface allows multiple users to graphically access, edit and share their networks, run apps and view results. Through apps, which are located on the server, the web interface can give users access to external data sources and models without the need to install or configure any software. This also ensures model results can be reproduced by removing platform or version dependence. Managing data and deploying models via the web interface provides a way for multiple modellers to collaboratively manage data, deploy and monitor model runs and analyse results.
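
    Hydra Platform's real endpoint names and payload schema are not reproduced here; the following is only a hedged sketch of what pushing a small network topology to a JSON web API of this general kind might look like (the server URL, method name and field names are hypothetical):

```python
import json
import requests  # third-party; pip install requests

# Hypothetical service URL and payload layout -- illustrative only,
# not Hydra Platform's real schema.
SERVER = "http://example.org/hydra/json"

network = {
    "name": "demo water network",
    "nodes": [
        {"id": -1, "name": "reservoir", "x": 0.0, "y": 0.0},
        {"id": -2, "name": "city demand", "x": 10.0, "y": 5.0},
    ],
    "links": [
        {"id": -1, "name": "main canal", "node_1_id": -1, "node_2_id": -2},
    ],
}

def call(method, **kwargs):
    """Wrap a JSON-RPC-style call of the form {"method_name": {...args...}}."""
    resp = requests.post(SERVER, data=json.dumps({method: kwargs}),
                         headers={"Content-Type": "application/json"},
                         timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    result = call("add_network", net=network)   # hypothetical method name
    print("server assigned network id:", result.get("id"))
```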

  20. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    DOE PAGES

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; ...

    2015-07-08

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies.

  1. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type to quantify and characterize geospatial processes. Visualization of these data, due to overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made interactivity and visualization capabilities ubiquitously available that once existed only in desktop software. plas.io is an open-source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing the reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.

  2. A prospective international cooperative information technology platform built using open-source tools for improving the access to and safety of bone marrow transplantation in low- and middle-income countries.

    PubMed

    Agarwal, Rajat Kumar; Sedai, Amit; Dhimal, Sunil; Ankita, Kumari; Clemente, Luigi; Siddique, Sulman; Yaqub, Naila; Khalid, Sadaf; Itrat, Fatima; Khan, Anwar; Gilani, Sarah Khan; Marwah, Priya; Soni, Rajpreet; Missiry, Mohamed El; Hussain, Mohamed Hamed; Uderzo, Cornelio; Faulkner, Lawrence

    2014-01-01

    Jagriti Innovations developed a collaboration tool in partnership with the Cure2Children Foundation that has been used by health professionals in Italy, Pakistan, and India for the collaborative management of patients undergoing bone marrow transplantation (BMT) for thalassemia major since August 2008. This online open-access database covers data recording, analysis, and reporting, besides enabling knowledge exchange, telemedicine, capacity building, and quality assurance. As of February 2014, over 2400 patients have been registered and 112 BMTs have been performed, with outcomes comparable to international standards but at a fraction of the cost. This approach avoids medical emigration and contributes to local healthcare strengthening and competitiveness. This paper presents the experience and clinical outcomes associated with the use of this platform, built using open-source tools and focusing on a locally pertinent tertiary care procedure: BMT.

  3. Hard real-time closed-loop electrophysiology with the Real-Time eXperiment Interface (RTXI)

    PubMed Central

    George, Ansel; Dorval, Alan D.; Christini, David J.

    2017-01-01

    The ability to experimentally perturb biological systems has traditionally been limited to static pre-programmed or operator-controlled protocols. In contrast, real-time control allows dynamic probing of biological systems with perturbations that are computed on-the-fly during experimentation. Real-time control applications for biological research are available; however, these systems are costly and often restrict the flexibility and customization of experimental protocols. The Real-Time eXperiment Interface (RTXI) is an open source software platform for achieving hard real-time data acquisition and closed-loop control in biological experiments while retaining the flexibility needed for experimental settings. RTXI has enabled users to implement complex custom closed-loop protocols in single cell, cell network, animal, and human electrophysiology studies. RTXI is also used as a free and open source, customizable electrophysiology platform in open-loop studies requiring online data acquisition, processing, and visualization. RTXI is easy to install, can be used with an extensive range of external experimentation and data acquisition hardware, and includes standard modules for implementing common electrophysiology protocols. PMID:28557998

  4. Free-access open-source e-learning in comprehensive neurosurgery skills training.

    PubMed

    Jotwani, Payal; Srivastav, Vinkle; Tripathi, Manjul; Deo, Rama Chandra; Baby, Britty; Damodaran, Natesan; Singh, Ramandeep; Suri, Ashish; Bettag, Martin; Roy, Tara Sankar; Busert, Christoph; Mehlitz, Marcus; Lalwani, Sanjeev; Garg, Kanwaljeet; Paul, Kolin; Prasad, Sanjiva; Banerjee, Subhashis; Kalra, Prem; Kumar, Subodh; Sharma, Bhavani Shankar; Mahapatra, Ashok Kumar

    2014-01-01

    Since the end of the last century, technology has taken a front seat in the dissemination of medical education. Advancements of technology in neurosurgery and traditional training methods are now being challenged by legal and ethical concerns for patient safety, resident work-hour restrictions and the cost of operating-room time. To supplement the existing neurosurgery education pattern, various e-learning platforms have been introduced as structured, interactive learning systems. This study focuses on the concept, formulation, development and impact of web-based learning platforms dedicated to the neurosurgery discipline to disseminate education, supplement surgical knowledge and improve the skills of neurosurgeons. The 'Neurosurgery Education and Training School (NETS)' e-learning platform integrates web-based technologies such as a 'Content Management System' for organizing the educational material and a 'Learning Management System' for updating neurosurgeons. The NETS discussion forum networks neurosurgeons, neuroscientists and neuro-technologists across the globe, facilitating collaborative translational research. Multi-authored neurosurgical e-learning material supplements the deficiencies of regular time-bound education. The interactive, open-source, global, free-access e-learning platform of NETS has (1) around 425 visitors/month from 73 countries, with a new-to-returning visitor ratio of 42.3:57.7; (2) 64,380 views from 190 subscribers for surgical videos, 3-D animations and graphics-based training modules; and (3) an average of 402 views per post. The e-learning platforms provide updated educational content that makes them 'quick, surf, find and extract' resources. e-Learning tools like web-based education, social interactive platforms and question-answer forums will save neurosurgeons seeking knowledge unnecessary expenditure of time and travel. The need for free-access platforms is more pronounced for neurosurgeons and patients in developing nations.

  5. Open, Cross Platform Chemistry Application Unifying Structure Manipulation, External Tools, Databases and Visualization

    DTIC Science & Technology

    2012-11-27

    with powerful analysis tools and an informatics approach leveraging best-of-breed NoSQL databases, in order to store, search and retrieve relevant... dictionaries, and JavaScript also has good support. The MongoDB project [15] was chosen as a scalable NoSQL data store for the cheminformatics components

  6. Open-Source as a strategy for operational software - the case of Enki

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur; Bruland, Oddbjørn

    2014-05-01

    Since 2002, SINTEF Energy has been developing what is now known as the Enki modelling system. This development has been financed by Norway's largest hydropower producer Statkraft, motivated by a desire for distributed hydrological models in operational use. As the owner of the source code, Statkraft has recently decided on Open Source as a strategy for further development, and for migration from an R&D context to operational use. A cooperation project is currently being carried out between SINTEF Energy, seven large Norwegian hydropower producers including Statkraft, three universities and one software company. Of course, the most immediate task is that of software maturing. A more important challenge, however, is one of gaining experience within the operational hydropower industry. A transition from lumped to distributed models is likely to also require revision of measurement programs, calibration strategies, and the use of GIS and modern data sources like weather radar and satellite imagery. On the other hand, map-based visualisations enable a richer information exchange between hydrologic forecasters and power market traders. The operating context of a distributed hydrology model within hydropower planning is far from settled. Being both a modelling framework and a library of plugin routines to build models from, Enki supports the flexibility needed in this situation. Recent development has separated the core from the user interface, paving the way for a scripting API, cross-platform compilation, and front-end programs serving different degrees of flexibility, robustness and security. The open source strategy invites anyone to use Enki and to develop and contribute new modules. Once tested, the same modules are available for the operational versions of the program. A core challenge is to offer rigid testing procedures and mechanisms to reject routines in an operational setting, without limiting the experimentation with new modules. The Open Source strategy also has implications for building and maintaining competence around the source code and the advanced hydrological and statistical routines in Enki. Originally developed by hydrologists, the Enki code is now approaching a state where maintenance requires a background in professional software development. Without the advantage of proprietary source code, both hydrologic improvements and software maintenance depend on donations or development support on a case-to-case basis, a situation well known within the open source community. It remains to be seen whether these mechanisms suffice to keep Enki at the maintenance level required by the hydropower sector. ENKI is available from www.opensource-enki.org.

  7. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721

  8. The EarthServer Federation: State, Role, and Contribution to GEOSS

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Baumann, Peter

    2016-04-01

    The intercontinental EarthServer initiative has established a European datacube platform with proven scalability: known databases exceed 100 TB, and single queries have been split across more than 1,000 cloud nodes. Because its service interface is rigorously based on the OGC "Big Geo Data" standards, Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS), a series of clients can dock into the services, ranging from the open-source OpenLayers and QGIS, over the open-source NASA WorldWind, to the proprietary ESRI ArcGIS. Datacube fusion in a "mix and match" style is supported by the platform technology, the rasdaman Array Database System, which transparently federates queries so that users simply approach any node of the federation to access any data item, internally optimized for minimal data transfer. Notably, rasdaman is part of the GEOSS GCI. NASA is contributing its Web WorldWind virtual globe for user-friendly data extraction, navigation, and analysis. Integrated datacube / metadata queries are contributed by CITE. Current federation members include ESA (managed by MEEO s.r.l.), Plymouth Marine Laboratory (PML), the European Centre for Medium-Range Weather Forecasts (ECMWF), Australia's National Computational Infrastructure, and Jacobs University (adding in Planetary Science). Further data centers have expressed interest in joining. We present the EarthServer approach, discuss its underlying technology, and illustrate the contribution this datacube platform can make to GEOSS.
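
    As a hedged illustration of the WCS/WCPS access pattern described above (the endpoint URL and coverage name are placeholders, and the query is a simplified WCPS-style expression rather than one tested against a live server):

```python
import requests  # pip install requests

# Placeholder datacube endpoint and coverage name -- illustrative only.
ENDPOINT = "https://example.org/rasdaman/ows"

# A WCPS-style query: trim a datacube in space and time and encode it as CSV.
wcps_query = """
for c in ( DemoTemperatureCube )
return encode(
    c[ Lat(50:55), Long(5:10), ansi("2015-01-01":"2015-12-31") ],
    "csv" )
"""

# Exact key-value parameters can differ between server versions; this follows
# the common WCS processing-extension pattern as an assumption.
response = requests.get(
    ENDPOINT,
    params={"service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": wcps_query},
    timeout=60,
)
response.raise_for_status()
print(response.text[:200])  # first values of the extracted slice
```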

  9. The characteristics and dynamics of wave-driven flow across a platform coral reef in the Red Sea

    NASA Astrophysics Data System (ADS)

    Lentz, S. J.; Churchill, J. H.; Davis, K. A.; Farrar, J. T.; Pineda, J.; Starczak, V.

    2016-02-01

    Current dynamics across a platform reef in the Red Sea near Jeddah, Saudi Arabia, are examined using 18 months of current profile, pressure, surface wave, and wind observations. The platform reef is 700 m long, 200 m across with spatial and temporal variations in water depth over the reef ranging from 0.6 to 1.6 m. Surface waves breaking at the seaward edge of the reef cause a 2-10 cm setup of sea level that drives cross-reef currents of 5-20 cm s-1. Bottom stress is a significant component of the wave setup balance in the surf zone. Over the reef flat, where waves are not breaking, the cross-reef pressure gradient associated with wave setup is balanced by bottom stress. The quadratic drag coefficient for the depth-average flow decreases with increasing water depth from Cda = 0.17 in 0.4 m of water to Cda = 0.03 in 1.2 m of water. The observed dependence of the drag coefficient on water depth is consistent with open-channel flow theory and a hydrodynamic roughness of zo = 0.06 m. A simple one-dimensional model driven by incident surface waves and wind stress accurately reproduces the observed depth-averaged cross-reef currents and a portion of the weaker along-reef currents over the focus reef and two other Red Sea platform reefs. The model indicates the cross-reef current is wave forced and the along-reef current is partially wind forced.
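
    The reef-flat dynamics summarized above can be written compactly. In a hedged, textbook-style form (not copied from the paper itself), the depth-averaged cross-reef momentum balance and the open-channel estimate of the drag coefficient are:

```latex
% Wave setup gradient balanced by quadratic bottom stress over the reef flat
g\,\frac{\partial \eta}{\partial x} \approx -\,\frac{C_{da}\,|u|\,u}{h},
\qquad
% Depth-averaged drag coefficient implied by a logarithmic boundary layer
C_{da} = \left[\frac{\kappa}{\ln(h/z_0) - 1}\right]^{2},
\quad \kappa \approx 0.4,\ z_0 \approx 0.06~\mathrm{m}.
```

    With z_0 = 0.06 m this expression gives C_da of roughly 0.2 at h = 0.4 m and roughly 0.04 at h = 1.2 m, broadly consistent with the depth dependence reported in the abstract.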

  10. Neural ensemble communities: open-source approaches to hardware for large-scale electrophysiology.

    PubMed

    Siegle, Joshua H; Hale, Gregory J; Newman, Jonathan P; Voigts, Jakob

    2015-06-01

    One often-overlooked factor when selecting a platform for large-scale electrophysiology is whether or not a particular data acquisition system is 'open' or 'closed': that is, whether or not the system's schematics and source code are available to end users. Open systems have a reputation for being difficult to acquire, poorly documented, and hard to maintain. With the arrival of more powerful and compact integrated circuits, rapid prototyping services, and web-based tools for collaborative development, these stereotypes must be reconsidered. We discuss some of the reasons why multichannel extracellular electrophysiology could benefit from open-source approaches and describe examples of successful community-driven tool development within this field. In order to promote the adoption of open-source hardware and to reduce the need for redundant development efforts, we advocate a move toward standardized interfaces that connect each element of the data processing pipeline. This will give researchers the flexibility to modify their tools when necessary, while allowing them to continue to benefit from the high-quality products and expertise provided by commercial vendors. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Summary and evaluation of the Strategic Defense Initiative Space Power Architecture Study

    NASA Technical Reports Server (NTRS)

    Edenburn, M. (Editor); Smith, J. M. (Editor)

    1989-01-01

    The Space Power Architecture Study (SPAS) identified and evaluated power subsystem options for multimegawatt electric (MMWE) space based weapons and surveillance platforms for the Strategic Defense Initiative (SDI) applications. Steady state requirements of less than 1 MMWE are adequately covered by the SP-100 nuclear space power program and hence were not addressed in the SPAS. Four steady state power systems less than 1 MMWE were investigated with little difference between them on a mass basis. The majority of the burst power systems utilized H(2) from the weapons and were either closed (no effluent), open (effluent release) or steady state with storage (no effluent). Closed systems used nuclear or combustion heat source with thermionic, Rankine, turboalternator, fuel cell and battery conversion devices. Open systems included nuclear or combustion heat sources using turboalternator, magnetohydrodynamic, fuel cell or battery power conversion devices. The steady state systems with storage used the SP-100 or Star-M reactors as energy sources and flywheels, fuel cells or batteries to store energy for burst applications. As with other studies the open systems are by far the lightest, most compact and simplist (most reliable) systems. However, unlike other studies the SPAS studied potential platform operational problems caused by effluents or vibration.

  12. FreeSASA: An open source C library for solvent accessible surface area calculations.

    PubMed

    Mitternacht, Simon

    2016-01-01

    Calculating solvent accessible surface areas (SASA) is a run-of-the-mill calculation in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards' and Shrake and Rupley's approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
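
    The abstract mentions command-line and Python interfaces; the sketch below follows the FreeSASA Python binding as we recall it (class and function names should be checked against the current release, and the PDB filename is a placeholder):

```python
import freesasa  # pip install freesasa

# Load a structure from a PDB file (placeholder filename) using the default
# atom classification and radii.
structure = freesasa.Structure("protein.pdb")

# Run the SASA calculation with default parameters (Lee & Richards by default,
# if memory serves; check the library documentation).
result = freesasa.calc(structure)

print("total SASA: %.1f A^2" % result.totalArea())

# Break the area down into classes (e.g. polar / apolar).
classes = freesasa.classifyResults(result, structure)
for name, area in classes.items():
    print("%-8s %.1f A^2" % (name, area))
```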

  13. SU-C-BRC-06: OpenCL-Based Cross-Platform Monte Carlo Simulation Package for Carbon Ion Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, N; Tian, Z; Pompos, A

    2016-06-15

    Purpose: Monte Carlo (MC) simulation is considered to be the most accurate method for calculation of absorbed dose and fundamental physical quantities related to biological effects in carbon ion therapy. Its long computation time impedes clinical and research applications. We have developed an MC package, goCMC, on parallel processing platforms, aiming at achieving accurate and efficient simulations for carbon therapy. Methods: goCMC was developed under the OpenCL framework. It supports transport simulation in voxelized geometry with kinetic energy up to 450 MeV/u. A Class II condensed history algorithm was employed for charged particle transport, with stopping power computed via the Bethe-Bloch equation. Secondary electrons were not transported; their energy was deposited locally. Energy straggling and multiple scattering were modeled. Production of secondary charged particles from nuclear interactions was implemented based on cross section and yield data from Geant4. They were transported via the condensed history scheme. goCMC supports scoring various quantities of interest, e.g. physical dose, particle fluence, spectrum, linear energy transfer, and positron-emitting nuclei. Results: goCMC has been benchmarked against Geant4 with different phantoms and beam energies. For 100 MeV/u, 250 MeV/u and 400 MeV/u beams impinging on a water phantom, the range difference was 0.03 mm, 0.20 mm and 0.53 mm, and the mean dose difference was 0.47%, 0.72% and 0.79%, respectively. goCMC can run on various computing devices. Depending on the beam energy and voxel size, it took 20-100 seconds to simulate 10^7 carbon ions on an AMD Radeon GPU card. The corresponding CPU time for Geant4 with the same setup was 60-100 hours. Conclusion: We have developed an OpenCL-based cross-platform carbon MC simulation package, goCMC. Its accuracy, efficiency and portability make goCMC attractive for research and clinical applications in carbon therapy.
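
    The abstract notes that stopping power is computed with the Bethe-Bloch equation. A minimal, uncorrected Bethe formula for the mass stopping power in water (no density-effect or shell corrections; the constants and the water mean excitation energy are commonly quoted values, not goCMC's internals) can be sketched as:

```python
import math

# Physical constants (commonly quoted values)
K       = 0.307075       # MeV cm^2 / mol, 4*pi*N_A*r_e^2*m_e*c^2
ME_C2   = 0.510999       # electron rest energy, MeV
Z_OVER_A_WATER = 0.5551  # <Z/A> for water, mol/g
I_WATER = 75.0e-6        # mean excitation energy of water, MeV (~75 eV)

def bethe_mass_stopping_power(z_proj, kinetic_energy_mev_per_u, mass_mev_per_u=931.494):
    """Uncorrected Bethe mass stopping power in water, MeV cm^2 / g."""
    gamma = 1.0 + kinetic_energy_mev_per_u / mass_mev_per_u
    beta2 = 1.0 - 1.0 / gamma**2
    w_max = 2.0 * ME_C2 * beta2 * gamma**2        # simplified max energy transfer
    log_term = math.log(w_max / I_WATER)
    return K * z_proj**2 * Z_OVER_A_WATER / beta2 * (log_term - beta2)

if __name__ == "__main__":
    # Carbon ion (z = 6) at a few therapy-relevant energies per nucleon.
    for e in (100.0, 250.0, 400.0):
        print(f"{e:5.0f} MeV/u : {bethe_mass_stopping_power(6, e):7.1f} MeV cm^2/g")
```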

  14. Empowering open systems through cross-platform interoperability

    NASA Astrophysics Data System (ADS)

    Lyke, James C.

    2014-06-01

    Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.

  15. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    PubMed Central

    2012-01-01

    Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies, and computing the exact probability of the data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement, and dynamically attach a number of output processors. The user application defines jobs in a plug-in-like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. The models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable the exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878
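
    As a small, self-contained illustration of the kind of exact calculation described (not the framework's own API), the probability of an allele-count configuration under the infinite-alleles model is given by the Ewens sampling formula and can be computed directly:

```python
from math import factorial

def ewens_probability(counts, theta):
    """
    Ewens sampling formula: probability of observing a_j allele types
    represented j times each (counts = {j: a_j}) in a sample of size n,
    under the infinite-alleles model with scaled mutation rate theta.
    """
    n = sum(j * a for j, a in counts.items())
    # Rising factorial theta_(n) = theta * (theta + 1) * ... * (theta + n - 1)
    rising = 1.0
    for k in range(n):
        rising *= theta + k
    prob = factorial(n) / rising
    for j, a in counts.items():
        prob *= (theta / j) ** a / factorial(a)
    return prob

# Example: a sample of n = 4 genes in which two alleles are each seen twice
# (a_2 = 2), evaluated at theta = 1.0; the exact answer is 3/24 = 0.125.
print(ewens_probability({2: 2}, theta=1.0))
```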

  16. OpenSim: open-source software to create and analyze dynamic simulations of movement.

    PubMed

    Delp, Scott L; Anderson, Frank C; Arnold, Allison S; Loan, Peter; Habib, Ayman; John, Chand T; Guendelman, Eran; Thelen, Darryl G

    2007-11-01

    Dynamic simulations of movement allow one to study neuromuscular coordination, analyze athletic performance, and estimate internal loading of the musculoskeletal system. Simulations can also be used to identify the sources of pathological movement and establish a scientific basis for treatment planning. We have developed a freely available, open-source software system (OpenSim) that lets users develop models of musculoskeletal structures and create dynamic simulations of a wide variety of movements. We are using this system to simulate the dynamics of individuals with pathological gait and to explore the biomechanical effects of treatments. OpenSim provides a platform on which the biomechanics community can build a library of simulations that can be exchanged, tested, analyzed, and improved through a multi-institutional collaboration. Developing software that enables a concerted effort from many investigators poses technical and sociological challenges. Meeting those challenges will accelerate the discovery of principles that govern movement control and improve treatments for individuals with movement pathologies.

  17. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches that used XML, and building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
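
    The project's exact JSON schema is not reproduced here; purely as an illustration of the idea of a JSON document carrying a small molecule together with calculation metadata (all field names and values below are invented for the sketch):

```python
import json

# Illustrative only: an invented JSON layout for a water molecule and the
# metadata of a calculation, in the spirit of the approach described above.
record = {
    "name": "water",
    "formula": "H2O",
    "atoms": {
        "elements": ["O", "H", "H"],
        "coords_angstrom": [
            [0.000,  0.000,  0.119],
            [0.000,  0.763, -0.477],
            [0.000, -0.763, -0.477],
        ],
    },
    "calculation": {
        "code": "NWChem",
        "method": "B3LYP",
        "basis": "6-31G*",
        "properties": {"total_energy_hartree": -76.4},  # illustrative value
    },
}

payload = json.dumps(record, indent=2)
print(payload)                       # ready to store or POST to a REST endpoint
roundtrip = json.loads(payload)      # the structure parses identically across languages
assert roundtrip["atoms"]["elements"][0] == "O"
```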

  18. Specifications and implementation of the RT MHD control system for the EC launcher of FTU

    NASA Astrophysics Data System (ADS)

    Galperti, C.; Alessi, E.; Boncagni, L.; Bruschi, A.; Granucci, G.; Grosso, A.; Iannone, F.; Marchetto, C.; Nowak, S.; Panella, M.; Sozzi, C.; Tilia, B.

    2012-09-01

    To perform real-time plasma control experiments with EC heating waves using the new fast launcher installed on FTU, a dedicated data acquisition and processing system has recently been designed. A prototype version of the acquisition/control system has been developed and will be tested on the FTU machine in its next experimental campaign. The open-source framework MARTe (Multi-threaded Application Real-Time executor) on the Linux/RTAI real-time operating system has been chosen as the software platform for the control system. Standard open-architecture industrial PCs, based on either the VME bus or the CompactPCI bus and equipped with standard input/output cards, are the chosen hardware platform.

  19. RAPID and DDS

    NASA Technical Reports Server (NTRS)

    Utz, Hans Heinrich

    2011-01-01

    This talk gives an overview of the Robot Applications Programmers Interface Delegate (RAPID) as well as the distributed systems middleware Data Distribution Service (DDS). DDS is an open software standard; RAPID is cleared for open-source release under NOSA. RAPID specifies data structures and semantics for high-level telemetry published by NASA robotic software. These data structures are supported by multiple robotic platforms at Johnson Space Center (JSC), Jet Propulsion Laboratory (JPL) and Ames Research Center (ARC), providing high-level interoperability between those platforms. DDS is used as the middleware for data transfer. The feature set of the middleware heavily influences the design decisions made in the RAPID specification, so it is appropriate to discuss both in this introductory talk.

  20. Integrated genome browser: visual analytics platform for genomics.

    PubMed

    Freese, Nowlan H; Norris, David C; Loraine, Ann E

    2016-07-15

    Genome browsers that support fast navigation through vast datasets and provide interactive visual analytics functions can help scientists achieve deeper insight into biological systems. Toward this end, we developed Integrated Genome Browser (IGB), a highly configurable, interactive and fast open-source desktop genome browser. Here we describe multiple updates to IGB, including all-new capabilities to display and interact with data from high-throughput sequencing experiments. To demonstrate, we describe example visualizations and analyses of datasets from RNA-Seq, ChIP-Seq and bisulfite sequencing experiments. Understanding results from genome-scale experiments requires viewing the data in the context of reference genome annotations and other related datasets. To facilitate this, we enhanced IGB's ability to consume data from diverse sources, including Galaxy, Distributed Annotation and IGB-specific Quickload servers. To support future visualization needs as new genome-scale assays enter wide use, we transformed the IGB codebase into a modular, extensible platform for developers to create and deploy all-new visualizations of genomic data. IGB is open source and is freely available from http://bioviz.org/igb. Contact: aloraine@uncc.edu. © The Author 2016. Published by Oxford University Press.

  1. Neuroimaging Data Sharing on the Neuroinformatics Database Platform

    PubMed Central

    Book, Gregory A; Stevens, Michael; Assaf, Michal; Glahn, David; Pearlson, Godfrey D

    2015-01-01

    We describe the Neuroinformatics Database (NiDB), an open-source database platform for archiving, analysis, and sharing of neuroimaging data. Data from the multi-site projects Autism Brain Imaging Data Exchange (ABIDE), Bipolar-Schizophrenia Network on Intermediate Phenotypes parts one and two (B-SNIP1, B-SNIP2), and Monetary Incentive Delay task (MID) are available for download from the public instance of NiDB, with more projects sharing data as it becomes available. As demonstrated by making several large datasets available, NiDB is an extensible platform appropriately suited to archive and distribute shared neuroimaging data. PMID:25888923

  2. Moodle 2.0: Shifting from a Learning Toolkit to an Open Learning Platform

    NASA Astrophysics Data System (ADS)

    Alier, Marc; Casañ, María José; Piguillem, Jordi

    Learning Management Systems (LMS) have reached a plateau of maturity in features, application to teaching practices and wide adoption by learning institutions. But Web 2.0 brings new kinds of tools, services and ways of using the web, both personally and socially. Some educators and learners have started to advocate for a new approach to framing one's learning sources, moving from the LMS course space towards Personal Learning Environments (PLE). But PLEs are characterized by their absence of structure, which is just what open standards and mashup techniques provide. Based on 5 years of participative observation research, this article explains the changes in architecture made in the second version of Moodle, why these changes happened, and what the next steps should be so that Moodle can shift from being a learning tool to a true open learning platform.

  3. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software packages. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
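
    The review's own criteria and weights are not reproduced here; a toy weighted-sum multi-criteria scoring (package names, criteria, scores and weights all invented) illustrates the kind of MCDA ranking used to shortlist packages:

```python
# Toy multi-criteria decision analysis: weighted-sum scoring of candidate
# risk-modelling packages. Package names, criteria, scores and weights are
# all invented for illustration.
criteria_weights = {
    "documentation": 0.25,
    "active_support": 0.25,
    "hazard_coverage": 0.30,
    "user_defined_exposure": 0.20,
}

scores = {  # each criterion scored 0-5
    "PackageA": {"documentation": 4, "active_support": 5,
                 "hazard_coverage": 3, "user_defined_exposure": 2},
    "PackageB": {"documentation": 3, "active_support": 3,
                 "hazard_coverage": 5, "user_defined_exposure": 5},
    "PackageC": {"documentation": 5, "active_support": 2,
                 "hazard_coverage": 2, "user_defined_exposure": 3},
}

def weighted_score(package_scores):
    return sum(criteria_weights[c] * s for c, s in package_scores.items())

ranking = sorted(scores, key=lambda p: weighted_score(scores[p]), reverse=True)
for p in ranking:
    print(f"{p}: {weighted_score(scores[p]):.2f}")
```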

  4. OpenMebius: an open source software for isotopically nonstationary 13C-based metabolic flux analysis.

    PubMed

    Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi

    2014-01-01

    The in vivo measurement of metabolic flux by (13)C-based metabolic flux analysis ((13)C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a (13)C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas (13)C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary (13)C metabolic flux analysis (INST-(13)C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms in stationary phase). Here, the development of a novel open source software package for INST-(13)C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) provides the function of autogenerating metabolic models for simulating isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-(13)C-MFA. Confidence intervals determined by INST-(13)C-MFA were narrower than those determined by conventional methods, indicating the potential of INST-(13)C-MFA for precise metabolic flux analysis. OpenMebius is open source software for the general application of INST-(13)C-MFA.
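
    The nonlinear-fitting step described above can be illustrated with a minimal sketch (a toy one-pool model, not the OpenMebius implementation, which is a Windows application): with a fully 13C-labeled substrate, the fractional enrichment of a single pool of size P fed by flux v follows x(t) = 1 - exp(-(v/P)t), and the flux-to-pool-size ratio can be recovered by least-squares fitting of the simulated to the measured enrichment.

```python
# Minimal illustration of isotopically nonstationary flux fitting
# (toy one-pool model; not the OpenMebius implementation).
import numpy as np
from scipy.optimize import least_squares

def labeling(t, v_over_p):
    """Fractional 13C enrichment of a single pool fed by a fully labeled
    substrate: dx/dt = (v/P) * (1 - x), x(0) = 0."""
    return 1.0 - np.exp(-v_over_p * t)

# Synthetic "measured" enrichment time course (true v/P = 0.8 per minute).
t_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])                 # minutes
rng = np.random.default_rng(0)
x_obs = labeling(t_obs, 0.8) + rng.normal(0.0, 0.02, t_obs.size)

# Fit v/P by minimizing the residual between model and measurement.
fit = least_squares(lambda p: labeling(t_obs, p[0]) - x_obs,
                    x0=[0.1], bounds=(0.0, np.inf))
print(f"estimated v/P = {fit.x[0]:.3f} per minute")
```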

  5. The Application of the Open Pharmacological Concepts Triple Store (Open PHACTS) to Support Drug Discovery Research

    PubMed Central

    Ratnam, Joseline; Zdrazil, Barbara; Digles, Daniela; Cuadrado-Rodriguez, Emiliano; Neefs, Jean-Marc; Tipney, Hannah; Siebes, Ronald; Waagmeester, Andra; Bradley, Glyn; Chau, Chau Han; Richter, Lars; Brea, Jose; Evelo, Chris T.; Jacoby, Edgar; Senger, Stefan; Loza, Maria Isabel; Ecker, Gerhard F.; Chichester, Christine

    2014-01-01

    Integration of open access, curated, high-quality information from multiple disciplines in the Life and Biomedical Sciences provides a holistic understanding of the domain. Additionally, the effective linking of diverse data sources can unearth hidden relationships and guide potential research strategies. However, given the lack of consistency between descriptors and identifiers used in different resources and the absence of a simple mechanism to link them, gathering and combining relevant, comprehensive information from diverse databases remains a challenge. The Open Pharmacological Concepts Triple Store (Open PHACTS) is an Innovative Medicines Initiative project that uses semantic web technology approaches to enable scientists to easily access and process data from multiple sources to solve real-world drug discovery problems. The project draws together sources of publicly available pharmacological, physicochemical and biomolecular data, represents them in a stable infrastructure and provides well-defined information exploration and retrieval methods. Here, we highlight the utility of this platform in conjunction with workflow tools to solve pharmacological research questions that require interoperability between target, compound, and pathway data. Use cases presented herein cover 1) the comprehensive identification of chemical matter for a dopamine receptor drug discovery program; 2) the identification of compounds active against all targets in the Epidermal growth factor receptor (ErbB) signaling pathway that have a relevance to disease; and 3) the evaluation of established targets in the Vitamin D metabolism pathway to aid novel Vitamin D analogue design. The example workflows presented illustrate how the Open PHACTS Discovery Platform can be used to exploit existing knowledge and generate new hypotheses in the process of drug discovery. PMID:25522365

  6. ERDDAP: Reducing Data Friction with an Open Source Data Platform

    NASA Astrophysics Data System (ADS)

    O'Brien, K.

    2017-12-01

    Data friction is not just an issue facing interdisciplinary research. Often, significant data friction exists even within disciplines: differing formats, limited metadata and non-existent machine-to-machine data access all make successful interdisciplinary cooperation that much harder. Reducing data friction within disciplines is therefore a crucial first step toward better overall collaboration. ERDDAP, an open source data platform developed at NOAA's Southwest Fisheries Center, is well poised to improve data usability and understanding and to reduce data friction, in both single- and multi-disciplinary research. By virtue of its ability to integrate data of varying formats and provide RESTful user access to data and metadata, use of ERDDAP has grown substantially throughout the ocean data community. ERDDAP also supports standards such as the DAP data protocol, the Climate and Forecast (CF) metadata conventions and the BagIt standard for data archival. In this presentation, we will discuss the advantages of using ERDDAP as a data platform. We will also show specific use cases where utilizing ERDDAP has reduced friction within a single discipline (physical oceanography) and improved interdisciplinary collaboration as well.
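
    Because ERDDAP exposes each dataset through a RESTful URL whose query string selects variables and constraints, tabular subsets can be pulled straight into an analysis environment. The sketch below uses a hypothetical server and dataset ID; the tabledap URL pattern and the two-row CSV header (names, then units) follow ERDDAP's documented behavior.

```python
# Sketch: pull a subset of a tabledap dataset from an ERDDAP server as CSV.
# The server URL and dataset ID below are placeholders, not a specific deployment.
import pandas as pd

server = "https://example.org/erddap"          # hypothetical ERDDAP instance
dataset_id = "exampleBuoyTemperature"          # hypothetical dataset
query = (
    "time,latitude,longitude,sea_water_temperature"
    "&time>=2020-01-01T00:00:00Z&time<=2020-01-07T00:00:00Z"
)
url = f"{server}/tabledap/{dataset_id}.csv?{query}"

# ERDDAP CSV responses carry a second header row with units; skip it.
df = pd.read_csv(url, skiprows=[1], parse_dates=["time"])
print(df.head())
```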

  7. Slope climbing challenges, fear of heights, anxiety and time of the day.

    PubMed

    Ennaceur, A; Hussain, M D; Abuhamdah, R M; Mostafa, R M; Chazot, P L

    2017-01-01

    When exposed to an unfamiliar open space, animals experience fear and attempt to find an escape route. Anxiety emerges when animals are confronted with a challenging obstacle to this fear-motivated escape. High-anxiety animals do not take risks; they avoid the challenge. The present experiments investigated this risk-avoidant behavior in mice. In experiment 1, BALB/c, C57BL/6J and CD-1 mice were exposed to a large platform with downward inclined steep slopes attached on two opposite sides. The platform was elevated 75 and 100 cm from the ground, in a standard (SPDS) and in a raised (RPDS) configuration, respectively. In experiment 2, the platform was elevated 75 cm from the ground. Mice had to climb onto a stand at the top of upward inclined slopes (SPUS). In experiment 3, BALB/c mice were exposed to SPDS with steep or shallow slopes either in early morning or in late afternoon. In all 3 test configurations, mice spent more time in the areas adjacent to the slopes than in the areas adjacent to the void; however, only C57BL/6J and CD-1 crossed onto the slopes in SPDS and crossed onto the stands in SPUS, whereas BALB/c remained on the platform in SPDS and explored the slopes in SPUS. Elevation of the platform from the ground reduced the crossings onto the slopes in C57BL/6J and CD-1, and no differences were observed between BALB/c and C57BL/6J. BALB/c mice demonstrated no difference in anxiety when tested in early morning or late afternoon; they crossed onto shallow slopes and avoided the steep ones. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. phpMs: A PHP-Based Mass Spectrometry Utilities Library.

    PubMed

    Collins, Andrew; Jones, Andrew R

    2018-03-02

    The recent establishment of cloud computing, high-throughput networking, and more versatile web standards and browsers has led to a renewed interest in web-based applications. While big data has traditionally been the domain of optimized desktop and server applications, it is now possible to store vast amounts of data and perform the necessary calculations offsite in cloud storage and computing providers, with the results visualized in a high-quality cross-platform interface via a web browser. There are a number of emerging platforms for cloud-based mass spectrometry data analysis; however, there is limited pre-existing code accessible to web developers, especially for those who are constrained to a shared hosting environment where Java and C applications are often forbidden by the hosting provider. To remedy this, we provide an open-source mass spectrometry library for one of the most commonly used web development languages, PHP. Our new library, phpMs, provides objects for storing and manipulating spectra and identification data, utilities for file reading and writing, calculations, peptide fragmentation, and protein digestion, as well as a software interface for controlling search engines. We provide a working demonstration of some of the capabilities at http://pgb.liv.ac.uk/phpMs.

  9. The scheme and research of TV series multidimensional comprehensive evaluation on cross-platform

    NASA Astrophysics Data System (ADS)

    Chai, Jianping; Bai, Xuesong; Zhou, Hongjun; Yin, Fulian

    2016-10-01

    To address the shortcomings of traditional comprehensive evaluation systems for TV programs, such as reliance on a single data source, neglect of new media, and the high time cost and difficulty of conducting surveys, this paper proposes a new evaluation of TV series from the perspective of cross-platform, multidimensional evaluation after broadcasting. The scheme takes data collected directly from cable television and the Internet as its research objects. Based on the TOPSIS principle, the data are preprocessed and calculated into primary indicators that reflect different profiles of TV series viewing. Then, after reasonable weighting and summation by six methods (PCA, AHP, etc.), the primary indicators are combined into composite indices for different channels or websites. The scheme avoids the inefficiency and difficulty of surveys and manual scoring; at the same time, it not only reflects different dimensions of viewing but also combines TV media and new media, completing a cross-platform, multidimensional comprehensive evaluation of TV series.
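
    Since the scheme rests on the TOPSIS principle, a compact sketch of that step may help: each alternative (a channel or website) is scored by its relative closeness to an ideal solution built from weighted, normalized indicators. The indicator values and weights below are invented for illustration and are not taken from the paper.

```python
# Generic TOPSIS ranking sketch (illustrative data, benefit criteria only).
import numpy as np

# Rows: alternatives (e.g. channels); columns: viewing indicators.
X = np.array([[0.62, 1200.0, 35.0],
              [0.48, 1500.0, 28.0],
              [0.71,  900.0, 42.0]])
w = np.array([0.5, 0.3, 0.2])                   # indicator weights (sum to 1)

V = w * X / np.linalg.norm(X, axis=0)           # vector-normalize, then weight
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)

d_plus = np.linalg.norm(V - ideal, axis=1)      # distance to ideal solution
d_minus = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)        # composite index in [0, 1]

print(np.argsort(closeness)[::-1])              # alternatives ranked best first
```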

  10. Open-source platform to benchmark fingerprints for ligand-based virtual screening

    PubMed Central

    2013-01-01

    Similarity-search methods using molecular fingerprints are an important tool for ligand-based virtual screening. A huge variety of fingerprints exist, and their performance, usually assessed in retrospective benchmarking studies using data sets with known actives and known or assumed inactives, depends largely on the validation data sets and the similarity measure used. Comparing new methods to existing ones in any systematic way is rather difficult due to the lack of standard data sets and evaluation procedures. Here, we present a standard platform for the benchmarking of 2D fingerprints. The open-source platform contains all source code, structural data for the actives and inactives used (drawn from three publicly available collections of data sets), and lists of randomly selected query molecules to be used for statistically valid comparisons of methods. This allows the exact reproduction and comparison of results in future studies. The results for 12 standard fingerprints together with two simple baseline fingerprints, assessed by seven evaluation methods, are shown together with the correlations between methods. High correlations were found between the 12 fingerprints, and a careful statistical analysis showed that only the two baseline fingerprints were different from the others in a statistically significant way. High correlations were also found between six of the seven evaluation methods, indicating that despite their seeming differences, many of these methods are similar to each other. PMID:23721588
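
    As a concrete illustration of the kind of similarity search being benchmarked (a generic RDKit sketch, not the platform's own evaluation code), a query molecule can be ranked against a small library by the Tanimoto similarity of Morgan fingerprints; the molecules below are arbitrary examples.

```python
# Rank a few molecules against a query by Morgan-fingerprint Tanimoto similarity.
# Generic RDKit sketch; the benchmarking platform adds data sets and statistics.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def fingerprint(smiles, radius=2, n_bits=2048):
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)

query = fingerprint("CC(=O)Oc1ccccc1C(=O)O")          # aspirin as the "active" query
library = {
    "salicylic acid": "O=C(O)c1ccccc1O",
    "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
    "caffeine": "CN1C=NC2=C1C(=O)N(C)C(=O)N2C",
}
fps = {name: fingerprint(smi) for name, smi in library.items()}

scores = {name: DataStructs.TanimotoSimilarity(query, fp) for name, fp in fps.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:>15s}  Tanimoto = {score:.2f}")
```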

  11. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
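
    To make the idea of extracting digital traits concrete, the minimal sketch below (generic OpenCV code, not the Image Harvest API) segments roughly green plant pixels from a hypothetical side-view image and reports two simple traits, projected shoot area and plant height in pixels.

```python
# Minimal digital-trait extraction sketch (generic OpenCV, not Image Harvest).
# Assumes a side-view RGB image of a potted plant against a plain background.
import cv2
import numpy as np

img = cv2.imread("plant.png")                       # hypothetical input image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Keep roughly green pixels; thresholds would need tuning per imaging setup.
mask = cv2.inRange(hsv, np.array([35, 60, 60]), np.array([85, 255, 255]))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

area_px = int(cv2.countNonZero(mask))               # projected shoot area
points = cv2.findNonZero(mask)
x, y, w, h = cv2.boundingRect(points)               # bounding box of plant pixels

print(f"projected area: {area_px} px, height: {h} px, width: {w} px")
```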

  12. Open Source Hardware for DIY Environmental Sensing

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Hicks, S. D.; Damiano, S. G.; Montgomery, D. S.

    2014-12-01

    The Arduino open source electronics platform has been very popular within the DIY (Do It Yourself) community for several years, and it is now providing environmental science researchers with an inexpensive alternative to commercial data logging and transmission hardware. Here we present the designs for our latest series of custom Arduino-based dataloggers, which include wireless communication options like self-meshing radio networks and cellular phone modules. The main Arduino board uses a custom interface board to connect to various research-grade sensors to take readings of turbidity, dissolved oxygen, water depth and conductivity, soil moisture, solar radiation, and other parameters. Sensors with SDI-12 communications can be directly interfaced to the logger using our open Arduino-SDI-12 software library (https://github.com/StroudCenter/Arduino-SDI-12). Different deployment options are shown, like rugged enclosures to house the loggers and rigs for mounting the sensors in both fresh water and marine environments. After the data have been collected and transmitted by the logger, they are received by a MySQL-PHP stack running on a web server that can be accessed from anywhere in the world. Once there, the data can be visualized on web pages or served through REST requests and Water One Flow (WOF) services. Since one of the main benefits of using open source hardware is the easy collaboration between users, we are introducing a new web platform for discussion and sharing of ideas and plans for hardware and software designs used with DIY environmental sensors and data loggers.

  13. psiTurk: An open-source framework for conducting replicable behavioral experiments online.

    PubMed

    Gureckis, Todd M; Martin, Jay; McDonnell, John; Rich, Alexander S; Markant, Doug; Coenen, Anna; Halpern, David; Hamrick, Jessica B; Chan, Patricia

    2016-09-01

    Online data collection has begun to revolutionize the behavioral sciences. However, conducting carefully controlled behavioral experiments online introduces a number of new technical and scientific challenges. The project described in this paper, psiTurk, is an open-source platform that helps researchers develop experiment designs that can be conducted over the Internet. The tool primarily interfaces with Amazon's Mechanical Turk, a popular crowdsourcing labor market. This paper describes the basic architecture of the system and introduces new users to the overall goals. psiTurk aims to reduce the technical hurdles for researchers developing online experiments while improving the transparency and collaborative nature of the behavioral sciences.

  14. DeNovoGUI: An Open Source Graphical User Interface for de Novo Sequencing of Tandem Mass Spectra

    PubMed Central

    2013-01-01

    De novo sequencing is a popular technique in proteomics for identifying peptides from tandem mass spectra without having to rely on a protein sequence database. Despite the strong potential of de novo sequencing algorithms, their adoption threshold remains quite high. We here present a user-friendly and lightweight graphical user interface called DeNovoGUI for running parallelized versions of the freely available de novo sequencing software PepNovo+, greatly simplifying the use of de novo sequencing in proteomics. Our platform-independent software is freely available under the permissive Apache2 open source license. Source code, binaries, and additional documentation are available at http://denovogui.googlecode.com. PMID:24295440

  15. DeNovoGUI: an open source graphical user interface for de novo sequencing of tandem mass spectra.

    PubMed

    Muth, Thilo; Weilnböck, Lisa; Rapp, Erdmann; Huber, Christian G; Martens, Lennart; Vaudel, Marc; Barsnes, Harald

    2014-02-07

    De novo sequencing is a popular technique in proteomics for identifying peptides from tandem mass spectra without having to rely on a protein sequence database. Despite the strong potential of de novo sequencing algorithms, their adoption threshold remains quite high. We here present a user-friendly and lightweight graphical user interface called DeNovoGUI for running parallelized versions of the freely available de novo sequencing software PepNovo+, greatly simplifying the use of de novo sequencing in proteomics. Our platform-independent software is freely available under the permissive Apache2 open source license. Source code, binaries, and additional documentation are available at http://denovogui.googlecode.com.

  16. The Biopsychology-Toolbox: a free, open-source Matlab-toolbox for the control of behavioral experiments.

    PubMed

    Rose, Jonas; Otto, Tobias; Dittrich, Lars

    2008-10-30

    The Biopsychology-Toolbox is a free, open-source Matlab toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control the basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows porting of individual parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.

  17. BioContainers: an open-source and community-driven framework for software standardization.

    PubMed

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.

  18. BioContainers: an open-source and community-driven framework for software standardization

    PubMed Central

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Motivation: BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation: The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk PMID:28379341

  19. Mashup Scheme Design of Map Tiles Using Lightweight Open Source Webgis Platform

    NASA Astrophysics Data System (ADS)

    Hu, T.; Fan, J.; He, H.; Qin, L.; Li, G.

    2018-04-01

    To address the difficulty of using existing commercial Geographic Information System platforms to integrate and fuse multi-source image data, this research proposes the loading of multi-source local tile data based on CesiumJS and examines the tile data organization mechanisms and spatial reference differences of the CesiumJS platform, as well as various tile data sources, such as Google Maps, Map World, and Bing Maps. Two types of tile data loading schemes have been designed for the mashup of tiles: the single data source loading scheme and the multi-data source loading scheme. The multiple digital map tile sources used in this paper cover two different but mainstream spatial references, the WGS84 coordinate system and the Web Mercator coordinate system. According to the experimental results, the single data source loading scheme and the multi-data source loading scheme with the same spatial coordinate system showed favorable visualization effects; however, the multi-data source loading scheme was prone to tile image deformation when loading multi-source tile data with different spatial references. The resulting method provides a low-cost and highly flexible solution for small and medium-scale GIS programs and has a certain potential for practical application. The problem of deformation during the transition between different spatial references is an important topic for further research.

  20. An Open Source approach to automated hydrological analysis of ungauged drainage basins in Serbia using R and SAGA

    NASA Astrophysics Data System (ADS)

    Zlatanovic, Nikola; Milovanovic, Irina; Cotric, Jelena

    2014-05-01

    Drainage basins are for the most part ungauged or poorly gauged, not only in Serbia but in most parts of the world, usually due to insufficient funds but also due to the decommissioning of river gauges in upland catchments to focus on downstream areas, which are more populated. Very often, design discharges are needed for these streams or rivers where no streamflow data are available, for various applications. Examples include river training works for flood protection measures or erosion control, design of culverts, water supply facilities, small hydropower plants etc. The estimation of discharges in ungauged basins is most often performed using rainfall-runoff models, whose parameters rely heavily on geomorphometric attributes of the basin (e.g. catchment area, elevation, slopes of channels and hillslopes etc.). The calculation of these, as well as other parameters, is most often done in GIS (Geographic Information System) software environments. This study deals with the application of freely available and open source software and datasets for automating rainfall-runoff analysis of ungauged basins using methodologies currently in use in hydrological practice. The R programming language was used for scripting and automating the hydrological calculations, coupled with SAGA GIS (System for Automated Geoscientific Analysis) for geocomputing functions and terrain analysis. Datasets used in the analyses include the freely available SRTM (Shuttle Radar Topography Mission) terrain data, CORINE (Coordination of Information on the Environment) Land Cover data, as well as soil maps and rainfall data. The choice of free and open source software and datasets makes the project ideal for academic and research purposes and cross-platform projects. The geomorphometric module was tested on more than 100 catchments throughout Serbia and compared to manually calculated values (using topographic maps). The discharge estimation module was tested on 21 catchments where data were available and compared to results obtained by frequency analysis of annual maximum discharge. The geomorphometric module of the calculation system showed excellent results, saving a great deal of time that would otherwise have been spent on manual processing of geospatial data. The type of automated analysis presented in this study will enable much quicker hydrologic analysis on multiple watersheds, providing the platform for further research into the spatial variability of runoff.

  1. Neurite guidance and three-dimensional confinement via compliant semiconductor scaffolds.

    PubMed

    Cavallo, Francesca; Huang, Yu; Dent, Erik W; Williams, Justin C; Lagally, Max G

    2014-12-23

    Neurons are often cultured in vitro on a flat, open, and rigid substrate, a platform that does not reflect well the native microenvironment of the brain. To address this concern, we have developed a culturing platform containing arrays of microchannels, formed in a crystalline-silicon nanomembrane (NM) resting on polydimethylsiloxane; this platform will additionally enable active sensing and stimulation at the local scale, via devices fabricated in the silicon. The mechanical properties of the composite Si/compliant substrate nanomaterial approximate those of neural tissue. The microchannels, created in the NM by strain engineering, demonstrate strong guidance of neurite outgrowth. Using plasma techniques, we developed a means to coat just the inside surface of these channels with an adhesion promoter (poly-d-lysine). For NM channels with openings larger than the cross-sectional area of a single axon, strong physical confinement and guidance of axons through the channels are observed. Imaging of axons that grow in channels with openings that approximate the size of an axon suggests that a tight seal exists between the cell membrane and the inner surface of the channel, mimicking a myelin sheath. Such a tight seal of the cell membrane with the channel surface would make this platform an attractive candidate for future neuronal repair. Results of measurements of impedance and photoluminescence of bare NM channels are comparable to those on a flat NM, demonstrating electrical and optical modalities of our platform and suggesting that this scaffold can be expanded for active sensing and monitoring of neuron cellular processes in conditions in which they exist naturally.

  2. CompactPCI/Linux Platform in FTU Slow Control System

    NASA Astrophysics Data System (ADS)

    Iannone, F.; Wang, L.; Centioli, C.; Panella, M.; Mazza, G.; Vitale, V.

    2004-12-01

    In large fusion experiments, such as tokamak devices, there is a common trend for slow control systems. Because of the complexity of the plants, the so-called `Standard Model' (SM) of slow control has been adopted on several tokamak machines. This model is based on a three-level hierarchical control: 1) High-Level Control (HLC) with a supervisory function; 2) Medium-Level Control (MLC) to interface and concentrate I/O field equipment; 3) Low-Level Control (LLC) with hard real-time I/O functions, often managed by PLCs. The FTU control system, designed with SM concepts, has undergone several stages of development during its fifteen years of operation. The latest evolution was inevitable, due to the obsolescence of the MLC CPUs, based on VME-MOTOROLA 68030 with the OS9 operating system. A large amount of C code was developed for that platform to route the data flow from the LLC, which consists of 24 Westinghouse Numalogic PC-700 PLCs with about 8000 field-points, to the HLC, based on a commercial Object-Oriented Real-Time database on an Alpha/Compaq Tru64 platform. We therefore had to look for cost-effective solutions, and finally a CompactPCI Intel x86 platform with the Linux operating system was chosen. A software port has been done, taking into account the differences between the OS9 and Linux operating systems in terms of inter-process and network communications and the multi-port serial I/O driver. This paper describes the hardware/software architecture of the new MLC system, emphasizing the reliability and low cost of the open source solutions. Moreover, the huge number of software packages available in the open source environment will assure less painful maintenance and will open the way to further improvements of the system itself.

  3. Enhancing participatory approach in water resources management: development of a survey to evaluate stakeholders needs and priorities related to software capabilities

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Josef, S.; Boukalova, Z.; Triana, F.; Ghetta, M.; Sabbatini, T.; Bonari, E.; Cannata, M.; De Filippis, G.

    2016-12-01

    The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management) aims at simplifying the application of EU water-related Directives by developing an open source and public domain, GIS-integrated platform for planning and management of ground- and surface-water resources. The FREEWAT platform is conceived as a canvas where several distributed and physically-based simulation codes are virtually integrated. The choice of such codes was supported by the results of a survey performed by means of questionnaires distributed to 14 case study FREEWAT project partners and several stakeholders. This was performed in the first phase of the project within WP 6 (Enhanced science and participatory approach evidence-based decision making), Task 6.1 (Definition of a "needs/tools" evaluation grid). About 30% of all the invited entities and institutions from several EU and non-EU countries expressed their interest in contributing to the survey. Most of them were research institutions, government and geoenvironmental companies, and river basin authorities. The result of the questionnaire provided a spectrum of needs and priorities of partners/stakeholders, which were addressed during the development phase of the FREEWAT platform. The main needs identified were related to ground- and surface-water quality, sustainable water management, interaction between groundwater/surface-water bodies, and the design and management of Managed Aquifer Recharge schemes. Needs and priorities were then connected to the specific EU Directives and Regulations to be addressed. One of the main goals of the questionnaires was to collect information and suggestions regarding the use of existing commercial/open-source software tools to address needs and priorities, and regarding the needs to address specific water-related processes/problems.

  4. Open source 3D printers: an appropriate technology for building low cost optics labs for the developing communities

    NASA Astrophysics Data System (ADS)

    Gwamuri, J.; Pearce, Joshua M.

    2017-08-01

    The recent introduction of RepRap (self-replicating rapid prototyper) 3-D printers and the resultant open source technological improvements have resulted in affordable 3-D printing, enabling low-cost distributed manufacturing for individuals. This development and others, such as the rise of open source-appropriate technology (OSAT) and solar powered 3-D printing, are moving 3-D printing from an industry-based technology to one that could be used in the developing world for sustainable development. In this paper, we explore some specific technological improvements and how distributed manufacturing with open-source 3-D printing can be used to provide open-source 3-D printable optics components for developing world communities through the ability to print less expensive and customized products. This paper presents an open-source, low-cost optical equipment library which enables relatively easily adapted, customizable designs with the potential of changing the way optics is taught in resource-constrained communities. The study shows that this method of scientific hardware development has the potential to enable a much broader audience to participate in optical experimentation, both as research and teaching platforms. Conclusions on the technical viability of 3-D printing to assist in development, and recommendations on how developing communities can fully exploit this technology to improve the learning of optics through hands-on methods, are outlined.

  5. A Scalable, Open Source Platform for Data Processing, Archiving and Dissemination

    DTIC Science & Technology

    2016-01-01

    Object Oriented Data Technology (OODT) big data toolkit developed by NASA and the Work-flow INstance Generation and Selection (WINGS) scientific work...to several challenge big data problems and demonstrated the utility of OODT-WINGS in addressing them. Specific demonstrated analyses address i... Subject terms: ...source software, Apache, Object Oriented Data Technology, OODT, semantic work-flows, WINGS, big data, work-flow management.

  6. Building Energy Management Open Source Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This is the repository for Building Energy Management Open Source Software (BEMOSS), which is an open source operating system that is engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. BEMOSS offers the following key features: (1) Open source, open architecture – BEMOSS is an open source operating system that is built upon VOLTTRON – a distributed agent platform developed by Pacific Northwest National Laboratory (PNNL). BEMOSS was designed to make it easy for hardware manufacturers to seamlessly interface their devices with BEMOSS. Software developers can also contribute to adding additional BEMOSS functionalities and applications. (2) Plug & play – BEMOSS was designed to automatically discover supported load controllers (including smart thermostats, VAV/RTUs, lighting load controllers and plug load controllers) in commercial buildings. (3) Interoperability – BEMOSS was designed to work with load control devices from different manufacturers that operate on different communication technologies and data exchange protocols. (4) Cost effectiveness – Implementation of BEMOSS is deemed to be cost-effective as it was built upon a robust open source platform that can operate on a low-cost single-board computer, such as Odroid. This feature could contribute to its rapid deployment in small- or medium-sized commercial buildings. (5) Scalability and ease of deployment – With its multi-node architecture, BEMOSS provides a distributed architecture where load controllers in a multi-floor and high occupancy building could be monitored and controlled by multiple single-board computers hosting BEMOSS. This makes it possible for a building engineer to deploy BEMOSS in one zone of a building, be comfortable with its operation, and later on expand the deployment to the entire building to make it more energy efficient. (6) Ability to provide local and remote monitoring – BEMOSS provides both local and remote monitoring ability with role-based access control. (7) Security – In addition to built-in security features provided by VOLTTRON, BEMOSS provides enhanced security features, including the BEMOSS discovery approval process, encrypted core-to-node communication, a thermostat anti-tampering feature and many more. (8) Support from the Advisory Committee – BEMOSS was developed in consultation with an advisory committee from the beginning of the project. The BEMOSS advisory committee comprises representatives from 22 organizations from government and industry.

  7. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  8. Navigating the unfolding open data landscape in ecology and evolution.

    PubMed

    Culina, Antica; Baglioni, Miriam; Crowther, Tom W; Visser, Marcel E; Woutersen-Windhouwer, Saskia; Manghi, Paolo

    2018-03-01

    Open access to data is revolutionizing the sciences. To allow ecologists and evolutionary biologists to confidently find and use the existing data, we provide an overview of the landscape of online data infrastructures, and highlight the key points to consider when using open data. We introduce an online collaborative platform to keep a community-driven, updated list of the best sources that enable search for data in one interface. In doing so, our aim is to lower the barrier to accessing open data, and encourage its use by researchers hoping to increase the scope, reliability and value of their findings.

  9. Steady-state capabilities for hydroturbines with OpenFOAM

    NASA Astrophysics Data System (ADS)

    Page, M.; Beaudoin, M.; Giroux, A. M.

    2010-08-01

    The availability of a high-quality open source CFD simulation platform like OpenFOAM offers new R&D opportunities by providing direct access to models and solver implementation details. Efforts have been made by Hydro-Québec to adapt OpenFOAM to hydroturbines for the development of steady-state capabilities. The paper describes the developments that have been made to implement new turbomachinery-related capabilities: a Multiple Frame of Reference solver, domain coupling interfaces (GGI, cyclicGGI and mixing plane) and specialized boundary conditions. Practical use of the new turbomachinery capabilities is demonstrated for the analysis of a 195-MW Francis hydroturbine.

  10. Monitoring Error Rates In Illumina Sequencing.

    PubMed

    Manley, Leigh J; Ma, Duanduan; Levine, Stuart S

    2016-12-01

    Guaranteeing high-quality next-generation sequencing data in a rapidly changing environment is an ongoing challenge. The introduction of the Illumina NextSeq 500 and the deprecation of specific metrics from Illumina's Sequencing Analysis Viewer (SAV; Illumina, San Diego, CA, USA) have made it more difficult to determine directly the baseline error rate of sequencing runs. To improve our ability to measure base quality, we have created an open-source tool to construct the Percent Perfect Reads (PPR) plot, previously provided by the Illumina sequencers. The PPR program is compatible with HiSeq 2000/2500, MiSeq, and NextSeq 500 instruments and provides an alternative to Illumina's quality value (Q) scores for determining run quality. Whereas Q scores are representative of run quality, they are often overestimated and are sourced from different look-up tables for each platform. The PPR's unique capabilities as a cross-instrument comparison device, as a troubleshooting tool, and as a tool for monitoring instrument performance can provide an increase in clarity over SAV metrics that is often crucial for maintaining instrument health. These capabilities are highlighted.
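
    A stripped-down version of the underlying metric, assuming aligned reads carry the standard NM (edit distance) tag, is the fraction of mapped reads with zero mismatches; the published PPR tool goes further and plots this per sequencing cycle, and it does not necessarily use pysam, so the sketch below is only illustrative.

```python
# Simplified "percent perfect reads" estimate from an aligned BAM file.
# A sketch only: the PPR tool computes this per sequencing cycle; here we just
# take the fraction of mapped reads whose NM tag (edit distance) is zero.
import pysam

total = perfect = 0
with pysam.AlignmentFile("run.bam", "rb") as bam:     # hypothetical alignment file
    for read in bam:
        if read.is_unmapped or read.is_secondary or read.is_supplementary:
            continue
        total += 1
        if read.has_tag("NM") and read.get_tag("NM") == 0:
            perfect += 1

if total:
    print(f"percent perfect reads: {100.0 * perfect / total:.2f}% of {total} reads")
```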

  11. Climate Signals: An On-Line Digital Platform for Mapping Climate Change Impacts in Real Time

    NASA Astrophysics Data System (ADS)

    Cutting, H.

    2016-12-01

    Climate Signals is an on-line digital platform for cataloging and mapping the impacts of climate change. The CS platform specifies and details the chains of connections between greenhouse gas emissions and individual climate events. Currently in open-beta release, the platform is designed to engage and serve the general public, news media, and policy-makers, particularly in real time during extreme climate events. Climate Signals consists of a curated relational database of events and their links to climate change, a mapping engine, and a gallery of climate change monitors offering real-time data. For each event in the database, an infographic engine provides a custom attribution "tree" that illustrates the connections to climate change. In addition, links to key contextual resources are aggregated and curated for each event. All event records are fully annotated with detailed source citations and corresponding hyperlinks. The system of attribution used to link events to climate change in real time is detailed here. This open-beta release is offered for public user testing and engagement. Launched in May 2016, the operation of this platform offers lessons for public engagement on climate change impacts.

  12. Performance evaluation of OpenFOAM on many-core architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brzobohatý, Tomáš; Říha, Lubomír; Karásek, Tomáš, E-mail: tomas.karasek@vsb.cz

    In this article, the application of Open Source Field Operation and Manipulation (OpenFOAM) C++ libraries to solving engineering problems on many-core architectures is presented. The objective of this article is to present the scalability of OpenFOAM on parallel platforms solving real engineering problems of fluid dynamics. Scalability tests of OpenFOAM are performed using various hardware and different implementations of the standard PCG and PBiCG Krylov iterative methods. Speed-ups of various implementations of the linear solvers using GPU and MIC accelerators are presented in this paper. Numerical experiments on 3D lid-driven cavity flow for several cases with various numbers of cells are presented.

  13. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and send data to a JavaScript-based web client.
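
    A minimal sketch of the data flow this describes, using a hypothetical endpoint and JSON field names (the platform's actual REST API and schema are not reproduced here), would build a JSON document for a calculation and POST it to the server.

```python
# Sketch of pushing a computational-chemistry record as JSON to a RESTful API.
# The endpoint URL and JSON field names below are hypothetical placeholders.
import requests

calculation = {
    "molecule": {
        "name": "water",
        "atoms": [
            {"element": "O", "coords": [0.000, 0.000, 0.117]},
            {"element": "H", "coords": [0.000, 0.757, -0.469]},
            {"element": "H", "coords": [0.000, -0.757, -0.469]},
        ],
    },
    "code": "NWChem",
    "method": {"theory": "dft", "functional": "b3lyp", "basis": "6-31g*"},
    "properties": {"total_energy_hartree": -76.4089},   # illustrative value
}

resp = requests.post("https://example.org/api/v1/calculations",  # hypothetical
                     json=calculation, timeout=30)
resp.raise_for_status()
print("stored record id:", resp.json().get("id"))
```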

  14. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and send data to a JavaScript-based web client.

  15. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  16. Development of an open-source cloud-connected sensor-monitoring platform

    USDA-ARS?s Scientific Manuscript database

    Rapid advances in electronics and communications technologies offer continuously evolving options for sensing and awareness of the physical environment. Many of these advances are becoming increasingly available to “non-professionals,” that is, those without formal training or expertise in discipli...

  17. Design of a Community-Engaged Health Informatics Platform with an Architecture of Participation.

    PubMed

    Millery, Mari; Ramos, Wilson; Lien, Chueh; Aguirre, Alejandra N; Kukafka, Rita

    2015-01-01

    Community-engaged health informatics (CEHI) applies information technology and participatory approaches to improve the health of communities. Our objective was to translate the concept of CEHI into a usable and replicable informatics platform that will facilitate community-engaged practice and research. The setting is a diverse urban neighborhood in New York City. The methods included community asset mapping, stakeholder interviews, logic modeling, analysis of affordances in open-source tools, elicitation of use cases and requirements, and a survey of early adopters. Based on synthesis of data collected, GetHealthyHeigths.org (GHH) was developed using open-source LAMP stack and Drupal content management software. Drupal's organic groups module was used for novel participatory functionality, along with detailed user roles and permissions. Future work includes evaluation of GHH and its impact on agency and service networks. We plan to expand GHH with additional functionality to further support CEHI by combining informatics solutions with community engagement to improve health.

  18. Design of a Community-Engaged Health Informatics Platform with an Architecture of Participation

    PubMed Central

    Millery, Mari; Ramos, Wilson; Lien, Chueh; Aguirre, Alejandra N.; Kukafka, Rita

    2015-01-01

    Community-engaged health informatics (CEHI) applies information technology and participatory approaches to improve the health of communities. Our objective was to translate the concept of CEHI into a usable and replicable informatics platform that will facilitate community-engaged practice and research. The setting is a diverse urban neighborhood in New York City. The methods included community asset mapping, stakeholder interviews, logic modeling, analysis of affordances in open-source tools, elicitation of use cases and requirements, and a survey of early adopters. Based on synthesis of data collected, GetHealthyHeigths.org (GHH) was developed using open-source LAMP stack and Drupal content management software. Drupal’s organic groups module was used for novel participatory functionality, along with detailed user roles and permissions. Future work includes evaluation of GHH and its impact on agency and service networks. We plan to expand GHH with additional functionality to further support CEHI by combining informatics solutions with community engagement to improve health. PMID:26958227

  19. The Digital Slide Archive: A Software Platform for Management, Integration, and Analysis of Histology for Cancer Research.

    PubMed

    Gutman, David A; Khalilia, Mohammed; Lee, Sanghoon; Nalisnik, Michael; Mullen, Zach; Beezley, Jonathan; Chittajallu, Deepak R; Manthey, David; Cooper, Lee A D

    2017-11-01

    Tissue-based cancer studies can generate large amounts of histology data in the form of glass slides. These slides contain important diagnostic, prognostic, and biological information and can be digitized into expansive and high-resolution whole-slide images using slide-scanning devices. Effectively utilizing digital pathology data in cancer research requires the ability to manage, visualize, share, and perform quantitative analysis on these large amounts of image data, tasks that are often complex and difficult for investigators with the current state of commercial digital pathology software. In this article, we describe the Digital Slide Archive (DSA), an open-source web-based platform for digital pathology. DSA allows investigators to manage large collections of histologic images and integrate them with clinical and genomic metadata. The open-source model enables DSA to be extended to provide additional capabilities. Cancer Res; 77(21); e75-78. ©2017 American Association for Cancer Research.

  20. PIVOT: platform for interactive analysis and visualization of transcriptomics data.

    PubMed

    Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong

    2018-01-05

    Many R packages have been developed for transcriptome analysis, but their use often requires familiarity with R, and integrating the results of different packages requires scripts to wrangle the data types. Furthermore, exploratory data analyses often generate multiple derived datasets, such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open source transcriptome analysis packages with a uniform user interface and graphical data management that allows non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulation. A graph-based visual interface is used to represent the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that all analyses can be saved, shared and reproduced. PIVOT will allow researchers from broad backgrounds to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.

  1. Supporting Building Portfolio Investment and Policy Decision Making through an Integrated Building Utility Data Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena

    The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we will describe a building portfolio-level data analytics tool for operational optimization, business investment and policy assessment using 15-minute to monthly interval utility data. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers and other developers, to support initiatives in reducing building energy consumption.
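
    The kind of aggregation such a platform performs, rolling 15-minute interval meter readings up to monthly figures for benchmarking, can be pictured with a small pandas sketch; the data are synthetic and the column names are not those of SEED or the time-series database.

```python
# Roll 15-minute interval meter data up to monthly consumption and peak demand.
# Synthetic data for illustration; not the SEED platform's API.
import numpy as np
import pandas as pd

idx = pd.date_range("2023-01-01", "2023-03-31 23:45", freq="15min")
rng = np.random.default_rng(1)
kwh = pd.Series(rng.uniform(2.0, 12.0, idx.size), index=idx, name="kWh")

monthly = pd.DataFrame({
    "total_kwh": kwh.resample("MS").sum(),        # monthly energy use
    "peak_kw": (kwh * 4).resample("MS").max(),    # 15-min kWh * 4 = average kW in interval
})
print(monthly.round(1))
```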

  2. Operational use of open satellite data for marine water quality monitoring

    NASA Astrophysics Data System (ADS)

    Symeonidis, Panagiotis; Vakkas, Theodoros

    2017-09-01

    The purpose of this study was to develop an operational platform for marine water quality monitoring using near-real-time satellite data. The developed platform utilizes free and open satellite data available from different data sources, like COPERNICUS, the European Earth Observation Initiative, or NASA, from different satellites and instruments. The quality of the marine environment is operationally evaluated using parameters like chlorophyll-a concentration, water color and Sea Surface Temperature (SST). For each parameter, more than one dataset is available, from different data sources or satellites, to allow users to select the most appropriate dataset for their area or time of interest. The above datasets are automatically downloaded from the data providers' services and ingested into the central spatial engine. The spatial data platform uses the PostgreSQL database with the PostGIS extension for spatial data storage and GeoServer for the provision of the spatial data services. The system provides daily, 10-day and monthly maps and time series of the above parameters. The information is provided using a web client which is based on the GET SDI PORTAL, an easy-to-use and feature-rich geospatial visualization and analysis platform. Users can examine the temporal variation of the parameters using a simple time animation tool. In addition, with just one click on the map, the system provides an interactive time series chart for any of the parameters of the available datasets. The platform can be offered as Software as a Service (SaaS) for any area in the Mediterranean region.
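
    Once a satellite product has been downloaded, producing the daily, 10-day and monthly aggregates described above amounts to grouping gridded fields in time. The xarray sketch below uses a hypothetical file, variable name and bounding box, not the platform's ingestion code.

```python
# Aggregate a downloaded chlorophyll-a product to monthly means for one area.
# File name, variable name and bounding box are hypothetical placeholders.
import xarray as xr

ds = xr.open_dataset("chl_daily_2022.nc")                 # daily gridded product
chl = ds["chlor_a"].sel(lat=slice(34.0, 41.0), lon=slice(19.0, 29.0))

monthly_mean = chl.resample(time="1MS").mean()            # one map per month
timeseries = monthly_mean.mean(dim=["lat", "lon"])        # basin-average series

print(timeseries.to_series().round(3))
```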

  3. Power Watch - A global, open database of power plants that supports research on climate, water and air pollution impact of the global power sector.

    NASA Astrophysics Data System (ADS)

    Friedrich, J.; Kressig, A.; Van Groenou, S.; McCormick, C.

    2017-12-01

    Challenge: The lack of transparent, accessible, and centralized power sector data inhibits research on the impact of the global power sector. Information gaps for citizens, analysts, and decision makers worldwide create barriers to sustainable development efforts. The need for transparent, accessible, and centralized information is especially important to enhance the commitments outlined in the recently adopted Paris Agreement and Sustainable Development Goals. Offer: Power Watch will address this challenge by creating a comprehensive, open-source platform on the world's power systems. The platform hosts data on 85% of global installed electrical capacity and, for each power plant, will include data points on installed capacity, fuel type, annual generation, and commissioning year, with characteristics such as emissions, particulate matter, and annual water demand added over time. Most of the data is reported from national-level sources, but annual generation and other operational characteristics are estimated via machine learning models and remotely sensed data when not officially reported. In addition, Power Watch plans to provide a suite of tools that address specific decision-maker needs, such as water risk assessments and air pollution modeling. Impact: Through open data, the platform and its tools will allow researchers to do more analysis of power sector impacts and perform energy modeling. It will help catalyze accountability for policy makers, businesses, and investors and will inform and drive the transition to a clean energy future while reaching development targets.

  4. Real time wind farm emulation using SimWindFarm toolbox

    NASA Astrophysics Data System (ADS)

    Topor, Marcel

    2016-06-01

    This paper presents a wind farm emulation solution using an open source Matlab/Simulink toolbox and the National Instruments cRIO platform. The work is based on the Aeolus SimWindFarm (SWF) toolbox models developed at Aalborg University, Denmark. Using the Matlab/Simulink models developed in SWF, the modeling code is exported to a real-time model using the NI VeriStand model framework, and the resulting code is integrated as hardware-in-the-loop control on the NI 9068 platform.

  5. The digital code driven autonomous synthesis of ibuprofen automated in a 3D-printer-based robot

    PubMed Central

    Kitson, Philip J; Glatzel, Stefan

    2016-01-01

    An automated synthesis robot was constructed by modifying an open source 3D printing platform. The resulting automated system was used to 3D print reaction vessels (reactionware) of differing internal volumes using polypropylene feedstock via a fused deposition modeling 3D printing approach and subsequently make use of these fabricated vessels to synthesize the nonsteroidal anti-inflammatory drug ibuprofen via a consecutive one-pot three-step approach. The synthesis of ibuprofen could be achieved on different scales simply by adjusting the parameters in the robot control software. The software for controlling the synthesis robot was written in the Python programming language and hard-coded for the synthesis of ibuprofen by the method described, opening possibilities for the sharing of validated synthetic ‘programs’ which can run on similar low cost, user-constructed robotic platforms towards an ‘open-source’ regime in the area of chemical synthesis. PMID:28144350

  6. Application of Emerging Open-source Embedded Systems for Enabling Low-cost Wireless Mini-observatory Nodes in the Coastal Zone

    NASA Astrophysics Data System (ADS)

    Glazer, B. T.

    2016-02-01

    Here, we describe the development of novel, low-cost, open-source instrumentation to enable wireless data transfer from biogeochemical sensors in the coastal zone. The platform is centered upon the BeagleBone Black single board computer. Process-inquiry in environmental sciences suffers from undersampling; enabling sustained and unattended data collection typically involves expensive instrumentation and infrastructure deployed as cabled observatories with little flexibility in deployment location following initial installation. The high cost of commercially available or custom electronic packages has not only limited the number of sensor node sites that can be targeted by reasonably well-funded academic researchers, but has also entirely prohibited widespread engagement with K-12, public non-profit, and `citizen scientist' STEM audiences. The new platform under development represents a balanced blend of research-grade sensors and low-cost open-source electronics that are easily assembled. Custom, robust, open-source code that remains customizable for specific node configurations can match a specific deployment's measurement needs, depending on the scientific research priorities. We have demonstrated prototype capabilities and versatility through lab testing and field deployments of multiple sensor nodes with multiple sensor inputs, all of which stream near-real-time data over wireless RF links to a shore-based base station. On shore, first-pass data processing and QA/QC take place and near-real-time plots are made available on the World Wide Web. Specifically, we have worked closely with an environmental and cultural management and restoration non-profit organization, and middle and high school science classes, engaging their interest in STEM application to local watershed processes. Ultimately, continued successful development of this pilot project can lead to a coastal oceanographic analogue of the popular Weather Underground personal weather station model.
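
    The node-to-shore streaming loop described above can be sketched in a few lines of Python. This is a hedged, generic illustration only: the base-station endpoint, node name, and read_oxygen() helper are hypothetical, and a real deployment would use the node's actual sensor drivers and the RF link's transport:

    ```python
    # Generic sketch of a sensor-node loop: read a value, timestamp it, and push
    # JSON toward a shore-side base station. Endpoint and helper are hypothetical.
    import time
    import random
    import requests

    BASE_STATION = "http://shore-station.local:8080/ingest"  # assumed endpoint

    def read_oxygen():
        # Placeholder for a real driver (e.g. an ADC read on the BeagleBone Black)
        return 6.5 + random.uniform(-0.2, 0.2)

    while True:
        sample = {"node": "reef-03", "ts": time.time(), "do_mg_l": read_oxygen()}
        try:
            requests.post(BASE_STATION, json=sample, timeout=5)
        except requests.RequestException:
            pass  # in practice, buffer locally and retry later
        time.sleep(60)  # one sample per minute
    ```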

  7. LOVD: easy creation of a locus-specific sequence variation database using an "LSDB-in-a-box" approach.

    PubMed

    Fokkema, Ivo F A C; den Dunnen, Johan T; Taschner, Peter E M

    2005-08-01

    The completion of the human genome project has initiated, as well as provided the basis for, the collection and study of all sequence variation between individuals. Direct access to up-to-date information on sequence variation is currently provided most efficiently through web-based, gene-centered, locus-specific databases (LSDBs). We have developed the Leiden Open (source) Variation Database (LOVD) software approaching the "LSDB-in-a-Box" idea for the easy creation and maintenance of a fully web-based gene sequence variation database. LOVD is platform-independent and uses PHP and MySQL open source software only. The basic gene-centered and modular design of the database follows the recommendations of the Human Genome Variation Society (HGVS) and focuses on the collection and display of DNA sequence variations. With minimal effort, the LOVD platform is extendable with clinical data. The open set-up should both facilitate and promote functional extension with scripts written by the community. The LOVD software is freely available from the Leiden Muscular Dystrophy pages (www.DMD.nl/LOVD/). To promote the use of LOVD, we currently offer curators the possibility to set up an LSDB on our Leiden server. (c) 2005 Wiley-Liss, Inc.

  8. The Importance of Artificial Intelligence for Naval Intelligence Training Simulations

    DTIC Science & Technology

    2006-09-01

    experimental investigation described later. B. SYSTEM ARCHITECTURE The game-based simulator was created using NetBeans, which is an open source integrated...development environment (IDE) written entirely in Java using the NetBeans Platform. NetBeans is based upon the Java language which contains the...involved within the simulation are conducted in a GUI built within the NetBeans IDE. The opening display allows the user to set up the simulation

  9. Neural ensemble communities: Open-source approaches to hardware for large-scale electrophysiology

    PubMed Central

    Siegle, Joshua H.; Hale, Gregory J.; Newman, Jonathan P.; Voigts, Jakob

    2014-01-01

    One often-overlooked factor when selecting a platform for large-scale electrophysiology is whether or not a particular data acquisition system is “open” or “closed”: that is, whether or not the system’s schematics and source code are available to end users. Open systems have a reputation for being difficult to acquire, poorly documented, and hard to maintain. With the arrival of more powerful and compact integrated circuits, rapid prototyping services, and web-based tools for collaborative development, these stereotypes must be reconsidered. We discuss some of the reasons why multichannel extracellular electrophysiology could benefit from open-source approaches and describe examples of successful community-driven tool development within this field. In order to promote the adoption of open-source hardware and to reduce the need for redundant development efforts, we advocate a move toward standardized interfaces that connect each element of the data processing pipeline. This will give researchers the flexibility to modify their tools when necessary, while allowing them to continue to benefit from the high-quality products and expertise provided by commercial vendors. PMID:25528614

  10. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) support of water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  11. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm GOCAD (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.

  12. Leveraging Open Standards and Technologies to Enhance Community Access to Earth Science Lidar Data

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.

    2011-12-01

    Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar data present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. As a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), open source libLAS and LASlib libraries provide OpenTopography data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL). OpenTopography has also developed its own algorithm for high-performance gridding of lidar point cloud data, Points2Grid, and has released the code as an open source project. An emerging conversation that the lidar community and OpenTopography are actively engaged in is the need for open, community supported standards and metadata for both full waveform and terrestrial (waveform and discrete return) lidar data. Further, given the immature nature of many lidar data archives and limited online access to public domain data, there is an opportunity to develop interoperable data catalogs based on an open standard such as the OGC CSW specification to facilitate discovery and access to Earth science oriented lidar data.
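
    To make the gridding step concrete, the sketch below bins LAS points onto a regular grid and takes the mean elevation per cell, which is the general idea behind tools like Points2Grid. It is not the OpenTopography implementation; the file path and cell size are illustrative, and it assumes laspy 2.x and numpy:

    ```python
    # Sketch of lidar point-cloud gridding (mean elevation per cell).
    # Not OpenTopography's Points2Grid code; path and resolution are examples.
    import numpy as np
    import laspy

    las = laspy.read("tile.las")
    x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

    cell = 1.0  # 1 m grid resolution
    xi = ((x - x.min()) / cell).astype(int)
    yi = ((y - y.min()) / cell).astype(int)
    nx, ny = xi.max() + 1, yi.max() + 1

    sums = np.zeros((ny, nx))
    counts = np.zeros((ny, nx))
    np.add.at(sums, (yi, xi), z)      # accumulate elevations per cell
    np.add.at(counts, (yi, xi), 1)    # count points per cell

    dem = np.divide(sums, counts, out=np.full_like(sums, np.nan), where=counts > 0)
    print(dem.shape)
    ```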

  13. Swiss Experiment: Design, implemention and use of a cross-disciplinary infrastructure for data intensive science

    NASA Astrophysics Data System (ADS)

    Dawes, N.; Salehi, A.; Clifton, A.; Bavay, M.; Aberer, K.; Parlange, M. B.; Lehning, M.

    2010-12-01

    It has long been known that environmental processes are cross-disciplinary, but data has continued to be acquired and held for a single purpose. Swiss Experiment is a rapidly evolving cross-disciplinary, distributed sensor data infrastructure, where tools for the environmental science community stem directly from computer science research. The platform uses the bleeding edge of computer science to acquire, store and distribute data and metadata from all environmental science disciplines at a variety of temporal and spatial resolutions. SwissEx is simultaneously developing new technologies to allow low cost, high spatial and temporal resolution measurements such that small areas can be intensely monitored. This data is then combined with existing widespread, low density measurements in the cross-disciplinary platform to provide well documented datasets, which are of use to multiple research disciplines. We present a flexible, generic infrastructure at an advanced stage of development. The infrastructure makes the most of Web 2.0 technologies for a collaborative working environment and as a user interface for a metadata database. This environment is already closely integrated with GSN, an open-source database middleware developed under Swiss Experiment for acquisition and storage of generic time-series data (2D and 3D). GSN can be queried directly by common data processing packages and makes data available in real-time to models and 3rd party software interfaces via its web service interface. It also provides real-time push or pull data exchange between instances, a user management system which leaves data owners in charge of their data, advanced real-time processing and much more. The SwissEx interface is increasingly gaining users and supporting environmental science in Switzerland. It is also an integral part of environmental education projects ClimAtscope and O3E, where the technologies can provide rapid feedback of results for children of all ages and where the data from their own stations can be compared to national data networks.

  14. Long distance education for Croatian nurses with open source software.

    PubMed

    Radenovic, Aleksandar; Kalauz, Sonja

    2006-01-01

    The Croatian Nursing Informatics Association (CNIA) was established as a result of continuing work on promoting nursing informatics in Croatia. The main goals of CNIA are promoting nursing informatics and educating nurses about nursing informatics and the use of information technology in the nursing process. At the start of its work, CNIA developed three courses in nursing informatics, all designed with support for long distance education using open source software. The courses are: A - 'From Data to Wisdom', B - 'Introduction to Nursing Informatics' and C - 'Nursing Informatics I'. Courses A and B are prerequisites for course C. The technology used to implement these online courses is based on Claroline, an open source Learning Management System (LMS) and free online collaborative learning platform. Courses are divided into two modules/days: on the first module/day participants follow a classical classroom approach, and on the second day they continue with e-learning from home. These courses represent the first nursing informatics courses and the first long distance education for nurses in Croatia.

  15. CymeR: cytometry analysis using KNIME, docker and R

    PubMed Central

    Muchmore, B.; Alarcón-Riquelme, M.E.

    2017-01-01

    Abstract Summary: Here we present open-source software for the analysis of high-dimensional cytometry data using state of the art algorithms. Importantly, use of the software requires no programming ability, and output files can either be interrogated directly in CymeR or they can be used downstream with any other cytometric data analysis platform. Also, because we use Docker to integrate the multitude of components that form the basis of CymeR, we have additionally developed a proof-of-concept of how future open-source bioinformatic programs with graphical user interfaces could be developed. Availability and Implementation: CymeR is open-source software that ties several components into a single program that is perhaps best thought of as a self-contained data analysis operating system. Please see https://github.com/bmuchmore/CymeR/wiki for detailed installation instructions. Contact: brian.muchmore@genyo.es or marta.alarcon@genyo.es PMID:27998935

  17. Implementation and Testing of Low Cost UAV Platform for Orthophoto Imaging

    NASA Astrophysics Data System (ADS)

    Brucas, D.; Suziedelyte-Visockiene, J.; Ragauskas, U.; Berteska, E.; Rudinskas, D.

    2013-08-01

    Implementation of Unmanned Aerial Vehicles (UAVs) for civilian applications is rapidly increasing. Technologies which were expensive and available only for military use have recently spread to the civilian market. A vast number of low cost open source components and systems for implementation on UAVs are available. Using low cost hobby and open source components ensures a considerable decrease in UAV price, though in some cases compromising reliability. At the Space Science and Technology Institute (SSTI), in collaboration with Vilnius Gediminas Technical University (VGTU), research has been performed on constructing and implementing small UAVs composed of low cost open source components (and own developments). The most obvious and simple implementation of such UAVs is orthophoto imaging with data download and processing after the flight. The construction and implementation of the UAVs, flight experience, data processing and data use are further covered in the paper and presentation.

  18. THE BERKELEY DATA ANALYSIS SYSTEM (BDAS): AN OPEN SOURCE PLATFORM FOR BIG DATA ANALYTICS

    DTIC Science & Technology

    2017-09-01

    Evan Sparks, Oliver Zahn, Michael J. Franklin, David A. Patterson, Saul Perlmutter. Scientific Computing Meets Big Data Technology: An Astronomy ...Processing Astronomy Imagery Using Big Data Technology. IEEE Transactions on Big Data, 2016. Approved for Public Release; Distribution Unlimited.

  19. Environmental and Landscape Remote Sensing Using Free and Open Source Image Processing Tools

    EPA Science Inventory

    As global climate change and human activities impact the environment, there is a growing need for scientific tools to monitor and measure environmental conditions that support human and ecological health. Remotely sensed imagery from satellite and airborne platforms provides a g...

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Kim, Jungwon; Vetter, Jeffrey S

    This paper presents a directive-based, high-level programming framework for high-performance reconfigurable computing. It takes a standard, portable OpenACC C program as input and generates a hardware configuration file for execution on FPGAs. We implemented this prototype system using our open-source OpenARC compiler; it performs source-to-source translation and optimization of the input OpenACC program into OpenCL code, which is further compiled into an FPGA program by the backend Altera Offline OpenCL compiler. Internally, the design of OpenARC uses a high-level intermediate representation that separates concerns of program representation from underlying architectures, which facilitates portability of OpenARC. In fact, this design allowed us to create the OpenACC-to-FPGA translation framework with minimal extensions to our existing system. In addition, we show that our proposed FPGA-specific compiler optimizations and novel OpenACC pragma extensions assist the compiler in generating more efficient FPGA hardware configuration files. Our empirical evaluation on an Altera Stratix V FPGA with eight OpenACC benchmarks demonstrates the benefits of our strategy. To demonstrate the portability of OpenARC, we show results for the same benchmarks executing on other heterogeneous platforms, including NVIDIA GPUs, AMD GPUs, and Intel Xeon Phis. This initial evidence helps support the goal of using a directive-based, high-level programming strategy for performance portability across heterogeneous HPC architectures.

  1. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    PubMed Central

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J; Schuster, Heiko; Ternette, Nicola; Alpízar, Adán; Schittenhelm, Ralf B; Ramarathinam, Sri H; Lindestam Arlehamn, Cecilia S; Chiek Koh, Ching; Gillet, Ludovic C; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David S; Deutsch, Eric W; Moritz, Robert L; Purcell, Anthony W; Rammensee, Hans-Georg; Stevanovic, Stefan; Aebersold, Ruedi

    2015-01-01

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies. DOI: http://dx.doi.org/10.7554/eLife.07661.001 PMID:26154972

  2. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets are characterized by large volume, a high degree of spatiotemporal complexity, and fast evolution over time. As visualizing large volume distributed climate datasets is computationally intensive, traditional desktop based visualization applications fail to handle the computational intensity. Recently, scientists have developed remote visualization techniques to address the computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built based on ParaView, which is one of the most popular open source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support the deployment of the platform. In this platform, all climate datasets are regular grid data which are stored in NetCDF format. Three types of data access methods are supported in the platform: accessing remote datasets provided by OpenDAP servers, accessing datasets hosted on the web visualization server and accessing local datasets. Despite different data access methods, all visualization tasks are completed at the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computation limitation of desktop based visualization applications.
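
    The OpenDAP access path mentioned above can be exercised directly from Python: the netCDF4 library accepts a remote URL in place of a file name (when built with OPeNDAP support), so only the requested slice is transferred. The URL and variable name below are placeholders, not datasets from the platform:

    ```python
    # Hedged sketch: open a remote NetCDF dataset over OpenDAP and pull one
    # time step. URL and variable name are placeholders for illustration.
    from netCDF4 import Dataset

    url = "http://example.org/opendap/climate/tas_monthly.nc"  # hypothetical
    ds = Dataset(url)  # same call works for local files and OpenDAP endpoints

    tas = ds.variables["tas"]          # e.g. surface air temperature
    print(tas.dimensions, tas.shape)
    first_slice = tas[0, :, :]         # only this slice crosses the network
    print(float(first_slice.mean()))
    ds.close()
    ```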

  3. The importance of using open source technologies and common standards for interoperability within eHealth: Perspectives from the Millennium Villages Project

    PubMed Central

    Borland, Rob; Barasa, Mourice; Iiams-Hauser, Casey; Velez, Olivia; Kaonga, Nadi Nina; Berg, Matt

    2013-01-01

    The purpose of this paper is to illustrate the importance of using open source technologies and common standards for interoperability when implementing eHealth systems and illustrate this through case studies, where possible. The sources used to inform this paper draw from the implementation and evaluation of the eHealth Program in the context of the Millennium Villages Project (MVP). As the eHealth Team was tasked to deploy an eHealth architecture, the Millennium Villages Global-Network (MVG-Net), across all fourteen of the MVP sites in Sub-Saharan Africa, the team recognized the need for standards and uniformity but also realized that context would be an important factor. Therefore, the team decided to utilize open source solutions. The MVP implementation of MVG-Net provides a model for those looking to implement informatics solutions across disciplines and countries. Furthermore, there are valuable lessons learned that the eHealth community can benefit from. By sharing lessons learned and developing an accessible, open-source eHealth platform, we believe that we can more efficiently and rapidly achieve the health-related and collaborative Millennium Development Goals (MDGs). PMID:22894051

  4. Avionics and Power Management for Low-Cost High-Altitude Balloon Science Platforms

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey; Roberts, Anthony; McNatt, Jeremiah

    2016-01-01

    High-altitude balloons (HABs) have become popular as educational and scientific platforms for planetary research. This document outlines key components for missions where low cost and rapid development are desired. As an alternative to ground-based vacuum and thermal testing, these systems can be flight tested at comparable costs. Communication, solar, space, and atmospheric sensing experiments often require environments where ground level testing can be challenging or impossible in certain cases. When performing HAB research the ability to monitor the status of the platform and gather data is key for both scientific and recoverability aspects of the mission. A few turnkey platform solutions are outlined that leverage rapidly evolving open-source engineering ecosystems. Rather than building custom components from scratch, these recommendations attempt to maximize simplicity and cost of HAB platforms to make launches more accessible to everyone.

  5. The NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform to Support the Analysis of Petascale Environmental Data Collections

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Pugh, T.; Wyborn, L. A.; Porter, D.; Allen, C.; Smillie, J.; Antony, J.; Trenham, C.; Evans, B. J.; Beckett, D.; Erwin, T.; King, E.; Hodge, J.; Woodcock, R.; Fraser, R.; Lescinsky, D. T.

    2014-12-01

    The National Computational Infrastructure (NCI) has co-located a priority set of national data assets within an HPC research platform. This powerful in-situ computational platform has been created to help serve and analyse the massive amounts of data across the spectrum of environmental collections - in particular the climate, observational data and geoscientific domains. This paper examines the infrastructure, innovation and opportunity for this significant research platform. NCI currently manages nationally significant data collections (10+ PB) categorised as 1) earth system sciences, climate and weather model data assets and products, 2) earth and marine observations and products, 3) geosciences, 4) terrestrial ecosystem, 5) water management and hydrology, and 6) astronomy, social science and biosciences. The data is largely sourced from the NCI partners (who include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. By co-locating these large valuable data assets, new opportunities have arisen by harmonising the data collections, making a powerful transdisciplinary research platform. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. New scientific software, cloud-scale techniques, server-side visualisation and data services have been harnessed and integrated into the platform, so that analysis is performed seamlessly across the traditional boundaries of the underlying data domains. Characterisation of the techniques along with performance profiling ensures scalability of each software component, all of which can either be enhanced or replaced through future improvements. A Development-to-Operations (DevOps) framework has also been implemented to manage the scale of the software complexity alone. This ensures that software is both upgradable and maintainable, and can be readily reused with complexly integrated systems and become part of the growing global trusted community tools for cross-disciplinary research.

  6. A Comprehensive, Open-source Platform for Mass Spectrometry-based Glycoproteomics Data Analysis.

    PubMed

    Liu, Gang; Cheng, Kai; Lo, Chi Y; Li, Jun; Qu, Jun; Neelamegham, Sriram

    2017-11-01

    Glycosylation is among the most abundant and diverse protein post-translational modifications (PTMs) identified to date. The structural analysis of this PTM is challenging because of the diverse monosaccharides which are not conserved among organisms, the branched nature of glycans, their isomeric structures, and heterogeneity in the glycan distribution at a given site. Glycoproteomics experiments have adopted the traditional high-throughput LC-MSn proteomics workflow to analyze site-specific glycosylation. However, comprehensive computational platforms for data analyses are scarce. To address this limitation, we present a comprehensive, open-source, modular software for glycoproteomics data analysis called GlycoPAT (GlycoProteomics Analysis Toolbox; freely available from www.VirtualGlycome.org/glycopat). The program includes three major advances: (1) "SmallGlyPep," a minimal linear representation of glycopeptides for MSn data analysis. This format allows facile serial fragmentation of both the peptide backbone and PTM at one or more locations. (2) A novel scoring scheme based on calculation of the "Ensemble Score (ES)," a measure that scores and rank-orders MS/MS spectrum for N- and O-linked glycopeptides using cross-correlation and probability based analyses. (3) A false discovery rate (FDR) calculation scheme where decoy glycopeptides are created by simultaneously scrambling the amino acid sequence and by introducing artificial monosaccharides by perturbing the original sugar mass. Parallel computing facilities and user-friendly GUIs (Graphical User Interfaces) are also provided. GlycoPAT is used to catalogue site-specific glycosylation on simple glycoproteins, standard protein mixtures and human plasma cryoprecipitate samples in three common MS/MS fragmentation modes: CID, HCD and ETD. It is also used to identify 960 unique glycopeptides in cell lysates from prostate cancer cells. The results show that the simultaneous consideration of peptide and glycan fragmentation is necessary for high quality MSn spectrum annotation in CID and HCD fragmentation modes. Additionally, they confirm the suitability of GlycoPAT to analyze shotgun glycoproteomics data. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
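
    The decoy idea in point (3) above can be illustrated with a short conceptual sketch: scramble the peptide backbone and perturb the monosaccharide masses so that decoys are realistic but wrong. This is a conceptual Python sketch, not GlycoPAT's own implementation; the peptide and sugar masses are examples only:

    ```python
    # Conceptual sketch of decoy glycopeptide generation (not GlycoPAT itself):
    # scramble the amino acid order and perturb each monosaccharide mass.
    import random

    def make_decoy(peptide, sugar_masses, max_shift=3.0):
        """Return a decoy (scrambled peptide, perturbed sugar masses)."""
        residues = list(peptide)
        random.shuffle(residues)                 # scramble amino acid sequence
        decoy_peptide = "".join(residues)
        decoy_sugars = [m + random.uniform(-max_shift, max_shift)
                        for m in sugar_masses]   # artificial monosaccharide masses
        return decoy_peptide, decoy_sugars

    # Example: a glycopeptide carrying one HexNAc (~203.08 Da) and one Hex (~162.05 Da)
    print(make_decoy("PEPTIDESK", [203.079, 162.053]))
    ```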

  7. MBAT: a scalable informatics system for unifying digital atlasing workflows.

    PubMed

    Lee, Daren; Ruffins, Seth; Ng, Queenie; Sane, Nikhil; Anderson, Steve; Toga, Arthur

    2010-12-22

    Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continue to advance and grow, searching, referencing, and comparing this data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment to accelerate the workflow to gather, align, and analyze the data. The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend and allow future extensions of the basic workspace functionality. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as supporting multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context. Through its extensible tiered plug-in architecture, MBAT allows researchers to customize all platform components to quickly achieve personalized workflows.

  8. Shipping Science Worldwide with Open Source Containers

    NASA Astrophysics Data System (ADS)

    Molineaux, J. P.; McLaughlin, B. D.; Pilone, D.; Plofchan, P. G.; Murphy, K. J.

    2014-12-01

    Scientific applications often present difficult web-hosting needs. Their compute- and data-intensive nature, as well as an increasing need for high-availability and distribution, combine to create a challenging set of hosting requirements. In the past year, advancements in container-based virtualization and related tooling have offered new lightweight and flexible ways to accommodate diverse applications with all the isolation and portability benefits of traditional virtualization. This session will introduce and demonstrate an open-source, single-interface, Platform-as-a-Service (PaaS) that empowers application developers to seamlessly leverage geographically distributed, public and private compute resources to achieve highly-available, performant hosting for scientific applications.

  9. ImagePy: an open-source, Python-based and platform-independent software package for bioimage analysis.

    PubMed

    Wang, Anliang; Yan, Xiaolong; Wei, Zhijun

    2018-04-27

    This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution is concentrated on facilitating extensibility and interoperability of the software through decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.

  10. 3-D interactive visualisation tools for HI spectral line imaging

    NASA Astrophysics Data System (ADS)

    van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.

    2017-06-01

    Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.

  11. Closed-Loop, Multichannel Experimentation Using the Open-Source NeuroRighter Electrophysiology Platform

    PubMed Central

    Newman, Jonathan P.; Zeller-Townson, Riley; Fong, Ming-Fai; Arcot Desai, Sharanya; Gross, Robert E.; Potter, Steve M.

    2013-01-01

    Single neuron feedback control techniques, such as voltage clamp and dynamic clamp, have enabled numerous advances in our understanding of ion channels, electrochemical signaling, and neural dynamics. Although commercially available multichannel recording and stimulation systems are commonly used for studying neural processing at the network level, they provide little native support for real-time feedback. We developed the open-source NeuroRighter multichannel electrophysiology hardware and software platform for closed-loop multichannel control with a focus on accessibility and low cost. NeuroRighter allows 64 channels of stimulation and recording for around US $10,000, along with the ability to integrate with other software and hardware. Here, we present substantial enhancements to the NeuroRighter platform, including a redesigned desktop application, a new stimulation subsystem allowing arbitrary stimulation patterns, low-latency data servers for accessing data streams, and a new application programming interface (API) for creating closed-loop protocols that can be inserted into NeuroRighter as plugin programs. This greatly simplifies the design of sophisticated real-time experiments without sacrificing the power and speed of a compiled programming language. Here we present a detailed description of NeuroRighter as a stand-alone application, its plugin API, and an extensive set of case studies that highlight the system’s abilities for conducting closed-loop, multichannel interfacing experiments. PMID:23346047

  12. Stewardship and management challenges within a cloud-based open data ecosystem (Invited Paper 211863)

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2017-12-01

    NOAA's Big Data Project is conducting an experiment in the collaborative distribution of open government data to non-governmental cloud-based systems. Through Cooperative Research and Development Agreements signed in 2015 between NOAA and Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium, NOAA is distributing open government data to a wide community of potential users. There are a number of significant advantages related to the use of open data on commercial cloud platforms, but through this experiment NOAA is also discovering significant challenges for those stewarding and maintaining NOAA's data resources in support of users in the wider open data ecosystem. Among the challenges that will be discussed are: the need to provide effective interpretation of the data content to enable their use by data scientists from other expert communities; effective maintenance of Collaborators' open data stores through coordinated publication of new data and new versions of older data; the provenance and verification of open data as authentic NOAA-sourced data across multiple management boundaries and analytical tools; and keeping pace with the accelerating expectations of users with regard to improved quality control, data latency, availability, and discoverability. Suggested strategies to address these challenges will also be described.

  13. Stabilisation problem in biaxial platform

    NASA Astrophysics Data System (ADS)

    Lindner, Tymoteusz; Rybarczyk, Dominik; Wyrwał, Daniel

    2016-12-01

    The article describes an investigation of the rolling-ball stabilization problem on a biaxial platform. The aim of the control system proposed here is to stabilize a ball moving on a plane at an equilibrium point. The authors propose a control algorithm based on cascade PID and compare it with another control method. The article shows the results of the accuracy of ball stabilization and the influence of the applied filter on the signal waveform. The application used to detect the ball position, measured by a digital camera, was written using EmguCV, a cross-platform .NET wrapper for the OpenCV image processing library. The authors used a bipolar stepper motor with a dedicated electronic controller. Data between the computer and the designed controller are exchanged using the RS232 standard. The control stand is based on an ATmega series microcontroller.
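
    As a one-axis illustration of the cascade arrangement described above, an outer loop can turn ball-position error into a velocity setpoint and an inner loop can turn velocity error into a platform tilt command. The sketch below is generic and hedged: the gains, sampling time, and control_step() interface are illustrative, not the authors' implementation:

    ```python
    # One-axis sketch of a cascade PID: position loop feeds a velocity loop,
    # whose output would be sent to the stepper-motor controller (e.g. via RS232).
    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_err = 0.0

        def update(self, error, dt):
            self.integral += error * dt
            derivative = (error - self.prev_err) / dt if dt > 0 else 0.0
            self.prev_err = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    outer = PID(kp=1.5, ki=0.0, kd=0.4)   # position loop (gains illustrative)
    inner = PID(kp=2.0, ki=0.1, kd=0.0)   # velocity loop

    def control_step(x_target, x_meas, v_meas, dt):
        v_setpoint = outer.update(x_target - x_meas, dt)
        return inner.update(v_setpoint - v_meas, dt)  # tilt command

    print(control_step(x_target=0.0, x_meas=0.05, v_meas=0.01, dt=0.02))
    ```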

  14. Postgraduate Students as OER Capacitators

    ERIC Educational Resources Information Center

    King, Thomas William

    2017-01-01

    A comprehensive theoretical, legal and practical basis for OER has been developed over the past fifteen years, supported by the expansion of open source curation platforms and the work of advocacy groups and international bodies. OER's potential has been sufficiently documented; the question remains how best to support, integrate and normalise OER…

  15. Engaging Language Learners through Technology Integration: Theory, Applications, and Outcomes

    ERIC Educational Resources Information Center

    Li, Shuai, Ed.; Swanson, Peter, Ed.

    2014-01-01

    Web 2.0 technologies, open source software platforms, and mobile applications have transformed teaching and learning of second and foreign languages. Language teaching has transitioned from a teacher-centered approach to a student-centered approach through the use of Computer-Assisted Language Learning (CALL) and new teaching approaches.…

  16. SpiceyPy, a Python Wrapper for SPICE

    NASA Astrophysics Data System (ADS)

    Annex, A.

    2017-06-01

    SpiceyPy is an open source Python wrapper for the NAIF SPICE toolkit. It is available for macOS, Linux, and Windows platforms and for Python versions 2.7.x and 3.x as well as Anaconda. SpiceyPy can be installed by running: “pip install spiceypy.”
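
    A minimal usage sketch follows. The kernel path is an assumption (str2et needs a leapseconds kernel such as naif0012.tls loaded first); tkvrsn, furnsh, str2et, and kclear are standard SPICE routines exposed by SpiceyPy:

    ```python
    # Minimal SpiceyPy sketch; the kernel path below is illustrative.
    import spiceypy as spice

    print(spice.tkvrsn("TOOLKIT"))        # underlying SPICE toolkit version

    spice.furnsh("kernels/naif0012.tls")  # load a leapseconds kernel (assumed path)
    et = spice.str2et("2017-06-01T00:00:00")
    print("Ephemeris time:", et)
    spice.kclear()                        # unload all kernels
    ```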

  17. Learning Analytics Platform, towards an Open Scalable Streaming Solution for Education

    ERIC Educational Resources Information Center

    Lewkow, Nicholas; Zimmerman, Neil; Riedesel, Mark; Essa, Alfred

    2015-01-01

    Next generation digital learning environments require delivering "just-in-time feedback" to learners and those who support them. Unlike traditional business intelligence environments, streaming data requires resilient infrastructure that can move data at scale from heterogeneous data sources, process the data quickly for use across…

  18. Integrating an Awareness of Selfhood and Society into Virtual Learning

    ERIC Educational Resources Information Center

    Stricker, Andrew, Ed.; Calongne, Cynthia, Ed.; Truman, Barbara, Ed.; Arenas, Fil, Ed.

    2017-01-01

    Recent technological advances have opened new platforms for learning and teaching. By utilizing virtual spaces, more educational opportunities are created for students who cannot attend a physical classroom environment. "Integrating an Awareness of Selfhood and Society into Virtual Learning" is a pivotal reference source that discusses…

  19. The Federal Geospatial Platform a shared infrastructure for publishing, discovering and exploiting public data and spatial applications.

    NASA Astrophysics Data System (ADS)

    Dabolt, T. O.

    2016-12-01

    The proliferation of open data and data services continues to thrive and is creating new challenges in how researchers, policy analysts and other decision makers can quickly discover and use relevant data. While traditional metadata catalog approaches used by applications such as data.gov prove to be useful starting points for data search, they can quickly frustrate end users who are seeking ways to quickly find and then use data in machine-to-machine environments. The Geospatial Platform is overcoming these obstacles and providing end users and application developers with a richer, more productive user experience. The Geospatial Platform leverages a collection of open source and commercial technology hosted on Amazon Web Services, providing an ecosystem of services delivering trusted, consistent data in open formats to all users as well as a shared infrastructure for federal partners to serve their spatial data assets. It supports a diverse array of communities of practice ranging in topics from the 16 National Geospatial Data Assets Themes to homeland security and climate adaptation. Come learn how you can contribute your data and leverage others', or check it out on your own at https://www.geoplatform.gov/

  20. The GPlates Portal: Cloud-Based Interactive 3D Visualization of Global Geophysical and Geological Data in a Web Browser.

    PubMed

    Müller, R Dietmar; Qin, Xiaodong; Sandwell, David T; Dutkiewicz, Adriana; Williams, Simon E; Flament, Nicolas; Maus, Stefan; Seton, Maria

    2016-01-01

    The pace of scientific discovery is being transformed by the availability of 'big data' and open access, open source software tools. These innovations open up new avenues for how scientists communicate and share data and ideas with each other and with the general public. Here, we describe our efforts to bring to life our studies of the Earth system, both at present day and through deep geological time. The GPlates Portal (portal.gplates.org) is a gateway to a series of virtual globes based on the Cesium Javascript library. The portal allows fast interactive visualization of global geophysical and geological data sets, draped over digital terrain models. The globes use WebGL for hardware-accelerated graphics and are cross-platform and cross-browser compatible with complete camera control. The globes include a visualization of a high-resolution global digital elevation model and the vertical gradient of the global gravity field, highlighting small-scale seafloor fabric such as abyssal hills, fracture zones and seamounts in unprecedented detail. The portal also features globes portraying seafloor geology and a global data set of marine magnetic anomaly identifications. The portal is specifically designed to visualize models of the Earth through geological time. These space-time globes include tectonic reconstructions of the Earth's gravity and magnetic fields, and several models of long-wavelength surface dynamic topography through time, including the interactive plotting of vertical motion histories at selected locations. The globes put the on-the-fly visualization of massive data sets at the fingertips of end-users to stimulate teaching and learning and novel avenues of inquiry.

  2. NASA's Earth Imagery Service as Open Source Software

    NASA Astrophysics Data System (ADS)

    De Cesare, C.; Alarcon, C.; Huang, T.; Roberts, J. T.; Rodriguez, J.; Cechini, M. F.; Boller, R. A.; Baynes, K.

    2016-12-01

    The NASA Global Imagery Browse Service (GIBS) is a software system that provides access to an archive of historical and near-real-time Earth imagery from NASA-supported satellite instruments. The imagery itself is open data, and is accessible via standards such as the Open Geospatial Consortium (OGC)'s Web Map Tile Service (WMTS) protocol. GIBS includes three core software projects: The Imagery Exchange (TIE), OnEarth, and the Meta Raster Format (MRF) project. These projects are developed using a variety of open source software, including: Apache HTTPD, GDAL, Mapserver, Grails, Zookeeper, Eclipse, Maven, git, and Apache Commons. TIE has recently been released for open source, and is now available on GitHub. OnEarth, MRF, and their sub-projects have been on GitHub since 2014, and the MRF project in particular receives many external contributions from the community. Our software has been successful beyond the scope of GIBS: the PO.DAAC State of the Ocean and COVERAGE visualization projects reuse components from OnEarth. The MRF source code has recently been incorporated into GDAL, which is a core library in many widely-used GIS software such as QGIS and GeoServer. This presentation will describe the challenges faced in incorporating open software and open data into GIBS, and also showcase GIBS as a platform on which scientists and the general public can build their own applications.
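
    Because GIBS imagery is exposed through OGC standards such as WMTS, a single tile can be fetched with a plain HTTP request. The URL template, layer name, date, and tile indices below follow the publicly documented GIBS pattern but are assumptions for illustration; they are not taken from the text above:

    ```python
    # Hedged example: fetch one GIBS WMTS tile over HTTP. The layer, date,
    # tile matrix set, and tile indices are illustrative assumptions.
    import requests

    url = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
           "MODIS_Terra_CorrectedReflectance_TrueColor/default/"
           "2016-12-01/250m/3/2/5.jpg")

    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    with open("gibs_tile.jpg", "wb") as f:
        f.write(resp.content)
    print("fetched", len(resp.content), "bytes")
    ```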

  3. Harvesting rockfall hazard evaluation parameters from Google Earth Street View

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Agioutantis, Zacharias; Tripolitsiotis, Achilles; Steiakakis, Chrysanthos; Mertikas, Stelios

    2015-04-01

    Rockfall incidents along highways and railways prove extremely dangerous for properties, infrastructures and human lives. Several qualitative metrics such as the Rockfall Hazard Rating System (RHRS) and the Colorado Rockfall Hazard Rating System (CRHRS) have been established to estimate rockfall potential and provide risk maps in order to control and monitor rockfall incidents. The implementation of such metrics for efficient and reliable risk modeling require accurate knowledge of multi-parametric attributes such as the geological, geotechnical, topographic parameters of the study area. The Missouri Rockfall Hazard Rating System (MORH RS) identifies the most potentially problematic areas using digital video logging for the determination of parameters like slope height and angle, face irregularities, etc. This study aims to harvest in a semi-automated approach geometric and qualitative measures through open source platforms that may provide 3-dimensional views of the areas of interest. More specifically, the Street View platform from Google Maps, is hereby used to provide essential information that can be used towards 3-dimensional reconstruction of slopes along highways. The potential of image capturing along a programmable virtual route to provide the input data for photogrammetric processing is also evaluated. Moreover, qualitative characterization of the geological and geotechnical status, based on the Street View images, is performed. These attributes are then integrated to deliver a GIS-based rockfall hazard map. The 3-dimensional models are compared to actual photogrammetric measures in a rockfall prone area in Crete, Greece while in-situ geotechnical characterization is also used to compare and validate the hazard risk. This work is considered as the first step towards the exploitation of open source platforms to improve road safety and the development of an operational system where authorized agencies (i.e., civil protection) will be able to acquire near-real time hazard maps based on video images retrieved either by open source platforms, operational unmanned aerial vehicles, and/or simple video recordings from users. This work has been performed under the framework of the "Cooperation 2011" project ISTRIA (11_SYN_9_13989) funded from the Operational Program "Competitiveness and Entrepreneurship" (co-funded by the European Regional Development Fund (ERDF)) and managed by the Greek General Secretariat for Research and Technology.

  4. Creating a Rackspace and NASA Nebula compatible cloud using the OpenStack project (Invited)

    NASA Astrophysics Data System (ADS)

    Clark, R.

    2010-12-01

    NASA and Rackspace have both provided technology to the OpenStack project, which allows anyone to create a private Infrastructure as a Service (IaaS) cloud using open source software and commodity hardware. OpenStack is designed and developed completely in the open and with an open governance process. NASA donated Nova, which powers the compute portion of the NASA Nebula Cloud Computing Platform, and Rackspace donated Swift, which powers Rackspace Cloud Files. The project is now in continuous development by NASA, Rackspace, and hundreds of other participants. When you create a private cloud using OpenStack, you will have the ability to easily interact with your private cloud, a government cloud, and an ecosystem of public cloud providers, using the same API.
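    To make the "same API" point concrete, here is a minimal sketch using the present-day openstacksdk client, which post-dates this 2010 record and is used here purely for illustration; the cloud profile, image, flavor, and network names are assumptions.

```python
# Minimal sketch (illustrative only): talking to any OpenStack cloud with the
# modern openstacksdk client. Profile, image, flavor, and network names are
# assumptions and refer to entries configured in clouds.yaml or the cloud itself.
import openstack

conn = openstack.connect(cloud="my-private-cloud")  # assumed cloud profile

# List existing compute instances.
for server in conn.compute.servers():
    print(server.name, server.status)

# Launch a small instance from an existing image on an existing network.
image = conn.compute.find_image("cirros")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")
server = conn.compute.create_server(
    name="demo-instance",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
conn.compute.wait_for_server(server)
print("booted:", server.name)
```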

  5. Using OpenOffice as a Portable Interface to JAVA-Based Applications

    NASA Astrophysics Data System (ADS)

    Comeau, T.; Garrett, B.; Richon, J.; Romelfanger, F.

    2004-07-01

    STScI previously used Microsoft Word and Microsoft Access, a Sybase ODBC driver, and the Adobe Acrobat PDF writer, along with a substantial amount of Visual Basic, to generate a variety of documents for the internal Space Telescope Grants Management System (STGMS). While investigating an upgrade to Microsoft Office XP, we began considering alternatives, ultimately selecting an open source product, OpenOffice.org. This reduces the total number of products required to operate the internal STGMS, simplifies the build system, and opens the possibility of moving to a non-Windows platform. We describe the experience of moving from Microsoft Office to OpenOffice.org, and our other internal uses of OpenOffice.org in our development environment.

  6. Technological choices for mobile clinical applications.

    PubMed

    Ehrler, Frederic; Issom, David; Lovis, Christian

    2011-01-01

    The rise of cheaper and more powerful mobile devices makes them a new and attractive platform for clinical applications. The interaction paradigm and portability of these devices facilitate bedside human-machine interactions. Better accessibility to information and decision support anywhere in the hospital improves the efficiency and the safety of care processes. In this study, we attempt to find out which Operating System (OS) and Software Development Kit (SDK) are the most appropriate to support the development of clinical applications on mobile devices. The Android platform is a Linux-based, open source platform that has many advantages. Two main SDKs are available on this platform: the native Android SDK and the Adobe Flex SDK. Both of them have interesting features, but the latter has been preferred due to its portability at comparable performance and ease of development.

  7. MetaNET--a web-accessible interactive platform for biological metabolic network analysis.

    PubMed

    Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael

    2014-01-01

    Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base to implement models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open source, user-friendly, platform-independent and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET is a web-accessible platform that incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include (i) optimization of an objective function for the wild-type strain and gene/catalyst/reaction knock-out/knock-down analysis using flux balance analysis; (ii) flux variability analysis; (iii) chemical species participation; (iv) identification of cycles and extreme paths; and (v) choke point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow system and the Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net, provides a user-friendly, rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, and the ability to run different tools simultaneously using pre-defined or user-created custom workflows.
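    The kinds of analyses listed in (i) and (ii) can be reproduced locally with the open-source COBRApy library; the sketch below is an illustration of the technique rather than MetaNET code, and the bundled "textbook" E. coli core model and gene id b1241 are example choices.

```python
# Illustrative sketch (not MetaNET code): flux balance analysis with COBRApy.
from cobra.io import load_model
from cobra.flux_analysis import flux_variability_analysis

model = load_model("textbook")            # small bundled E. coli core model

# (i) Optimize the biomass objective for the wild-type strain.
wild_type = model.optimize()
print("wild-type growth rate:", wild_type.objective_value)

# Gene knock-out analysis: disable gene b1241 and re-optimize.
with model:                               # model changes are reverted on exit
    model.genes.get_by_id("b1241").knock_out()
    knockout = model.optimize()
    print("b1241 knock-out growth rate:", knockout.objective_value)

# (ii) Flux variability analysis for the first ten reactions.
fva = flux_variability_analysis(model, reaction_list=model.reactions[:10])
print(fva)
```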

  8. Remote stereoscopic video play platform for naked eyes based on the Android system

    NASA Astrophysics Data System (ADS)

    Jia, Changxin; Sang, Xinzhu; Liu, Jing; Cheng, Mingsheng

    2014-11-01

    With the significant improvement in people's quality of life, traditional 2D video technology can no longer meet the demand for better video quality, which has led to the rapid development of 3D video technology. At the same time, people want to watch 3D video on portable devices. To achieve this, we set up a remote stereoscopic video play platform. The platform consists of a server and clients. The server is used for the transmission of different video formats, and the client is responsible for receiving the remote video for subsequent decoding and pixel restructuring. We utilize and improve Live555 as the video transmission server. Live555 is a cross-platform open source project which provides solutions for streaming media, such as the RTSP protocol, and supports transmission of multiple video formats. At the receiving end, we use our laboratory's own player. This Android player, which has all the basic functions of ordinary players and is able to play normal 2D video, is the basic structure for redevelopment, and RTSP is implemented into this structure for communication. In order to achieve stereoscopic display, pixels need to be rearranged in the player's decoding part. The decoding part is native code called through the JNI interface, so that video frames can be extracted more efficiently. The video formats that we process are side-by-side (left-right), top-and-bottom, and nine-grid layouts. In the design and development, a number of key technologies from Android application development have been employed, including wireless transmission, pixel restructuring and JNI calls. After some updates and optimizations, the video player can play remote 3D video well anytime and anywhere and meet users' requirements.

  9. Millstone: software for multiplex microbial genome analysis and engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodman, Daniel B.; Kuznetsov, Gleb; Lajoie, Marc J.

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. Here, we describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  10. Millstone: software for multiplex microbial genome analysis and engineering.

    PubMed

    Goodman, Daniel B; Kuznetsov, Gleb; Lajoie, Marc J; Ahern, Brian W; Napolitano, Michael G; Chen, Kevin Y; Chen, Changping; Church, George M

    2017-05-25

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. We describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  11. Millstone: software for multiplex microbial genome analysis and engineering

    DOE PAGES

    Goodman, Daniel B.; Kuznetsov, Gleb; Lajoie, Marc J.; ...

    2017-05-25

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. Here, we describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  12. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium and high-resolution imagery with the aim to extract vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: developer-oriented source code and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models such as the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number: 312972) (RASOR: www.rasor-project.eu, grant number: 606888)
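    As a toy example of the kind of low-level building block such tools combine (not SENSUM code), the following sketch reads a multispectral image with GDAL and derives a crude non-vegetation mask as a built-up-area proxy; the file name, band order, and NDVI threshold are assumptions.

```python
# Toy sketch (not SENSUM code): a crude built-up proxy from a multispectral
# image using GDAL and NumPy. File name, band order (red=3, NIR=4) and the
# NDVI threshold of 0.2 are illustrative assumptions.
import numpy as np
from osgeo import gdal

ds = gdal.Open("scene.tif")                       # assumed input raster
red = ds.GetRasterBand(3).ReadAsArray().astype("float64")
nir = ds.GetRasterBand(4).ReadAsArray().astype("float64")

ndvi = (nir - red) / np.maximum(nir + red, 1e-9)  # avoid division by zero
built_up_proxy = ndvi < 0.2                       # non-vegetated pixels

print("non-vegetated fraction:", built_up_proxy.mean())
```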

  13. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  14. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    NASA Astrophysics Data System (ADS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.
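    The portability argument made above (the same OpenCL kernel running on NVidia GPUs, AMD GPUs, or CPUs) can be illustrated with a minimal pyopencl sketch; this is not goMC code, just a trivial kernel that scales a dose array on whichever OpenCL device is available.

```python
# Minimal sketch (not goMC code): the same OpenCL kernel runs unchanged on a
# GPU or CPU device, illustrating the cross-platform point made above.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()            # picks any available OpenCL device
queue = cl.CommandQueue(ctx)

dose = np.random.rand(1_000_000).astype(np.float32)
mf = cl.mem_flags
dose_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=dose)

kernel_src = """
__kernel void scale_dose(__global float *dose, const float factor) {
    int i = get_global_id(0);
    dose[i] *= factor;
}
"""
program = cl.Program(ctx, kernel_src).build()
program.scale_dose(queue, dose.shape, None, dose_buf, np.float32(1.1))

cl.enqueue_copy(queue, dose, dose_buf)
print("device:", ctx.devices[0].name, "max scaled dose:", dose.max())
```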

  15. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    PubMed

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.

  16. Review assessment support in Open Journal System using TextRank

    NASA Astrophysics Data System (ADS)

    Manalu, S. R.; Willy; Sundjaja, A. M.; Noerlina

    2017-01-01

    In this paper, a review assessment support in Open Journal System (OJS) using TextRank is proposed. OJS is an open-source journal management platform that provides a streamlined journal publishing workflow. TextRank is an unsupervised, graph-based ranking model commonly used as extractive auto summarization of text documents. This study applies the TextRank algorithm to summarize 50 article reviews from an OJS-based international journal. The resulting summaries are formed using the most representative sentences extracted from the reviews. The summaries are then used to help OJS editors in assessing a review’s quality.
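    A compact sketch of the extractive TextRank idea described above (an illustration, not the authors' OJS plugin): sentences become graph nodes, TF-IDF cosine similarity supplies edge weights, and PageRank scores select the most representative sentences.

```python
# Illustrative TextRank-style extractive summarizer (not the OJS plugin code).
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def summarize(sentences, n_sentences=2):
    tfidf = TfidfVectorizer().fit_transform(sentences)
    similarity = cosine_similarity(tfidf)            # sentence-to-sentence graph
    graph = nx.from_numpy_array(similarity)
    scores = nx.pagerank(graph)                      # TextRank = PageRank here
    ranked = sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)
    keep = sorted(ranked[:n_sentences])              # preserve original order
    return " ".join(sentences[i] for i in keep)


review = [
    "The methodology section lacks detail on sampling.",
    "Results are clearly presented and well organized.",
    "The literature review omits several recent studies.",
    "Overall the manuscript needs minor revisions before acceptance.",
]
print(summarize(review))
```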

  17. SunPy: Python for Solar Physics

    NASA Astrophysics Data System (ADS)

    Bobra, M.; Inglis, A. R.; Mumford, S.; Christe, S.; Freij, N.; Hewett, R.; Ireland, J.; Martinez Oliveros, J. C.; Reardon, K.; Savage, S. L.; Shih, A. Y.; Pérez-Suárez, D.

    2017-12-01

    SunPy is a community-developed open-source software library for solar physics. It is written in Python, a free, cross-platform, general-purpose, high-level programming language which is being increasingly adopted throughout the scientific community. SunPy aims to provide the software for obtaining and analyzing solar and heliospheric data. This poster introduces a new major release, SunPy version 0.8. The first major new feature introduced is Fido, the new primary interface to download data. It provides a consistent and powerful search interface to all major data providers, including the VSO and the JSOC, as well as individual data sources such as GOES XRS time series. It is also easy to add new data sources as they become available, e.g. DKIST. The second major new feature is the SunPy coordinate framework. This provides a powerful way of representing coordinates, allowing simple and intuitive conversion between coordinate systems and viewpoints of different instruments (e.g., Solar Orbiter and the Parker Solar Probe), including transformation to astrophysical frames like ICRS. Other new features include new timeseries capabilities with better support for concatenation and metadata, updated documentation, and the example gallery. SunPy is distributed through pip and conda and all of its code is publicly available (sunpy.org).
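    The two headline features above can be sketched in a few lines of SunPy; the time range and instrument below are illustrative choices, not taken from the original record.

```python
# Minimal sketch of the SunPy 0.8+ features mentioned above; the time range
# and instrument are illustrative choices.
import astropy.units as u
from astropy.coordinates import SkyCoord
from sunpy.net import Fido, attrs as a
from sunpy.coordinates import frames

# Fido: one search interface across data providers (here, GOES XRS timeseries).
result = Fido.search(a.Time("2017-09-06", "2017-09-07"), a.Instrument("XRS"))
print(result)
files = Fido.fetch(result)                 # download the matched files

# Coordinate framework: a helioprojective point tied to an observer and time.
point = SkyCoord(100 * u.arcsec, 200 * u.arcsec,
                 obstime="2017-09-06", observer="earth",
                 frame=frames.Helioprojective)
print(point.transform_to(frames.HeliographicStonyhurst))
```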

  18. Development of a Web-Based Visualization Platform for Climate Research Using Google Earth

    NASA Technical Reports Server (NTRS)

    Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue

    2011-01-01

    Recently, it has become easier to access climate data from satellites, ground measurements, and models from various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform to acquire distributed and heterogeneous scientific data and to render processed images from a single access point for climate studies. This paper documents the design and implementation of a Web-based visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations from a number of data sources using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various open data-sharing technologies, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The visualization capability of integrating various measurements into GE extends dramatically the awareness and visibility of scientific results. Using the embedded geographic information in GE, the designed system improves our understanding of the relationships of different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.

  19. Evaporation from Lake Mead, Arizona and Nevada, 1997-99

    USGS Publications Warehouse

    Westenburg, Craig L.; DeMeo, Guy A.; Tanko, Daron J.

    2006-01-01

    Lake Mead is one of a series of large Colorado River reservoirs operated and maintained by the Bureau of Reclamation. The Colorado River system of reservoirs and diversions is an important source of water for millions of people in seven Western States and Mexico. The U.S. Geological Survey, in cooperation with the Bureau of Reclamation, conducted a study from 1997 to 1999 to estimate evaporation from Lake Mead. For this study, micrometeorological and hydrologic data were collected continually from instrumented platforms deployed at four locations on the lake, open-water areas of Boulder Basin, Virgin Basin, and Overton Arm and a protected cove in Boulder Basin. Data collected at the platforms were used to estimate Lake Mead evaporation by solving an energy-budget equation. The average annual evaporation rate at open-water stations from January 1998 to December 1999 was 7.5 feet. Because the spatial variation of monthly and annual evaporation rates was minimal for the open-water stations, a single open-water station in Boulder Basin would provide data that are adequate to estimate evaporation from Lake Mead.

  20. Numerical tools to predict the environmental loads for offshore structures under extreme weather conditions

    NASA Astrophysics Data System (ADS)

    Wu, Yanling

    2018-05-01

    In this paper, extreme waves were generated using the open source computational fluid dynamics (CFD) tools OpenFOAM and Waves2FOAM, with linear and nonlinear NewWave inputs, and were used to conduct numerical simulations of the wave impact process. Numerical tools based on first-order (with and without stretching) and second-order NewWave are investigated. Simulations to predict the force loading on an offshore platform under extreme weather conditions are implemented and compared.

  1. Oscillation Baselining and Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PNNL developed a new tool for oscillation analysis and baselining. The tool was developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, "Suite of open-source applications and models for advanced synchrophasor analysis") and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).

  2. FIRE: an open-software suite for real-time 2D/3D image registration for image guided radiotherapy research

    NASA Astrophysics Data System (ADS)

    Furtado, H.; Gendrin, C.; Spoerk, J.; Steiner, E.; Underwood, T.; Kuenzler, T.; Georg, D.; Birkfellner, W.

    2016-03-01

    Radiotherapy treatments have changed at a tremendously rapid pace. The dose delivered to the tumor has escalated while organs at risk (OARs) are better spared. The impact of tumor motion during dose delivery has become higher due to very steep dose gradients. Intra-fractional tumor motion has to be managed adequately to reduce errors in dose delivery. For tumors with large motion, such as tumors in the lung, tracking is an approach that can reduce position uncertainty. Tumor tracking approaches range from purely image-intensity-based techniques to motion estimation based on surrogate tracking. Research efforts are often based on custom-designed software platforms which take too much time and effort to develop. To address this challenge we have developed an open software platform especially focusing on tumor motion management. FIRE is a freely available open-source software platform. The core method for tumor tracking is purely intensity-based 2D/3D registration. The platform is written in C++ using the Qt framework for the user interface. The performance-critical methods are implemented on the graphics processor using the CUDA extension. One registration can be as fast as 90 ms (11 Hz). This is suitable for tracking tumors moving due to respiration (~0.3 Hz) or heartbeat (~1 Hz). Apart from focusing on high performance, the platform is designed to be flexible and easy to use. Current use cases range from tracking feasibility studies to patient positioning and method validation. Such a framework has the potential of enabling the research community to rapidly perform patient studies or try new methods.

  3. openECA Detailed Design Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    This document describes the functional and non-functional requirements for: (1) the openECA platform; and (2) the included analytic systems that will validate the operational readiness and performance of the openECA platform and provide out-of-box value to those that implement the openECA platform with an initial collection of analytics.

  4. Getting started with Open-Hardware: Development and Control of Microfluidic Devices

    PubMed Central

    da Costa, Eric Tavares; Mora, Maria F.; Willis, Peter A.; do Lago, Claudimir L.; Jiao, Hong; Garcia, Carlos D.

    2014-01-01

    Understanding basic concepts of electronics and computer programming allows researchers to get the most out of the equipment found in their laboratories. Although a number of platforms have been specifically designed for the general public and are supported by a vast array of on-line tutorials, this subject is not normally included in university chemistry curricula. Aiming to provide the basic concepts of hardware and software, this article is focused on the design and use of a simple module to control a series of PDMS-based valves. The module is based on a low-cost microprocessor (Teensy) and open-source software (Arduino). The microvalves were fabricated using thin sheets of PDMS and patterned using CO2 laser engraving, providing a simple and efficient way to fabricate devices without the traditional photolithographic process or facilities. Synchronization of valve control enabled the development of two simple devices to perform injection (1.6 ± 0.4 μL/stroke) and mixing of different solutions. Furthermore, a practical demonstration of the utility of this system for microscale chemical sample handling and analysis was achieved by performing an on-chip acid-base titration, followed by conductivity detection with an open-source low-cost detection system. Overall, the system provided a very reproducible (98%) platform to perform fluid delivery at the microfluidic scale. PMID:24823494
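    A hypothetical sketch of how such a Teensy-based valve controller might be driven from a host computer over USB serial with pyserial follows; the port name and the one-byte open/close command protocol are assumptions for illustration, not the firmware described in the article.

```python
# Hypothetical sketch: driving a Teensy-based valve controller over USB serial.
# The port name and the 'O'/'C' + valve-index protocol are assumptions.
import time
import serial

with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as teensy:

    def set_valve(index, open_valve):
        # Send one command byte followed by the valve index byte.
        command = b"O" if open_valve else b"C"
        teensy.write(command + bytes([index]))

    # Actuate three valves in sequence to emulate a peristaltic injection stroke.
    for _ in range(10):
        for valve in range(3):
            set_valve(valve, True)
            time.sleep(0.1)
            set_valve(valve, False)
```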

  5. Global Health Innovation Technology Models.

    PubMed

    Harding, Kimberly

    2016-01-01

    Chronic technology and business process disparities between High Income, Low Middle Income and Low Income (HIC, LMIC, LIC) research collaborators directly prevent the growth of sustainable Global Health innovation for infectious and rare diseases. There is a need for an Open Source-Open Science Architecture Framework to bridge this divide. We are proposing such a framework for consideration by the Global Health community, by utilizing a hybrid approach of integrating agnostic Open Source technology and healthcare interoperability standards and Total Quality Management principles. We will validate this architecture framework through our programme called Project Orchid. Project Orchid is a conceptual Clinical Intelligence Exchange and Virtual Innovation platform utilizing this approach to support clinical innovation efforts for multi-national collaboration that can be locally sustainable for LIC and LMIC research cohorts. The goal is to enable LIC and LMIC research organizations to accelerate their clinical trial process maturity in the field of drug discovery, population health innovation initiatives and public domain knowledge networks. When sponsored, this concept will be tested by 12 confirmed clinical research and public health organizations in six countries. The potential impact of this platform is reduced drug discovery and public health innovation lag time and improved clinical trial interventions, due to reliable clinical intelligence and bio-surveillance across all phases of the clinical innovation process.

  6. Global Health Innovation Technology Models

    PubMed Central

    Harding, Kimberly

    2016-01-01

    Chronic technology and business process disparities between High Income, Low Middle Income and Low Income (HIC, LMIC, LIC) research collaborators directly prevent the growth of sustainable Global Health innovation for infectious and rare diseases. There is a need for an Open Source-Open Science Architecture Framework to bridge this divide. We are proposing such a framework for consideration by the Global Health community, by utilizing a hybrid approach of integrating agnostic Open Source technology and healthcare interoperability standards and Total Quality Management principles. We will validate this architecture framework through our programme called Project Orchid. Project Orchid is a conceptual Clinical Intelligence Exchange and Virtual Innovation platform utilizing this approach to support clinical innovation efforts for multi-national collaboration that can be locally sustainable for LIC and LMIC research cohorts. The goal is to enable LIC and LMIC research organizations to accelerate their clinical trial process maturity in the field of drug discovery, population health innovation initiatives and public domain knowledge networks. When sponsored, this concept will be tested by 12 confirmed clinical research and public health organizations in six countries. The potential impact of this platform is reduced drug discovery and public health innovation lag time and improved clinical trial interventions, due to reliable clinical intelligence and bio-surveillance across all phases of the clinical innovation process.

  7. Open-Source GIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vatsavai, Raju; Burk, Thomas E; Lime, Steve

    2012-01-01

    The components making up an Open Source GIS are explained in this chapter. A map server (Sect. 30.1) can broadly be defined as a software platform for dynamically generating spatially referenced digital map products. The University of Minnesota MapServer (UMN MapServer) is one such system. Its basic features are visualization, overlay, and query. Section 30.2 names and explains many of the geospatial open source libraries, such as GDAL and OGR. The other libraries are FDO, JTS, GEOS, JCS, MetaCRS, and GPSBabel. The application examples include derived GIS software and data format conversions. Quantum GIS, its origin, and its applications are explained in detail in Sect. 30.3. The features include a rich GUI, attribute tables, vector symbols, labeling, editing functions, projections, georeferencing, GPS support, analysis, and Web Map Server functionality. Future developments will address mobile applications, 3-D, and multithreading. The origins of PostgreSQL are outlined and PostGIS is discussed in detail in Sect. 30.4. It extends PostgreSQL by implementing the Simple Feature standard. Section 30.5 details the most important open source licenses, such as the GPL, the LGPL, the MIT License, and the BSD License, as well as the role of the Creative Commons.
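    The libraries named in Sect. 30.2 can be driven directly from Python; the sketch below (input file names are assumptions) opens a raster with GDAL and iterates a vector layer with OGR.

```python
# Minimal sketch of using the GDAL/OGR libraries discussed above; the input
# file names are assumptions for illustration.
from osgeo import gdal, ogr

# Raster access with GDAL: size, georeferencing, and band statistics.
raster = gdal.Open("elevation.tif")
print(raster.RasterXSize, raster.RasterYSize, raster.GetGeoTransform())
band = raster.GetRasterBand(1)
print("min/max:", band.ComputeRasterMinMax())

# Vector access with OGR: iterate features in a shapefile layer.
vector = ogr.Open("parcels.shp")
layer = vector.GetLayer()
for feature in layer:
    geom = feature.GetGeometryRef()
    print(feature.GetFID(), geom.GetGeometryName(), geom.Area())
```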

  8. Integrating an Educational Game in Moodle LMS

    ERIC Educational Resources Information Center

    Minovic, Miroslav; Milovanovic, Milos; Minovic, Jelena; Starcevic, Dusan

    2012-01-01

    The authors present a learning platform based on a computer game. Learning games combine two industries: education and entertainment, which is often called "Edutainment." The game is realized as a strategic game (similar to Risk[TM]), implemented as a module for Moodle CMS, utilizing Java Applet technology. Moodle is an open-source course…

  9. Automatic System for Producing and Distributing Lecture Recordings and Livestreams Using Opencast Matterhorn

    ERIC Educational Resources Information Center

    Jonach, Rafael; Ebner, Martin; Grigoriadis, Ypatios

    2015-01-01

    Lectures of courses at universities are increasingly being recorded and offered through various distribution channels to support students' learning activities. This research work aims to create an automatic system for producing and distributing high quality lecture recordings. Opencast Matterhorn is an open source platform for automated video…

  10. Developing Digital Competences Using an Educational Software. A Pedagogical Research

    ERIC Educational Resources Information Center

    Magdas, Ioana; Bontea, Timea

    2011-01-01

    Many teachers and people in educational institutions consider it necessary to prepare children for living in a computerized society. The Internet offers an incredible number of opportunities for teachers. The Web offer of e-learning open source platforms reached an impressive configuration. In this article, we present an educational software for…

  11. Leaf-GP: an open and automated software application for measuring growth phenotypes for arabidopsis and wheat.

    PubMed

    Zhou, Ji; Applegate, Christopher; Alonso, Albor Dobon; Reynolds, Daniel; Orford, Simon; Mackiewicz, Michal; Griffiths, Simon; Penfield, Steven; Pullen, Nick

    2017-01-01

    Plants demonstrate dynamic growth phenotypes that are determined by genetic and environmental factors. Phenotypic analysis of growth features over time is a key approach to understand how plants interact with environmental change as well as respond to different treatments. Although the importance of measuring dynamic growth traits is widely recognised, available open software tools are limited in terms of batch image processing, multiple-trait analyses, software usability and cross-referencing results between experiments, making automated phenotypic analysis problematic. Here, we present Leaf-GP (Growth Phenotypes), an easy-to-use and open software application that can be executed on different computing platforms. To facilitate diverse scientific communities, we provide three software versions, including a graphical user interface (GUI) for personal computer (PC) users, a command-line interface for high-performance computer (HPC) users, and a well-commented interactive Jupyter Notebook (also known as the iPython Notebook) for computational biologists and computer scientists. The software is capable of extracting multiple growth traits automatically from large image datasets. We have utilised it in Arabidopsis thaliana and wheat (Triticum aestivum) growth studies at the Norwich Research Park (NRP, UK). By quantifying a number of growth phenotypes over time, we have identified diverse plant growth patterns between different genotypes under several experimental conditions. As Leaf-GP has been evaluated with noisy image series acquired by different imaging devices (e.g. smartphones and digital cameras) and still produced reliable biological outputs, we believe that our automated analysis workflow and customised computer vision based feature extraction software implementation can serve a broader plant research community in their growth and development studies. Furthermore, because we implemented Leaf-GP based on open Python-based computer vision, image analysis and machine learning libraries, we believe that our software not only can contribute to biological research, but also demonstrates how to utilise existing open numeric and scientific libraries (e.g. Scikit-image, OpenCV, SciPy and Scikit-learn) to build sound plant phenomics analytic solutions, in an efficient and effective way. Leaf-GP is a sophisticated software application that provides three approaches to quantify growth phenotypes from large image series. We demonstrate its usefulness and high accuracy based on two biological applications: (1) the quantification of growth traits for Arabidopsis genotypes under two temperature conditions; and (2) measuring wheat growth in the glasshouse over time. The software is easy to use and cross-platform, and can be executed on Mac OS, Windows and HPC, with open Python-based scientific libraries preinstalled. Our work demonstrates how to integrate computer vision, image analysis, machine learning and software engineering in plant phenomics software implementation. To serve the plant research community, our modular source code, detailed comments, executables (.exe for Windows; .app for Mac), and experimental results are freely available at https://github.com/Crop-Phenomics-Group/Leaf-GP/releases.
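    To give a flavour of the kind of open scikit-image feature extraction such a pipeline builds on (a toy example, not Leaf-GP's own code), the sketch below segments green rosette pixels and reports projected leaf area per plant; the input file name and HSV thresholds are assumptions.

```python
# Toy sketch (not Leaf-GP code): segment green plant pixels and measure
# projected leaf area. File name and HSV thresholds are illustrative assumptions.
import numpy as np
from skimage import io, color, measure, morphology

image = io.imread("tray_image.png")[:, :, :3]
hsv = color.rgb2hsv(image)

# Green pixels: hue roughly between 0.2 and 0.45, with some saturation.
mask = (hsv[..., 0] > 0.2) & (hsv[..., 0] < 0.45) & (hsv[..., 1] > 0.25)
mask = morphology.remove_small_objects(mask, min_size=200)

labels = measure.label(mask)
for region in measure.regionprops(labels):
    print(f"plant {region.label}: area={region.area} px, "
          f"perimeter={region.perimeter:.1f} px")
```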

  12. VAGUE: a graphical user interface for the Velvet assembler.

    PubMed

    Powell, David R; Seemann, Torsten

    2013-01-15

    Velvet is a popular open-source de novo genome assembly software tool, which is run from the Unix command line. Most of the problems experienced by new users of Velvet revolve around constructing syntactically and semantically correct command lines, getting input files into acceptable formats and assessing the output. Here, we present Velvet Assembler Graphical User Environment (VAGUE), a multi-platform graphical front-end for Velvet. VAGUE aims to make sequence assembly accessible to a wider audience and to facilitate better usage amongst existing users of Velvet. VAGUE is implemented in JRuby and targets the Java Virtual Machine. It is available under an open-source GPLv2 licence from http://www.vicbioinformatics.com/. torsten.seemann@monash.edu.

  13. Development of Thread-compatible Open Source Stack

    NASA Astrophysics Data System (ADS)

    Zimmermann, Lukas; Mars, Nidhal; Schappacher, Manuel; Sikora, Axel

    2017-07-01

    The Thread protocol is a recent development based on 6LoWPAN (IPv6 over IEEE 802.15.4), but with extensions towards a more media-independent approach, which additionally promises true interoperability. To evaluate and analyse the operation of a Thread network, an existing open source 6LoWPAN stack for embedded devices (emb::6) has been extended to comply with the Thread specification. The implementation covers Mesh Link Establishment (MLE) and network-layer functionality, as well as a 6LoWPAN mesh-under routing mechanism based on MAC short addresses. The development has been verified on a virtualization platform and allows dynamic establishment of network topologies based on Thread's partitioning algorithm.

  14. Spatial DBMS Architecture for a Free and Open Source BIM

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Valari, E.; Karachaliou, E.; Stylianidis, E.

    2017-08-01

    Recent research on the field of Building Information Modelling (BIM) technology, revealed that except of a few, accessible and free BIM viewers there is a lack of Free & Open Source Software (FOSS) BIM software for the complete BIM process. With this in mind and considering BIM as the technological advancement of Computer-Aided Design (CAD) systems, the current work proposes the use of a FOSS CAD software in order to extend its capabilities and transform it gradually into a FOSS BIM platform. Towards this undertaking, a first approach on developing a spatial Database Management System (DBMS) able to store, organize and manage the overall amount of information within a single application, is presented.

  15. Caffeine-catalyzed gels.

    PubMed

    DiCiccio, Angela M; Lee, Young-Ah Lucy; Glettig, Dean L; Walton, Elizabeth S E; de la Serna, Eva L; Montgomery, Veronica A; Grant, Tyler M; Langer, Robert; Traverso, Giovanni

    2018-07-01

    Covalently cross-linked gels are utilized in a broad range of biomedical applications, though their synthesis often compromises easy implementation. Cross-linking reactions commonly utilize catalysts or conditions that can damage biologics and sensitive compounds, producing materials that require extensive post-processing to achieve acceptable biocompatibility. As an alternative, we report a batch synthesis platform to produce covalently cross-linked materials appropriate for direct biomedical application, enabled by green chemistry and commonly available food-grade ingredients. Using caffeine, a mild base, to catalyze the anhydrous carboxylate ring-opening of diglycidyl-ether functionalized monomers with citric acid as a tri-functional crosslinking agent, we introduce a novel poly(ester-ether) gel synthesis platform. We demonstrate that biocompatible Caffeine Catalyzed Gels (CCGs) exhibit dynamic physical, chemical, and mechanical properties, which can be tailored in shape, surface texture, solvent response, cargo release, shear and tensile strength, among other potential attributes. The demonstrated versatility, low cost and facile synthesis of these CCGs renders them appropriate for a broad range of customized engineering applications including drug delivery constructs, tissue engineering scaffolds, and medical devices. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Using Citygml to Deploy Smart-City Services for Urban Ecosystems

    NASA Astrophysics Data System (ADS)

    Prandi, F.; De Amicis, R.; Piffer, S.; Soave, M.; Cadzow, S.; Gonzalez Boix, E.; D'Hont, E.

    2013-05-01

    The rapid technological evolution characterizing all the disciplines involved in the broad concept of smart cities has become a key factor in triggering true user-driven innovation. In this context, 3D city models will play an increasingly important role in our daily lives and become an essential part of the modern city information infrastructure (Spatial Data Infrastructure). The goal of this paper is to introduce the methodology and implementations of the i-SCOPE (interoperable Smart City services through an Open Platform for urban Ecosystems) project, together with key technologies and open standards. Based on interoperable 3D CityGML UIMs, the aim of i-SCOPE is to deliver an open platform on top of which it is possible to develop various "smart city" services within different domains. Moreover, i-SCOPE tackles issues that transcend the purely technological domain, including social and environmental aspects. Indeed, several tasks, including citizen awareness, crowd-sourced and volunteer-based data collection, as well as privacy issues concerning the people involved, should be considered.

  17. Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli

    2011-01-01

    The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.
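    Consuming data published through one of the services named above can be sketched in a few lines of Python; the OPeNDAP URL, variable name, and coordinate names below are placeholders, not an actual LANCE AIRS NRT endpoint.

```python
# Minimal sketch of consuming an OPeNDAP service with xarray. The URL,
# variable name, and coordinate names are placeholders for illustration.
import xarray as xr

OPENDAP_URL = "https://example.gov/opendap/AIRS_NRT/granule.nc"  # placeholder

ds = xr.open_dataset(OPENDAP_URL)        # lazy, server-side access over OPeNDAP
print(ds)                                 # inspect variables and metadata

# Subset a field over a region of interest; only that slice is transferred.
subset = ds["SurfAirTemp"].sel(lat=slice(30, 50), lon=slice(-110, -80))
print(subset.mean().values)
```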

  18. Open Ephys electroencephalography (Open Ephys  +  EEG): a modular, low-cost, open-source solution to human neural recording

    NASA Astrophysics Data System (ADS)

    Black, Christopher; Voigts, Jakob; Agrawal, Uday; Ladow, Max; Santoyo, Juan; Moore, Christopher; Jones, Stephanie

    2017-06-01

    Objective. Electroencephalography (EEG) offers a unique opportunity to study human neural activity non-invasively with millisecond resolution using minimal equipment in or outside of a lab setting. EEG can be combined with a number of techniques for closed-loop experiments, where external devices are driven by specific neural signals. However, reliable, commercially available EEG systems are expensive, often making them impractical for individual use and research development. Moreover, by design, a majority of these systems cannot be easily altered to the specification needed by the end user. We focused on mitigating these issues by implementing open-source tools to develop a new EEG platform to drive down research costs and promote collaboration and innovation. Approach. Here, we present methods to expand the open-source electrophysiology system, Open Ephys (www.openephys.org), to include human EEG recordings. We describe the equipment and protocol necessary to interface various EEG caps with the Open Ephys acquisition board, and detail methods for processing data. We present applications of Open Ephys  +  EEG as a research tool and discuss how this innovative EEG technology lays a framework for improved closed-loop paradigms and novel brain-computer interface experiments. Main results. The Open Ephys  +  EEG system can record reliable human EEG data, as well as human EMG data. A side-by-side comparison of eyes closed 8-14 Hz activity between the Open Ephys  +  EEG system and the Brainvision ActiCHamp EEG system showed similar average power and signal to noise. Significance. Open Ephys  +  EEG enables users to acquire high-quality human EEG data comparable to that of commercially available systems, while maintaining the price point and extensibility inherent to open-source systems.

  19. Open Ephys electroencephalography (Open Ephys  +  EEG): a modular, low-cost, open-source solution to human neural recording.

    PubMed

    Black, Christopher; Voigts, Jakob; Agrawal, Uday; Ladow, Max; Santoyo, Juan; Moore, Christopher; Jones, Stephanie

    2017-06-01

    Electroencephalography (EEG) offers a unique opportunity to study human neural activity non-invasively with millisecond resolution using minimal equipment in or outside of a lab setting. EEG can be combined with a number of techniques for closed-loop experiments, where external devices are driven by specific neural signals. However, reliable, commercially available EEG systems are expensive, often making them impractical for individual use and research development. Moreover, by design, a majority of these systems cannot be easily altered to the specification needed by the end user. We focused on mitigating these issues by implementing open-source tools to develop a new EEG platform to drive down research costs and promote collaboration and innovation. Here, we present methods to expand the open-source electrophysiology system, Open Ephys (www.openephys.org), to include human EEG recordings. We describe the equipment and protocol necessary to interface various EEG caps with the Open Ephys acquisition board, and detail methods for processing data. We present applications of Open Ephys  +  EEG as a research tool and discuss how this innovative EEG technology lays a framework for improved closed-loop paradigms and novel brain-computer interface experiments. The Open Ephys  +  EEG system can record reliable human EEG data, as well as human EMG data. A side-by-side comparison of eyes closed 8-14 Hz activity between the Open Ephys  +  EEG system and the Brainvision ActiCHamp EEG system showed similar average power and signal to noise. Open Ephys  +  EEG enables users to acquire high-quality human EEG data comparable to that of commercially available systems, while maintaining the price point and extensibility inherent to open-source systems.

  20. Optical properties of potential condensates in exoplanetary atmospheres

    NASA Astrophysics Data System (ADS)

    Kitzmann, Daniel; Heng, Kevin

    2018-03-01

    The prevalence of clouds in currently observable exoplanetary atmospheres motivates the compilation and calculation of their optical properties. First, we present a new open-source Mie scattering code known as LX-MIE, which is able to consider large size parameters (~10^7) using a single computational treatment. We validate LX-MIE against the classical MIEVO code as well as previous studies. Secondly, we embark on an expanded survey of the published literature for both the real and imaginary components of the refractive indices of 32 condensate species. As much as possible, we rely on experimental measurements of the refractive indices and resort to obtaining the real from the imaginary component (or vice versa), via the Kramers-Kronig relation, only in the absence of data. We use these refractive indices as input for LX-MIE to compute the absorption, scattering and extinction efficiencies of all 32 condensate species. Finally, we use a three-parameter function to provide convenient fits to the shape of the extinction efficiency curve. We show that the errors associated with these simple fits in the Wide Field Camera 3 (WFC3), J, H, and K wavebands are ~10 per cent. These fits allow for the extinction cross-section or opacity of the condensate species to be easily included in retrieval analyses of transmission spectra. We discuss prospects for future experimental work. The compilation of the optical constants and LX-MIE is publicly available as part of the open-source Exoclime Simulation Platform (http://www.exoclime.org).
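    For reference, the Kramers-Kronig relation invoked above, which recovers the real part n of the refractive index from the imaginary part k, can be written in one standard form (stated here for context, with P denoting the Cauchy principal value):

```latex
n(\omega) = 1 + \frac{2}{\pi}\,\mathcal{P}\!\int_{0}^{\infty}
            \frac{\omega'\,k(\omega')}{\omega'^{2} - \omega^{2}}\,\mathrm{d}\omega'
```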

  1. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data

    PubMed Central

    Moore, Eider B; Poliakov, Andrew V; Lincoln, Peter; Brinkley, James F

    2007-01-01

    Background: Three-dimensional (3-D) visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality, to general purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. Results: We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization, Java and Java3D for true cross-platform portability, one-click installation and startup, integrated data management to help organize large studies, extensibility through plugins, transparent remote visualization, and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. Conclusion: MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine. PMID:17937818

  2. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data.

    PubMed

    Moore, Eider B; Poliakov, Andrew V; Lincoln, Peter; Brinkley, James F

    2007-10-15

    Three-dimensional (3-D) visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality, to general purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization, Java and Java3D for true cross-platform portability, one-click installation and startup, integrated data management to help organize large studies, extensibility through plugins, transparent remote visualization, and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine.

  3. An open source hydroeconomic model for California's water supply system: PyVIN

    NASA Astrophysics Data System (ADS)

    Dogan, M. S.; White, E.; Herman, J. D.; Hart, Q.; Merz, J.; Medellin-Azuara, J.; Lund, J. R.

    2016-12-01

    Models help operators and decision makers explore and compare different management and policy alternatives, better allocate scarce resources, and predict the future behavior of existing or proposed water systems. Hydroeconomic models are useful tools to increase benefits or decrease costs of managing water. Bringing hydrology and economics together, these models provide a framework for different disciplines that share similar objectives. This work proposes a new model to evaluate operation and adaptation strategies under existing and future hydrologic conditions for California's interconnected water system. This model combines the network structure of CALVIN, a statewide optimization model for California's water infrastructure, with an open source solver written in the Python programming language. With the flexibility of the model, reservoir operations (including water supply and hydropower), groundwater pumping, and Delta water operations and requirements can now be better represented. Given time series of hydrologic inputs to the model, typical outputs include urban, agricultural and wildlife refuge water deliveries and shortage costs, conjunctive use of surface and groundwater systems, and insights into policy and management decisions, such as capacity expansion and groundwater management policies. Water market operations are also represented in the model, allocating water from lower-valued users to higher-valued users. PyVIN serves as a cross-platform, extensible model to evaluate systemwide water operations. PyVIN separates data from the model structure, enabling the model to be easily applied to other parts of the world where water is a scarce resource.
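
    The network-and-solver approach described above can be illustrated with a minimal sketch: a single reservoir release allocated between two demand nodes so that total scarcity cost is minimized, posed as a linear program. This is a toy illustration of the general idea, not PyVIN's network, data, or code; the node names, demands, and costs are invented for the example.

```python
# Minimal hydroeconomic allocation sketch: split a limited reservoir release
# between two demand nodes to minimize total scarcity cost.
# Toy data only; not PyVIN's network, costs, or API.
from scipy.optimize import linprog

supply = 100.0                                          # available release (volume units)
demand = {"agriculture": 80.0, "urban": 60.0}
scarcity_cost = {"agriculture": 50.0, "urban": 200.0}   # cost per unit of shortage

# Decision variables: deliveries x_ag, x_urban.
# Minimizing scarcity cost = sum(cost_i * (demand_i - x_i)) is equivalent to
# maximizing sum(cost_i * x_i), i.e. minimizing its negative.
c = [-scarcity_cost["agriculture"], -scarcity_cost["urban"]]

A_ub = [[1.0, 1.0]]          # total deliveries cannot exceed the available supply
b_ub = [supply]
bounds = [(0.0, demand["agriculture"]), (0.0, demand["urban"])]  # 0 <= x_i <= demand_i

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x_ag, x_urban = res.x
print(f"agriculture delivery: {x_ag:.1f}, urban delivery: {x_urban:.1f}")
print(f"shortages: ag {demand['agriculture'] - x_ag:.1f}, urban {demand['urban'] - x_urban:.1f}")
```

    In this toy problem the higher-valued urban demand is satisfied first, mirroring the market-like reallocation from lower-valued to higher-valued uses mentioned in the abstract.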

  4. Real-Time Motion Capture Toolbox (RTMocap): an open-source code for recording 3-D motion kinematics to study action-effect anticipations during motor and social interactions.

    PubMed

    Lewkowicz, Daniel; Delevoye-Turrell, Yvonne

    2016-03-01

    We present here a toolbox for the real-time motion capture of biological movements that runs in the cross-platform MATLAB environment (The MathWorks, Inc., Natick, MA). It provides instantaneous processing of the 3-D movement coordinates of up to 20 markers at a single instant. Available functions include (1) the setting of reference positions, areas, and trajectories of interest; (2) recording of the 3-D coordinates for each marker over the trial duration; and (3) the detection of events to use as triggers for external reinforcers (e.g., lights, sounds, or odors). Through fast online communication between the hardware controller and RTMocap, automatic trial selection is possible by means of either a preset or an adaptive criterion. Rapid preprocessing of signals is also provided, which includes artifact rejection, filtering, spline interpolation, and averaging. A key example is detailed, and three typical variations are developed (1) to provide a clear understanding of the importance of real-time control for 3-D motion in cognitive sciences and (2) to present users with simple lines of code that can be used as starting points for customizing experiments using the simple MATLAB syntax. RTMocap is freely available (http://sites.google.com/site/RTMocap/) under the GNU public license for noncommercial use and open-source development, together with sample data and extensive documentation.
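
    As a rough illustration of the kind of real-time event detection listed above (item 3), the sketch below fires a callback the first time a tracked marker enters a predefined area of interest. It is written in Python rather than RTMocap's MATLAB, and the coordinates, radius, and callback are invented for the example; it is not RTMocap code.

```python
# Sketch of real-time event detection: fire a callback the first time a tracked
# marker enters a spherical area of interest. Python illustration with invented
# coordinates; not RTMocap's MATLAB code.
import numpy as np

class AreaTrigger:
    """Fire a callback when a marker enters a spherical area of interest."""

    def __init__(self, center, radius, callback):
        self.center = np.asarray(center, dtype=float)
        self.radius = float(radius)
        self.callback = callback
        self.inside = False

    def process_frame(self, marker_xyz):
        """Call once per incoming frame with one marker's 3-D coordinates."""
        now_inside = np.linalg.norm(np.asarray(marker_xyz) - self.center) <= self.radius
        if now_inside and not self.inside:
            self.callback()               # e.g. trigger an external reinforcer here
        self.inside = now_inside

if __name__ == "__main__":
    trigger = AreaTrigger(center=[0.30, 0.10, 0.95], radius=0.05,
                          callback=lambda: print("event: reinforcer triggered"))
    # Simulated stream of frames: the marker descends toward the area of interest.
    for step in np.linspace(0.0, 1.0, 20):
        marker = np.array([0.30, 0.10, 1.10 - 0.20 * step])
        trigger.process_frame(marker)
```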

  5. Building an Open-source Simulation Platform of Acoustic Radiation Force-based Breast Elastography

    PubMed Central

    Wang, Yu; Peng, Bo; Jiang, Jingfeng

    2017-01-01

    Ultrasound-based elastography including strain elastography (SE), acoustic radiation force impulse (ARFI) imaging, point shear wave elastography (pSWE) and supersonic shear imaging (SSI) have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. “ground truth”) in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity – one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data, were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% compared to the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. The elastic contrast values and visual observation show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. In summary, our initial results were consistent with our expectations and what has been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway to perform many elastographic simulations in a transparent manner, thereby promoting collaborative developments. PMID:28075330

  6. Building an open-source simulation platform of acoustic radiation force-based breast elastography

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Peng, Bo; Jiang, Jingfeng

    2017-03-01

    Ultrasound-based elastography including strain elastography, acoustic radiation force impulse (ARFI) imaging, point shear wave elastography and supersonic shear imaging (SSI) have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. ‘ground truth’) in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity—one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data, were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% compared to the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. The elastic contrast values and visual observation show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. In summary, our initial results were consistent with our expectations and what has been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway to perform many elastographic simulations in a transparent manner, thereby promoting collaborative developments.
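
    The platform described in the two records above chains four independent tools (Field II, Tetgen, FEBio, VTK) through a simple message-passing mechanism. The sketch below shows one generic way such a loosely coupled pipeline can be orchestrated from Python, with each stage reading the previous stage's output file; the stage bodies here are stand-ins that only write placeholder files and are not the platform's actual wrappers or file formats.

```python
# Generic sketch of a loosely coupled simulation pipeline in which stages
# communicate only through files ("messages"). Stage bodies are stand-ins
# that write placeholder files; they are not the elastography platform's
# actual Tetgen/FEBio/Field II/VTK wrappers.
from pathlib import Path

def generate_mesh(workdir: Path) -> Path:
    out = workdir / "phantom_mesh.msg"
    out.write_text("placeholder mesh (produced by a mesh generator in the real pipeline)")
    return out

def solve_fem(mesh_file: Path) -> Path:
    out = mesh_file.with_name("displacements.msg")
    out.write_text(f"placeholder FEM solution derived from {mesh_file.name}")
    return out

def simulate_ultrasound(displacement_file: Path) -> Path:
    out = displacement_file.with_name("rf_frames.msg")
    out.write_text(f"placeholder RF data derived from {displacement_file.name}")
    return out

def postprocess(rf_file: Path) -> None:
    print("visualizing", rf_file.name, ":", rf_file.read_text())

def run_pipeline(workdir: Path) -> None:
    workdir.mkdir(parents=True, exist_ok=True)
    mesh = generate_mesh(workdir)              # mesh-generation stage
    displacements = solve_fem(mesh)            # finite-element stage
    rf = simulate_ultrasound(displacements)    # ultrasound-simulation stage
    postprocess(rf)                            # visualization/post-processing stage

if __name__ == "__main__":
    run_pipeline(Path("sim_run_001"))
```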

  7. Progress in Open-World, Integrative, Collaborative Science Data Platforms (Invited)

    NASA Astrophysics Data System (ADS)

    Fox, P. A.

    2013-12-01

    As collaborative, or network science spreads into more Earth and space science fields, both the participants and their funders have expressed a very strong desire for highly functional data and information capabilities that are a) easy to use, b) integrated in a variety of ways, c) leverage prior investments and keep pace with rapid technical change, and d) are not expensive or time-consuming to build or maintain. In response, and based on our accumulated experience over the last decade and a maturing of several key technical approaches, we have adapted, extended, and integrated several open source applications and frameworks that handle major portions of functionality for these platforms. At minimum, these functions include: an object-type repository, collaboration tools, an ability to identify and manage all key entities in the platform, and an integrated portal to manage diverse content and applications, with varied access levels and privacy options. At a conceptual level, science networks (even small ones) deal with people, and many intellectual artifacts produced or consumed in research, organizational and/or outreach activities, as well as the relations among them. Increasingly these networks are modeled as knowledge networks, i.e. graphs with named and typed relations among the 'nodes'. Nodes can be people, organizations, datasets, events, presentations, publications, videos, meetings, reports, groups, and more. In this heterogeneous ecosystem, it is also important to use a set of common informatics approaches to co-design and co-evolve the needed science data platforms based on what real people want to use them for. In this contribution, we present our methods and results for information modeling, adapting, integrating and evolving a networked data science and information architecture based on several open source technologies (Drupal, VIVO, the Comprehensive Knowledge Archive Network (CKAN), and the Global Handle System (GHS)). In particular we present the instantiation of this data platform for the Deep Carbon Observatory, including key functional and non-functional attributes and how the smart mediation among the components is modeled and managed, and discuss its general applicability.
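
    A minimal sketch of the "knowledge network" idea described above: nodes of different types (people, organizations, datasets, presentations) connected by named, typed relations stored as simple triples. The node names and relation types below are invented for illustration and do not come from the Deep Carbon Observatory platform itself.

```python
# Minimal typed knowledge-graph sketch: nodes with types, named relations as triples.
# Names and relation types are invented for illustration.
from collections import defaultdict

node_types = {
    "Deep Carbon Observatory": "organization",
    "J. Smith": "person",
    "carbon-flux-2013": "dataset",
    "AGU 2013 talk": "presentation",
}

# (subject, relation, object) triples
triples = [
    ("J. Smith", "memberOf", "Deep Carbon Observatory"),
    ("J. Smith", "authorOf", "carbon-flux-2013"),
    ("AGU 2013 talk", "presents", "carbon-flux-2013"),
    ("Deep Carbon Observatory", "publishes", "carbon-flux-2013"),
]

# Index outgoing relations per node for simple traversal queries.
outgoing = defaultdict(list)
for subject, relation, obj in triples:
    outgoing[subject].append((relation, obj))

def describe(node):
    print(f"{node} ({node_types.get(node, 'unknown type')})")
    for relation, obj in outgoing[node]:
        print(f"  --{relation}--> {obj} ({node_types.get(obj, 'unknown type')})")

describe("J. Smith")
```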

  8. APEX/SPIN: a free test platform to measure speech intelligibility.

    PubMed

    Francart, Tom; Hofmann, Michael; Vanthornhout, Jonas; Van Deun, Lieselot; van Wieringen, Astrid; Wouters, Jan

    2017-02-01

    Measuring speech intelligibility in quiet and noise is important in clinical practice and research. An easy-to-use free software platform for conducting speech tests is presented, called APEX/SPIN. The APEX/SPIN platform allows the use of any speech material in combination with any noise. A graphical user interface provides control over a large range of parameters, such as number of loudspeakers, signal-to-noise ratio and parameters of the procedure. An easy-to-use graphical interface is provided for calibration and storage of calibration values. To validate the platform, perception of words in quiet and of sentences in noise was measured both with APEX/SPIN and with an audiometer and CD player, which is a conventional setup in current clinical practice. Five normal-hearing listeners participated in the experimental evaluation. Speech perception results were similar for the APEX/SPIN platform and conventional procedures. APEX/SPIN is a freely available and open source platform that allows the administration of all kinds of custom speech perception tests and procedures.
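
    Speech-in-noise procedures of the kind this platform administers commonly use an adaptive rule that lowers the signal-to-noise ratio after a correct response and raises it after an incorrect one, converging on the speech reception threshold. The sketch below shows a generic 1-up/1-down staircase of that kind; the step size, starting SNR, trial count, and threshold estimate are illustrative choices, not APEX/SPIN defaults.

```python
# Generic 1-up/1-down adaptive SNR staircase for a speech-in-noise test.
# Step size, starting SNR, trial count, and scoring are illustrative only;
# they are not APEX/SPIN defaults.
import random

def adaptive_snr_track(score_trial, start_snr_db=0.0, step_db=2.0, n_trials=20):
    """score_trial(snr_db) -> True if the listener repeated the item correctly."""
    snr = start_snr_db
    presented = []
    for _ in range(n_trials):
        presented.append(snr)
        snr += -step_db if score_trial(snr) else step_db  # harder after correct, easier after error
    # Crude threshold estimate: mean of the last ten presentation levels.
    return sum(presented[-10:]) / len(presented[-10:])

def simulated_listener(snr_db):
    # Toy psychometric function: performance improves smoothly with SNR.
    p_correct = 1.0 / (1.0 + 10 ** (-(snr_db + 5.0) / 4.0))
    return random.random() < p_correct

if __name__ == "__main__":
    print(f"estimated speech reception threshold: {adaptive_snr_track(simulated_listener):.1f} dB SNR")
```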

  9. The use of open source bioinformatics tools to dissect transcriptomic data.

    PubMed

    Nitsche, Benjamin M; Ram, Arthur F J; Meyer, Vera

    2012-01-01

    Microarrays are a valuable technology to study fungal physiology on a transcriptomic level. Various microarray platforms are available comprising both single and two channel arrays. Despite different technologies, preprocessing of microarray data generally includes quality control, background correction, normalization, and summarization of probe level data. Subsequently, depending on the experimental design, diverse statistical analyses can be performed, including the identification of differentially expressed genes and the construction of gene coexpression networks. We describe how Bioconductor, a collection of open source and open development packages for the statistical programming language R, can be used for dissecting microarray data. We provide fundamental details that facilitate the process of getting started with R and Bioconductor. Using two publicly available microarray datasets from Aspergillus niger, we give detailed protocols on how to identify differentially expressed genes and how to construct gene coexpression networks.

  10. Waggle: A Framework for Intelligent Attentive Sensing and Actuation

    NASA Astrophysics Data System (ADS)

    Sankaran, R.; Jacob, R. L.; Beckman, P. H.; Catlett, C. E.; Keahey, K.

    2014-12-01

    Advances in sensor-driven computation and computationally steered sensing will greatly enable future research in fields including environmental and atmospheric sciences. We will present "Waggle," an open-source hardware and software infrastructure developed with two goals: (1) reducing the separation and latency between sensing and computing and (2) improving the reliability and longevity of sensing-actuation platforms in challenging and costly deployments. Inspired by "deep-space probe" systems, the Waggle platform design includes features that can support longitudinal studies, deployments with varying communication links, and remote management capabilities. Waggle lowers the barrier for scientists to incorporate real-time data from their sensors into their computations and to manipulate the sensors or provide feedback through actuators. A standardized software and hardware design allows quick addition of new sensors/actuators and associated software in the nodes and enables them to be coupled with computational codes both in situ and on external compute infrastructure. The Waggle framework currently drives the deployment of two observational systems - a portable and self-sufficient weather platform for study of small-scale effects in Chicago's urban core and an open-ended distributed instrument in Chicago that aims to support several research pursuits across a broad range of disciplines including urban planning, microbiology and computer science. Built around open-source software, hardware, and Linux OS, the Waggle system comprises two components - the Waggle field-node and the Waggle cloud-computing infrastructure. The Waggle field-node affords a modular, scalable, fault-tolerant, secure, and extensible platform for hosting sensors and actuators in the field. It supports in situ computation and data storage, and integration with cloud-computing infrastructure. The Waggle cloud infrastructure is designed with the goal of scaling to several hundreds of thousands of Waggle nodes. It supports aggregating data from sensors hosted by the nodes, staging computation, relaying feedback to the nodes and serving data to end-users. We will discuss the Waggle design principles and their applicability to various observational research pursuits, and demonstrate its capabilities.
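
    At its core, the attentive sensing-and-actuation idea is a loop in which a field node samples its sensors, does some in-situ computation, and only publishes (or actuates) when something interesting happens. The sketch below is a generic Python rendering of that loop with invented sensor readings, thresholds, and a stand-in publish function; it does not use Waggle's actual node software or APIs.

```python
# Generic sketch of an attentive sensing/actuation loop on a field node:
# sample sensors, compute locally, and publish only on interesting events.
# Sensor values, thresholds, and the publish function are invented for illustration.
import random
import time

ALERT_THRESHOLD_C = 35.0     # publish only when local analysis flags an event

def read_temperature_c():
    # Stand-in for a real sensor driver.
    return 20.0 + random.gauss(0.0, 8.0)

def publish(payload: dict):
    # Stand-in for sending data to cloud infrastructure over whatever link is available.
    print("publishing:", payload)

def main(samples=30, interval_s=0.0):
    window = []
    for _ in range(samples):
        reading = read_temperature_c()
        window = (window + [reading])[-5:]          # keep a short local history
        smoothed = sum(window) / len(window)        # in-situ computation
        if smoothed > ALERT_THRESHOLD_C:
            publish({"event": "high_temperature", "smoothed_c": round(smoothed, 2)})
        time.sleep(interval_s)

if __name__ == "__main__":
    main()
```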

  11. Open Source Cloud-Based Technologies for BIM

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere so they can view online 3D models using browsers. Nowadays, Cloud computing is progressively engaged in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups involved in complicated Architectural, Engineering and Construction (AEC) projects. In addition, the development of Open Source Software (OSS) has been growing rapidly, and its use is becoming increasingly widespread. Although BIM and Cloud technologies are extensively known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used; from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages and will be distributed freely to a large community of professionals. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  12. Seqcrawler: biological data indexing and browsing platform.

    PubMed

    Sallou, Olivier; Bretaudeau, Anthony; Roult, Aurelien

    2012-07-24

    Seqcrawler takes its roots in software like SRS or Lucegene. It provides an indexing platform to ease the search of data and meta-data in biological banks and it can scale to face the current flow of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their data, there is a lack of free and open source solutions to browse one's own set of data with a flexible query system and able to scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians to search their meta-data but also to build a larger information system with custom subsets of data. The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with 3 index shards (pieces of index) hosting ~400 million sequence records (whole GenBank, UniProt, PDB and others) for a total size of 600 GB in a fault tolerant architecture (high-availability). It has also been successfully integrated with software to add extra meta-data from BLAST results to enhance users' result analysis. Seqcrawler provides a complete open source search and store solution for labs or platforms needing to manage large amounts of data/meta-data with a flexible and customizable web interface. All components (search engine, visualization and data storage), though independent, share a common and coherent data system that can be queried with a simple HTTP interface. The solution scales easily and can also provide a high availability infrastructure.
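
    Since the abstract notes that all components share a data system "that can be queried with a simple HTTP interface," the sketch below shows what such a query might look like from Python. The host name, endpoint path, and parameter names are hypothetical placeholders for illustration, not Seqcrawler's documented API.

```python
# Hypothetical HTTP query against a local Seqcrawler-style index.
# Host, path, and parameter names are placeholders, not the documented API.
import json
import urllib.parse
import urllib.request

def search(term: str, base_url: str = "http://localhost:8080/search", rows: int = 10):
    query = urllib.parse.urlencode({"q": term, "rows": rows})
    with urllib.request.urlopen(f"{base_url}?{query}") as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    for hit in search("BRCA1").get("hits", []):
        print(hit.get("id"), hit.get("description"))
```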

  13. Survey of Non-Rigid Registration Tools in Medicine.

    PubMed

    Keszei, András P; Berkels, Benjamin; Deserno, Thomas M

    2017-02-01

    We catalogue available software solutions for non-rigid image registration to support scientists in selecting suitable tools for specific medical registration purposes. Registration tools were identified using non-systematic search in PubMed, Web of Science, IEEE Xplore® Digital Library, Google Scholar, and through references in identified sources (n = 22). Exclusions are due to unavailability or inappropriateness. The remaining (n = 18) tools were classified by (i) access and technology, (ii) interfaces and application, (iii) living community, (iv) supported file formats, and (v) types of registration methodologies emphasizing the similarity measures implemented. Out of the 18 tools, (i) 12 are open source, 8 are released under a permissive free license, which imposes the least restrictions on the use and further development of the tool, 8 provide graphical processing unit (GPU) support; (ii) 7 are built on software platforms, 5 were developed for brain image registration; (iii) 6 are under active development but only 3 have had their last update in 2015 or 2016; (iv) 16 support the Analyze format, while 7 file formats can be read with only one of the tools; and (v) 6 provide multiple registration methods and 6 provide landmark-based registration methods. Based on open source, licensing, GPU support, active community, several file formats, algorithms, and similarity measures, the tools Elastix and Plastimatch are chosen for the platform ITK and without platform requirements, respectively. Researchers in medical image analysis already have a large choice of registration tools freely available. However, the most recently published algorithms may not yet be included in these tools.

  14. Innovation in engineering education through computer assisted learning and virtual university model

    NASA Astrophysics Data System (ADS)

    Raicu, A.; Raicu, G.

    2015-11-01

    The paper presents the most important aspects of innovation in Engineering Education using Computer Assisted Learning. The authors propose to raise the quality of Engineering Education programs of study to European standards. The use of computer assisted learning methodologies in all studies is becoming an important resource in Higher Education. We intend to improve the concept of e-Learning using virtual terminals, online support and assistance for special training through live seminars and interactive labs, in order to develop a virtual university model. We intend to encourage computer assisted learning and innovation as sources of competitive advantage, to permit vision and learning analysis, and to identify new sources of technology and ideas. Our work is based on university datasets collected during the last fifteen years using several e-Learning systems. At Constanta Maritime University (CMU), the use of eLearning and Knowledge Management Services (KMS) is very important, and we apply them effectively to achieve strategic objectives such as collaboration, sharing and good practice. We have had experience in this field since 2000, using Moodle as the KMS in our university. The term KMS can be associated with Open Source Software, Open Standards, Open Protocols and Open Knowledge licenses, initiatives and policies. The CMU Virtual Campus today has over 12,500 active users. Another experience of the authors is the implementation of the MariTrainer Wiki educational platform, based on Dokeos and DekiWiki, under the MARICOMP and MEP Leonardo da Vinci projects. We also present in this paper a case study under the EU-funded POSDRU project, in which the authors implemented another educational platform in technological high schools in Romania, used by over 1,000 teachers. Based on these large datasets, the study tries to improve the concept of e-Learning teaching using these new technologies. The new concept presented in this paper is that teaching and learning will be interactive and live. The new and modern techniques include flexible learning courses and the production of learning demonstrators and testing. All information on the virtual educational platform remains openly available, and communication between participants continues after graduation, so we can speak of creating and maintaining a community of graduates and a partnership with them. Every European university should have a department that aims to provide computer assisted learning, supporting knowledge creation through learning, capture and explication, sharing and collaborative communication, access, use and reuse, and knowledge archiving.

  15. Normalization and standardization of electronic health records for high-throughput phenotyping: the SHARPn consortium

    PubMed Central

    Pathak, Jyotishman; Bailey, Kent R; Beebe, Calvin E; Bethard, Steven; Carrell, David S; Chen, Pei J; Dligach, Dmitriy; Endle, Cory M; Hart, Lacey A; Haug, Peter J; Huff, Stanley M; Kaggal, Vinod C; Li, Dingcheng; Liu, Hongfang; Marchant, Kyle; Masanz, James; Miller, Timothy; Oniki, Thomas A; Palmer, Martha; Peterson, Kevin J; Rea, Susan; Savova, Guergana K; Stancl, Craig R; Sohn, Sunghwan; Solbrig, Harold R; Suesse, Dale B; Tao, Cui; Taylor, David P; Westberg, Les; Wu, Stephen; Zhuo, Ning; Chute, Christopher G

    2013-01-01

    Research objective: To develop scalable informatics infrastructure for normalization of both structured and unstructured electronic health record (EHR) data into a unified, concept-based model for high-throughput phenotype extraction. Materials and methods: Software tools and applications were developed to extract information from EHRs. Representative and convenience samples of both structured and unstructured data from two EHR systems—Mayo Clinic and Intermountain Healthcare—were used for development and validation. Extracted information was standardized and normalized to meaningful use (MU) conformant terminology and value set standards using Clinical Element Models (CEMs). These resources were used to demonstrate semi-automatic execution of MU clinical-quality measures modeled using the Quality Data Model (QDM) and an open-source rules engine. Results: Using CEMs and open-source natural language processing and terminology services engines—namely, Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) and Common Terminology Services (CTS2)—we developed a data-normalization platform that ensures data security, end-to-end connectivity, and reliable data flow within and across institutions. We demonstrated the applicability of this platform by executing a QDM-based MU quality measure that determines the percentage of patients between 18 and 75 years with diabetes whose most recent low-density lipoprotein cholesterol test result during the measurement year was <100 mg/dL on a randomly selected cohort of 273 Mayo Clinic patients. The platform identified 21 and 18 patients for the denominator and numerator of the quality measure, respectively. Validation results indicate that all identified patients meet the QDM-based criteria. Conclusions: End-to-end automated systems for extracting clinical information from diverse EHR systems require extensive use of standardized vocabularies and terminologies, as well as robust information models for storing, discovering, and processing that information. This study demonstrates the application of modular and open-source resources for enabling secondary use of EHR data through normalization into standards-based, comparable, and consistent format for high-throughput phenotyping to identify patient cohorts. PMID:24190931
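
    Once the EHR data have been normalized, the quality measure executed in this study (patients aged 18 to 75 with diabetes whose most recent LDL result in the measurement year was below 100 mg/dL) reduces to a straightforward filter over patient records. The sketch below shows that denominator/numerator logic on a toy normalized record structure; the field names and example data are invented, and this is not the QDM rules-engine implementation used by SHARPn.

```python
# Toy denominator/numerator logic for the diabetes LDL quality measure described above,
# applied to already-normalized patient records. Field names and data are invented;
# this is not the QDM/rules-engine implementation used by SHARPn.
from datetime import date

patients = [
    {"id": 1, "age": 64, "has_diabetes": True,
     "ldl_results": [(date(2012, 3, 1), 130.0), (date(2012, 11, 5), 92.0)]},
    {"id": 2, "age": 70, "has_diabetes": True,
     "ldl_results": [(date(2012, 6, 12), 118.0)]},
    {"id": 3, "age": 45, "has_diabetes": False,
     "ldl_results": [(date(2012, 2, 2), 95.0)]},
]

MEASUREMENT_YEAR = 2012

def in_denominator(p):
    return p["has_diabetes"] and 18 <= p["age"] <= 75

def in_numerator(p):
    results = [(d, v) for d, v in p["ldl_results"] if d.year == MEASUREMENT_YEAR]
    if not results:
        return False
    most_recent_value = max(results)[1]   # latest result in the measurement year
    return most_recent_value < 100.0

denominator = [p for p in patients if in_denominator(p)]
numerator = [p for p in denominator if in_numerator(p)]
print(f"denominator: {len(denominator)}, numerator: {len(numerator)}")
```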

  16. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  17. Analysis of vehicular traffic flow in the major areas of Kuala Lumpur utilizing open-traffic

    NASA Astrophysics Data System (ADS)

    Manogaran, Saargunawathy; Ali, Muhammad; Yusof, Kamaludin Mohamad; Suhaili, Ramdhan

    2017-09-01

    Vehicular traffic congestion occurs when a large number of drivers crowd the road and traffic does not flow smoothly. Traffic congestion causes chaos on the road and interrupts users' daily activities. Time spent on the road has negative effects on productivity, social behavior, the environment and the economy. Congestion worsens and leads to havoc during emergencies such as floods, accidents and road maintenance, when traffic flow behavior is unpredictable and uncontrollable. Real-time and historical traffic data are critical inputs for most traffic flow analysis applications. Researchers attempt to predict traffic using simulations, as no exact model of traffic flow exists due to its high complexity. Open Traffic is an open source platform for traffic data analysis linked to OpenStreetMap (OSM). This research aims to study and understand the Open Traffic platform. The real-time traffic flow pattern in the Kuala Lumpur area was successfully extracted and analyzed using Open Traffic. It was observed that congestion occurs on every major road in Kuala Lumpur, most of it due to offices and economic and commercial centers during rush hours. On some roads congestion occurs at night due to tourism activities.

  18. Swan: A tool for porting CUDA programs to OpenCL

    NASA Astrophysics Data System (ADS)

    Harvey, M. J.; De Fabritiis, G.

    2011-04-01

    The use of modern, high-performance graphical processing units (GPUs) for acceleration of scientific computation has been widely reported. The majority of this work has used the CUDA programming model supported exclusively by GPUs manufactured by NVIDIA. An industry standardisation effort has recently produced the OpenCL specification for GPU programming. This offers the benefits of hardware-independence and reduced dependence on proprietary tool-chains. Here we describe a source-to-source translation tool, "Swan", for facilitating the conversion of an existing CUDA code to use the OpenCL model, as a means to aid programmers experienced with CUDA in evaluating OpenCL and alternative hardware. While the performance of equivalent OpenCL and CUDA code on fixed hardware should be comparable, we find that a real-world CUDA application ported to OpenCL exhibits an overall 50% increase in runtime, a reduction in performance attributable to the immaturity of contemporary compilers. The ported application is shown to have platform independence, running on both NVIDIA and AMD GPUs without modification. We conclude that OpenCL is a viable platform for developing portable GPU applications but that the more mature CUDA tools continue to provide best performance.
    Program summary
    Program title: Swan
    Catalogue identifier: AEIH_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIH_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU Public License version 2
    No. of lines in distributed program, including test data, etc.: 17 736
    No. of bytes in distributed program, including test data, etc.: 131 177
    Distribution format: tar.gz
    Programming language: C
    Computer: PC
    Operating system: Linux
    RAM: 256 Mbytes
    Classification: 6.5
    External routines: NVIDIA CUDA, OpenCL
    Nature of problem: Graphical Processing Units (GPUs) from NVIDIA are preferentially programmed with the proprietary CUDA programming toolkit. An alternative programming model promoted as an industry standard, OpenCL, provides similar capabilities to CUDA and is also supported on non-NVIDIA hardware (including multicore x86 CPUs, AMD GPUs and IBM Cell processors). The adaptation of a program from CUDA to OpenCL is relatively straightforward but laborious. The Swan tool facilitates this conversion.
    Solution method: Swan performs a translation of CUDA kernel source code into an OpenCL equivalent. It also generates the C source code for entry point functions, simplifying kernel invocation from the host program. A concise host-side API abstracts the CUDA and OpenCL APIs. A program adapted to use Swan has no dependency on the CUDA compiler for the host-side program. The converted program may be built for either CUDA or OpenCL, with the selection made at compile time.
    Restrictions: No support for CUDA C++ features
    Running time: Nominal

  19. Instrumentino: An Open-Source Software for Scientific Instruments.

    PubMed

    Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C

    2015-01-01

    Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.
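
    The central idea, that a single declarative description of an instrument's components is enough to generate a working control interface, can be illustrated generically. The sketch below builds a trivial Tkinter panel from a list of component descriptions; it mimics the concept only and does not use Instrumentino's actual classes or API, and the component names and ranges are invented.

```python
# Concept sketch: auto-generate a simple control panel from a declarative
# description of instrument components. This mimics the idea only; it does not
# use Instrumentino's actual classes or API.
import tkinter as tk

COMPONENTS = [  # invented example description of an instrument
    {"name": "Pump flow (mL/min)",   "kind": "slider", "min": 0,  "max": 10},
    {"name": "Valve open",           "kind": "switch"},
    {"name": "Heater setpoint (C)",  "kind": "slider", "min": 20, "max": 120},
]

def build_panel(components):
    root = tk.Tk()
    root.title("Auto-generated control panel (sketch)")
    for comp in components:
        frame = tk.Frame(root)
        frame.pack(fill="x", padx=8, pady=4)
        tk.Label(frame, text=comp["name"], width=22, anchor="w").pack(side="left")
        if comp["kind"] == "slider":
            tk.Scale(frame, from_=comp["min"], to=comp["max"],
                     orient="horizontal").pack(side="left", fill="x", expand=True)
        elif comp["kind"] == "switch":
            tk.Checkbutton(frame).pack(side="left")
    return root

if __name__ == "__main__":
    build_panel(COMPONENTS).mainloop()
```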

  20. pyBadlands: A framework to simulate sediment transport, landscape dynamics and basin stratigraphic evolution through space and time

    PubMed Central

    2018-01-01

    Understanding Earth surface responses in terms of sediment dynamics to climatic variability and tectonics forcing is hindered by limited ability of current models to simulate long-term evolution of sediment transfer and associated morphological changes. This paper presents pyBadlands, an open-source python-based framework which computes over geological time (1) sediment transport from landmasses to coasts, (2) reworking of marine sediments by longshore currents and (3) development of coral reef systems. pyBadlands is cross-platform, distributed under the GPLv3 license and available on GitHub (http://github.com/badlands-model). Here, we describe the underlying physical assumptions behind the simulated processes and the main options already available in the numerical framework. Along with the source code, a list of hands-on examples is provided that illustrates the model capabilities. In addition, pre and post-processing classes have been built and are accessible as a companion toolbox which comprises a series of workflows to efficiently build, quantify and explore simulation input and output files. While the framework has been primarily designed for research, its simplicity of use and portability makes it a great tool for teaching purposes. PMID:29649301
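
    To give a flavour of the class of computation such a framework performs, the sketch below advances a one-dimensional elevation profile through time using a simple stream-power incision term plus hillslope diffusion. The grid, coefficients, and numerics are invented and deliberately naive; this is not pyBadlands code, only an illustration of the kind of long-term surface-process calculation it automates.

```python
# Naive 1-D landscape evolution sketch: stream-power erosion plus hillslope diffusion,
# dz/dt = uplift - K * A^m * S^n + D * d2z/dx2. Coefficients and grid are invented;
# this is not pyBadlands code, just an illustration of the class of model.
import numpy as np

nx, dx, dt, steps = 200, 500.0, 1000.0, 500      # grid cells, metres, years, iterations
K, m, n, D, uplift = 1e-6, 0.5, 1.0, 0.05, 1e-4  # erodibility, exponents, diffusivity, m/yr

z = np.linspace(1000.0, 0.0, nx)                 # initial ramp from divide to coast
x = np.arange(nx) * dx
area = x + dx                                    # crude proxy: drainage area grows downstream

for _ in range(steps):
    slope = np.abs(np.gradient(z, dx))
    incision = K * area**m * slope**n            # stream-power erosion rate
    curvature = np.gradient(np.gradient(z, dx), dx)
    z += dt * (uplift - incision + D * curvature)
    z[-1] = 0.0                                  # fixed base level at the coast

print(f"mean elevation after {steps * dt:.0f} years: {z.mean():.1f} m")
```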

  1. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to make better use of computing resources by taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions, including OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory and I/O of virtual machines created and managed by different solutions, (2) OpenNebula has the fastest internal network while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing intensive applications, and small-scale model simulations without intensive data communication.

  2. Determining Protein Complex Structures Based on a Bayesian Model of in Vivo Förster Resonance Energy Transfer (FRET) Data*

    PubMed Central

    Bonomi, Massimiliano; Pellarin, Riccardo; Kim, Seung Joong; Russel, Daniel; Sundin, Bryan A.; Riffle, Michael; Jaschob, Daniel; Ramsden, Richard; Davis, Trisha N.; Muller, Eric G. D.; Sali, Andrej

    2014-01-01

    The use of in vivo Förster resonance energy transfer (FRET) data to determine the molecular architecture of a protein complex in living cells is challenging due to data sparseness, sample heterogeneity, signal contributions from multiple donors and acceptors, unequal fluorophore brightness, photobleaching, flexibility of the linker connecting the fluorophore to the tagged protein, and spectral cross-talk. We addressed these challenges by using a Bayesian approach that produces the posterior probability of a model, given the input data. The posterior probability is defined as a function of the dependence of our FRET metric FRETR on a structure (forward model), a model of noise in the data, as well as prior information about the structure, relative populations of distinct states in the sample, forward model parameters, and data noise. The forward model was validated against kinetic Monte Carlo simulations and in vivo experimental data collected on nine systems of known structure. In addition, our Bayesian approach was validated by a benchmark of 16 protein complexes of known structure. Given the structures of each subunit of the complexes, models were computed from synthetic FRETR data with a distance root-mean-squared deviation error of 14 to 17 Å. The approach is implemented in the open-source Integrative Modeling Platform, allowing us to determine macromolecular structures through a combination of in vivo FRETR data and data from other sources, such as electron microscopy and chemical cross-linking. PMID:25139910
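
    The posterior described in the abstract can be written schematically as a standard Bayesian factorization, with the forward model entering through the likelihood of the observed FRETR data. The notation below is a generic restatement consistent with that description, not the paper's exact formulation: X denotes the structure, w the relative populations of distinct states, theta the forward-model parameters, and sigma the data noise.

```latex
% Schematic Bayesian factorization (generic restatement, not the paper's exact notation):
% X = structure, w = state populations, \theta = forward-model parameters, \sigma = noise.
p(X, w, \theta, \sigma \mid D)
  \;\propto\;
  p\!\left(D \mid \mathrm{FRET}_R(X, w, \theta), \sigma\right)\,
  p(X)\, p(w)\, p(\theta)\, p(\sigma)
```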

  3. The Confluence of GIS, Cloud and Open Source, Enabling Big Raster Data Applications

    NASA Astrophysics Data System (ADS)

    Plesea, L.; Emmart, C. B.; Boller, R. A.; Becker, P.; Baynes, K.

    2016-12-01

    The rapid evolution of available cloud services is profoundly changing the way applications are being developed and used. Massive object stores, service scalability, and continuous integration are some of the most important cloud technology advances that directly influence science applications and GIS. At the same time, more and more scientists are using GIS platforms in their day to day research. Yet with new opportunities there are always some challenges. Given the large amount of data commonly required in science applications, usually large raster datasets, connectivity is one of the biggest problems. Connectivity has two aspects, one being the limited bandwidth and latency of the communication link due to the geographical location of the resources, the other one being the interoperability and intrinsic efficiency of the interface protocol used to connect. NASA and Esri are actively helping each other and collaborating on a few open source projects, aiming to provide some of the core technology components to directly address the GIS enabled data connectivity problems. Last year Esri contributed LERC, a very fast and efficient compression algorithm, to the GDAL/MRF format, which itself is a NASA/Esri collaboration project. The MRF raster format has some cloud aware features that make it possible to build high performance web services on cloud platforms, as some of the Esri projects demonstrate. Currently, another NASA open source project, the high performance OnEarth WMTS server, is being refactored and enhanced to better integrate with MRF, GDAL and Esri software. Taken together, GDAL, MRF and OnEarth form the core of an open source CloudGIS toolkit that is already showing results. Since it is well integrated with GDAL, which is the most common interoperability component of GIS applications, this approach should improve the connectivity and performance of many science and GIS applications in the cloud.
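
    As a concrete (if simplified) illustration of these pieces working together, the snippet below converts a GeoTIFF into the MRF format with LERC compression using GDAL's Python bindings. It assumes a GDAL build that includes the MRF driver and LERC support; the file names and creation options shown are illustrative choices, not a prescribed configuration.

```python
# Convert a raster to the cloud-friendly MRF format with LERC compression,
# assuming a GDAL build that includes the MRF driver and LERC support.
# File names and creation options are illustrative.
from osgeo import gdal

gdal.UseExceptions()

src = "input_scene.tif"
dst = "output_scene.mrf"

gdal.Translate(
    dst,
    src,
    format="MRF",
    creationOptions=[
        "COMPRESS=LERC",      # fast, error-bounded LERC compression
        "BLOCKSIZE=512",      # tile size suited to web-service access patterns
    ],
)
print(f"wrote {dst}")
```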

  4. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of a sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards such as extensible service-oriented architecture, open standard protocols, event-driven programming model, enterprise service bus, and adaptive user interfaces to provide a strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment in Oak Ridge National Laboratory (ORNL).

  5. Research on private cloud computing based on analysis on typical opensource platform: a case study with Eucalyptus and Wavemaker

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoyuan; Yuan, Jian; Chen, Shi

    2013-03-01

    Cloud computing is one of the most popular topics in the IT industry and has recently been adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud and private cloud. Among these, the private cloud can be implemented within a private network and delivers some of the benefits of cloud computing without its pitfalls. This paper compares typical open source platforms through which a private cloud can be implemented. Based on this comparison, we choose Eucalyptus and Wavemaker for a case study on the private cloud. We also estimate the performance of cloud platform services and develop prototype software offered as cloud services.

  6. Wireless energy transfer platform for medical sensors and implantable devices.

    PubMed

    Zhang, Fei; Hackworth, Steven A; Liu, Xiaoyu; Chen, Haiyan; Sclabassi, Robert J; Sun, Mingui

    2009-01-01

    Witricity is a newly developed technique for wireless energy transfer. This paper presents a frequency adjustable witricity system to power medical sensors and implantable devices. New witricity resonators are designed for both energy transmission and reception. A prototype platform is described, including an RF power source, two resonators with new structures, and inductively coupled input and output stages. In vitro experiments, both in open air and using a human head phantom consisting of simulated tissues, are employed to verify the feasibility of this platform. An animal model is utilized to evaluate in vivo energy transfer within the body of a laboratory pig. Our experiments indicate that witricity is an effective new tool for providing a variety of medical sensors and devices with power.

  7. ESTEST: An Open Science Platform for Electronic Structure Research

    ERIC Educational Resources Information Center

    Yuan, Gary

    2012-01-01

    Open science platforms in support of data generation, analysis, and dissemination are becoming indispensable tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…

  8. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.
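
    Grid-based solvation analysis of the kind implemented here starts from something conceptually simple: binning water positions from many trajectory frames onto a 3-D grid around the solute and normalizing to bulk density. The sketch below shows that first step with synthetic coordinates; it is a conceptual illustration only, not SSTMap's API and not its IST thermodynamic calculations.

```python
# Conceptual first step of grid-based solvation mapping: bin water-oxygen positions
# from many frames onto a 3-D grid and normalize by bulk density. Synthetic data;
# not SSTMap's API and not the IST thermodynamic analysis itself.
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_waters = 100, 500
box = 30.0                       # cubic box edge, angstroms
voxel = 1.0                      # grid spacing, angstroms
bulk_density = 0.0334            # waters per cubic angstrom (approximate bulk value)

edges = np.arange(0.0, box + voxel, voxel)
counts = np.zeros((len(edges) - 1,) * 3)

for _ in range(n_frames):
    # Stand-in for water oxygen coordinates read from an MD trajectory frame.
    oxygens = rng.uniform(0.0, box, size=(n_waters, 3))
    hist, _ = np.histogramdd(oxygens, bins=(edges, edges, edges))
    counts += hist

density = counts / (n_frames * voxel**3)        # waters per cubic angstrom per voxel
g_O = density / bulk_density                    # occupancy normalized to bulk
print(f"mean normalized occupancy over the grid: {g_O.mean():.2f}")
```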

  9. An Interprofessional Web-Based Resource for Health Professions Preceptors

    PubMed Central

    McLeod, Elizabeth; Kwong, Mona; Tidball, Glynnis; Collins, John; Neufeld, Lois; Drynan, Donna

    2012-01-01

    Objective. To develop a Web-based preceptor education resource for healthcare professionals and evaluate its usefulness. Methods. Using an open source platform, 8 online modules called “E-tips for Practice Education” (E-tips) were developed that focused on topics identified relevant across healthcare disciplines. A cross-sectional survey design was used to evaluate the online resource. Ninety preceptors from 10 health disciplines affiliated with the University of British Columbia evaluated the E-tips. Results. The modules were well received by preceptors, with all participants indicating that they would recommend these modules to their colleagues, over 80% indicating the modules were very to extremely applicable, and over 60% indicating that E-tips had increased their confidence in their ability to teach. Conclusion. Participants reported E-tips to be highly applicable to their teaching role as preceptors. Given their multidisciplinary focus, these modules address a shared language and ideas about clinical teaching among those working in multi-disciplinary settings. PMID:23193332

  10. Sonification Prototype for Space Physics

    NASA Astrophysics Data System (ADS)

    Candey, R. M.; Schertenleib, A. M.; Diaz Merced, W. L.

    2005-12-01

    As an alternative and adjunct to visual displays, auditory exploration of data via sonification (data controlled sound) and audification (audible playback of data samples) is promising for complex or rapidly/temporally changing visualizations, for data exploration of large datasets (particularly multi-dimensional datasets), and for exploring datasets in frequency rather than spatial dimensions (see also International Conferences on Auditory Display). Besides improving data exploration and analysis for most researchers, the use of sound is especially valuable as an assistive technology for visually-impaired people and can make science and math more exciting for high school and college students. Only recently have the hardware and software come together to make a cross-platform open-source sonification tool feasible. We have developed a prototype sonification data analysis tool using the JavaSound API and NASA GSFC's ViSBARD software. Wanda Diaz Merced, a blind astrophysicist from Puerto Rico, is instrumental in advising on and testing the tool.
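
    A minimal illustration of sonification, mapping a data series to pitch and writing the result to a WAV file with the Python standard library, is sketched below. The synthetic data series, sample rate, and frequency scaling are arbitrary choices for the example; this is unrelated to the JavaSound/ViSBARD prototype itself.

```python
# Minimal sonification sketch: map a 1-D data series to pitch and write a WAV file.
# Synthetic data and arbitrary scaling; not the JavaSound/ViSBARD prototype above.
import math
import struct
import wave

SAMPLE_RATE = 22050
DURATION_S = 3.0

# Synthetic data series: a slow oscillation with a burst in the middle.
n = int(SAMPLE_RATE * DURATION_S)
series = [math.sin(2 * math.pi * 3 * t / SAMPLE_RATE) for t in range(n)]
for t in range(n // 2, n // 2 + SAMPLE_RATE // 4):
    series[t] += 0.8 * math.sin(2 * math.pi * 40 * t / SAMPLE_RATE)

# Map data values to audible frequencies (200-2000 Hz) and synthesize a tone.
lo, hi = min(series), max(series)
phase, samples = 0.0, []
for value in series:
    frequency = 200.0 + (value - lo) / (hi - lo) * 1800.0
    phase += 2 * math.pi * frequency / SAMPLE_RATE
    samples.append(int(32767 * 0.5 * math.sin(phase)))

with wave.open("sonified.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)                     # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(struct.pack(f"<{len(samples)}h", *samples))
print("wrote sonified.wav")
```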

  11. MolIDE: a homology modeling framework you can click with.

    PubMed

    Canutescu, Adrian A; Dunbrack, Roland L

    2005-06-15

    Molecular Integrated Development Environment (MolIDE) is an integrated application designed to provide homology modeling tools and protocols under a uniform, user-friendly graphical interface. Its main purpose is to combine the most frequent modeling steps in a semi-automatic, interactive way, guiding the user from the target protein sequence to the final three-dimensional protein structure. The typical basic homology modeling process is composed of building sequence profiles of the target sequence family, secondary structure prediction, sequence alignment with PDB structures, assisted alignment editing, side-chain prediction and loop building. All of these steps are available through a graphical user interface. MolIDE's user-friendly and streamlined interactive modeling protocol allows the user to focus on the important modeling questions, hiding from the user the raw data generation and conversion steps. MolIDE was designed from the ground up as an open-source, cross-platform, extensible framework. This allows developers to integrate additional third-party programs to MolIDE. http://dunbrack.fccc.edu/molide/molide.php rl_dunbrack@fccc.edu.

  12. Inselect: Automating the Digitization of Natural History Collections

    PubMed Central

    Hudson, Lawrence N.; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W.; van der Walt, Stéfan; Smith, Vincent S.

    2015-01-01

    The world’s natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect—a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization. PMID:26599208

  13. Inselect: Automating the Digitization of Natural History Collections.

    PubMed

    Hudson, Lawrence N; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W; van der Walt, Stéfan; Smith, Vincent S

    2015-01-01

    The world's natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect-a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization.

  14. ARX - A Comprehensive Tool for Anonymizing Biomedical Data

    PubMed Central

    Prasser, Fabian; Kohlmayer, Florian; Lautenschläger, Ronald; Kuhn, Klaus A.

    2014-01-01

    Collaboration and data sharing have become core elements of biomedical research. Especially when sensitive data from distributed sources are linked, privacy threats have to be considered. Statistical disclosure control allows the protection of sensitive data by introducing fuzziness. Reduction of data quality, however, needs to be balanced against gains in protection. Therefore, tools are needed which provide a good overview of the anonymization process to those responsible for data sharing. These tools require graphical interfaces and the use of intuitive and replicable methods. In addition, extensive testing, documentation and openness to reviews by the community are important. Existing publicly available software is limited in functionality, and often active support is lacking. We present ARX, an anonymization tool that i) implements a wide variety of privacy methods in a highly efficient manner, ii) provides an intuitive cross-platform graphical interface, iii) offers a programming interface for integration into other software systems, and iv) is well documented and actively supported. PMID:25954407
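
    The trade-off this abstract describes, adding just enough generalization so that no combination of quasi-identifiers singles out too few people, is commonly formalized as k-anonymity. The sketch below checks k-anonymity for a toy table before and after a simple generalization step; it is a conceptual illustration in Python, not ARX's Java API or its anonymization algorithms, and the records are invented.

```python
# Conceptual k-anonymity check on a toy table before and after one generalization step.
# Illustration only; not ARX's Java API or its anonymization algorithms.
from collections import Counter

records = [  # (age, zip code, diagnosis) -- invented data
    (34, "53715", "flu"), (36, "53715", "asthma"),
    (33, "53703", "flu"), (39, "53703", "diabetes"),
    (52, "53706", "flu"), (57, "53706", "asthma"),
]

def generalize(record):
    age, zipcode, diagnosis = record
    age_band = f"{(age // 10) * 10}-{(age // 10) * 10 + 9}"   # e.g. 30-39
    return (age_band, zipcode[:3] + "**", diagnosis)          # truncate zip code

def k_anonymity(rows, quasi_identifiers=(0, 1)):
    """Smallest equivalence-class size over the quasi-identifier columns."""
    keys = [tuple(row[i] for i in quasi_identifiers) for row in rows]
    return min(Counter(keys).values())

print("k before generalization:", k_anonymity(records))
print("k after generalization: ", k_anonymity([generalize(r) for r in records]))
```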

  15. The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality

    NASA Technical Reports Server (NTRS)

    Conway, Darrel J.; Hughes, Steven P.

    2010-01-01

    The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).

  16. Open source data logger for low-cost environmental monitoring

    PubMed Central

    2014-01-01

    Abstract The increasing transformation of biodiversity into a data-intensive science has seen numerous independent systems linked and aggregated into the current landscape of biodiversity informatics. This paper outlines how we can move forward with this programme, incorporating real time environmental monitoring into our methodology using low-power and low-cost computing platforms. PMID:24855446

  17. Investigating Advances in the Acquisition of Secure Systems Based on Open Architectures

    DTIC Science & Technology

    2012-08-30

    markets (Guertin & Womble, 2012), efficient testing of component-based OA ... workalike of the closed-source Second Life VW platform. Second Life (2012) is the current market leader in rapid virtual world development and ... [fragments of a license-comparison table: Strong Copyleft (GPL, AGPL): many rights, with obligations on "nearby" works; Proprietary (CTL, EULAs, TOSs): few of the typical rights]

  18. How Educational Ideas Catch On: The Promotion of Popular Education Innovations and the Role of Evidence

    ERIC Educational Resources Information Center

    Carrier, Nathalie

    2017-01-01

    Background: Within the wider education industry, a large quantity of ideas, practices and products are routinely promoted as useful innovations. These innovations span many forms such as software applications, open source courseware, online learning platforms and web 2.0 technologies. Coupled with this promotion, there is increasing interest and…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    This paper describes one such reference process that can be deployed to provide continuous, automated condition-based maintenance management for buildings that have a building information model (BIM), a building automation system (BAS) and a computerized maintenance management system (CMMS). The process can be deployed using VOLTTRON™, an open-source transactional network platform designed for distributed sensing and controls that supports both energy efficiency and grid services.

  20. A Computer Simulation Using Spreadsheets for Learning Concept of Steady-State Equilibrium

    ERIC Educational Resources Information Center

    Sharda, Vandana; Sastri, O. S. K. S.; Bhardwaj, Jyoti; Jha, Arbind K.

    2016-01-01

    In this paper, we present a simple spreadsheet based simulation activity that can be performed by students at the undergraduate level. This simulation is implemented in free open source software (FOSS) LibreOffice Calc, which is available for both Windows and Linux platform. This activity aims at building the probability distribution for the…

  1. Joint Intelligence Operations Center (JIOC) Baseline Business Process Model & Capabilities Evaluation Methodology

    DTIC Science & Technology

    2012-03-01

    …Targeting Review Board; OPLAN, Operations Plan; OPORD, Operations Order; OPSIT, Operational Situation; OSINT, Open Source Intelligence; OV… [Remainder is fragmentary process-model residue covering analysis of FLTREPs and MISREPs, asset assignment and shortfall feedback, and multi-intelligence collection management drawing on political, embassy, law-enforcement, HUMINT, OSINT, platform and agency information.]

  2. Design and construction of a first-generation high-throughput integrated robotic molecular biology platform for bioenergy applications

    USDA-ARS?s Scientific Manuscript database

    The molecular biological techniques for plasmid-based assembly and cloning of gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. These techniques involve the production of full-length cDNA libraries as a source of plasmid-based clones to expres...

  3. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping

    PubMed Central

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both the R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits, such as length, width, two-dimensional area, volume, projected skin surface area and RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups, making GiNA a very reliable platform for high-throughput phenotyping. Five-fold cross-validation correlations between manual and GiNA measurements of length and width in cranberry fruits were 0.97 and 0.92, respectively. In addition, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared with total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables. PMID:27529547
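
    GiNA is distributed as R and MATLAB code; the hedged Python sketch below merely illustrates the same style of image-based trait extraction (segmentation followed by per-object size and colour measurements) using scikit-image. The thresholds and file names are assumptions, not part of GiNA.

```python
# Conceptual Python analogue of image-based horticultural trait extraction:
# segment fruits on a plain background and measure size and colour per object.
import numpy as np
from skimage import color, filters, io, measure


def measure_fruits(image_path: str):
    rgb = io.imread(image_path)                  # assumes an RGB photo on a light background
    gray = color.rgb2gray(rgb)
    mask = gray < filters.threshold_otsu(gray)   # dark objects on a light background
    labels = measure.label(mask)

    traits = []
    for region in measure.regionprops(labels):
        if region.area < 500:                    # drop small specks of noise (assumed cutoff)
            continue
        rr, cc = np.nonzero(labels == region.label)
        traits.append({
            "length_px": region.major_axis_length,
            "width_px": region.minor_axis_length,
            "area_px": region.area,
            "mean_rgb": rgb[rr, cc].mean(axis=0),
        })
    return traits


if __name__ == "__main__":
    for t in measure_fruits("cranberries.jpg"):  # placeholder file name
        print(t)
```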

  4. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping.

    PubMed

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both the R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits, such as length, width, two-dimensional area, volume, projected skin surface area and RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups, making GiNA a very reliable platform for high-throughput phenotyping. Five-fold cross-validation correlations between manual and GiNA measurements of length and width in cranberry fruits were 0.97 and 0.92, respectively. In addition, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared with total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables.

  5. piscope - A Python based software package for the analysis of volcanic SO2 emissions using UV SO2 cameras

    NASA Astrophysics Data System (ADS)

    Gliss, Jonas; Stebel, Kerstin; Kylling, Arve; Solvejg Dinger, Anna; Sihler, Holger; Sudbø, Aasmund

    2017-04-01

    UV SO2 cameras have become a common method for monitoring SO2 emission rates from volcanoes. Scattered solar UV radiation is measured in two wavelength windows, typically around 310 nm and 330 nm (distinct and weak SO2 absorption, respectively), using interference filters. The data analysis comprises the retrieval of plume background intensities (to calculate plume optical densities), camera calibration (to convert optical densities into SO2 column densities), and the retrieval of gas velocities within the plume as well as plume distances. SO2 emission rates are then typically retrieved along a projected plume cross-section, for instance a straight line perpendicular to the plume propagation direction. Today, for most of the required analysis steps, several alternatives exist due to ongoing developments and improvements of the measurement technique. We present piscope, a cross-platform, open-source software toolbox for the analysis of UV SO2 camera data. The code is written in the Python programming language and emerged from the idea of a common analysis platform incorporating a selection of the most prevalent methods found in the literature. piscope includes several routines for plume background retrieval and for cell- and DOAS-based camera calibration, including two individual methods to identify the DOAS field of view (shape and position) within the camera images. Gas velocities can be retrieved either based on an optical flow analysis or using signal cross-correlation. A correction for signal dilution (due to atmospheric scattering) can be performed based on topographic features in the images. The latter requires distance retrievals to the topographic features used for the correction; these distances can be retrieved automatically on a pixel basis using intersections of individual pixel viewing directions with the local topography. The main features of piscope are presented based on a dataset recorded at Mt. Etna, Italy, in September 2015.
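
    The retrieval chain summarised above can be sketched generically as follows; this is an illustrative calculation with made-up numbers and a single calibration factor, not the piscope API.

```python
# Generic UV SO2-camera retrieval chain: apparent absorbance from the two filter
# channels -> calibrated column densities -> emission rate along a cross-section.
import numpy as np


def apparent_absorbance(on_band, off_band, bg_on, bg_off):
    """Plume optical density from the two filter channels (~310 nm / ~330 nm)."""
    tau_on = -np.log(on_band / bg_on)     # distinct SO2 absorption
    tau_off = -np.log(off_band / bg_off)  # weak SO2 absorption
    return tau_on - tau_off


def emission_rate(aa_line, calib_slope, velocity, pix_size_m):
    """Integrate calibrated column densities [kg m^-2] along a plume cross-section,
    multiplied by plume speed [m s^-1] and physical pixel width [m]."""
    scd = calib_slope * aa_line           # calibration factor, e.g. from cells or DOAS
    return float(np.sum(scd * velocity * pix_size_m))   # kg s^-1


# Hypothetical intensities for a single cross-section, purely to show the arithmetic:
bg = np.full(5, 1000.0)
on = np.array([980.0, 951.0, 923.0, 951.0, 980.0])
off = np.array([998.0, 995.0, 992.0, 995.0, 998.0])
aa = apparent_absorbance(on, off, bg, bg)
print(emission_rate(aa, calib_slope=4.0e-3, velocity=8.0, pix_size_m=1.5))
```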

  6. Cloud-based Jupyter Notebooks for Water Data Analysis

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA, enables researchers to easily prototype and execute data-intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analyses. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open-source platform for domain scientists to (1) conduct data-intensive and computationally intensive collaborative research, (2) utilize high-performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
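
    A toolchain of this kind can be as small as a few notebook cells; the sketch below pulls a public USGS streamflow series and aggregates it with pandas. The service endpoint, parameter code and JSON layout are assumptions that should be checked against the USGS water services documentation.

```python
# Minimal notebook-style toolchain: fetch a public streamflow series and
# summarise it with pandas (illustrative; not the authors' code).
import pandas as pd
import requests

URL = "https://waterservices.usgs.gov/nwis/iv/"   # assumed USGS Instantaneous Values service
params = {
    "format": "json",
    "sites": "01646500",        # example gauge (Potomac River near Washington, DC)
    "parameterCd": "00060",     # discharge, cubic feet per second
    "period": "P7D",            # last seven days
}

payload = requests.get(URL, params=params, timeout=30).json()
# JSON layout below is an assumption about the NWIS response structure.
values = payload["value"]["timeSeries"][0]["values"][0]["value"]

df = pd.DataFrame(values)
df["dateTime"] = pd.to_datetime(df["dateTime"], utc=True)
df["value"] = pd.to_numeric(df["value"])

# Daily mean discharge, ready for plotting or further analysis in the notebook.
print(df.set_index("dateTime")["value"].resample("D").mean())
```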

  7. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Technical Reports Server (NTRS)

    Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce

    2011-01-01

    Cloud Computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source Cloud platform intended to: a) make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs; b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.

  8. Aerosol and trace gas flux measurements from a mobile car platform on the highway

    NASA Astrophysics Data System (ADS)

    Gordon, M.; Miller, S. J.; Staebler, R. M.; Taylor, P.

    2016-12-01

    Mobile flux measurements of aerosols and trace gases at the surface can provide valuable information about the vertical transport of these compounds from near-surface sources. These measurements can be complementary to stationary tower measurements or elevated mobile measurements from aircraft and unmanned aerial systems (UAS). In July 2016, a mobile platform (Toyota Highlander), outfitted with a sonic anemometer (ATI), an open-path CO2/H2O analyzer (Licor), and an ultrafine particle sizer (DMT), was driven on highways as part of a chasing study to investigate vehicle-induced turbulence and mixing. The open-path analyzer and particle-sizer inlet were co-located with the anemometer in order to investigate the feasibility of making flux measurements of heat, momentum, water vapour, CO2, and sub-micron aerosols on the highway. These highway flux measurements are compared with stationary platform measurements made upwind and downwind of the highway. Statistical and spectral analyses are used to demonstrate the validity of the mobile measurements. Uncertainties due to flow distortion around the vehicle, under-sampling, and heterogeneity of the vertical temperature and concentrations are investigated and discussed.
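
    The flux calculation behind such measurements is typically an eddy-covariance estimate, i.e. the covariance of vertical wind and the scalar of interest over an averaging block. The sketch below illustrates that calculation on synthetic data; it is not the authors' processing code.

```python
# Standard eddy-covariance flux: mean of the product of the fluctuating parts
# of vertical wind and a scalar (illustrative sketch on synthetic data).
import numpy as np


def eddy_covariance_flux(w, c):
    """Flux = mean(w'c'), with primes denoting deviations from the block mean.

    w : vertical wind speed [m s^-1] from the sonic anemometer
    c : scalar density, e.g. CO2 [mg m^-3], from the gas analyzer
    """
    w = np.asarray(w, dtype=float)
    c = np.asarray(c, dtype=float)
    return float(np.mean((w - w.mean()) * (c - c.mean())))


# Synthetic 10 Hz data over a 30-minute averaging block, purely for illustration.
rng = np.random.default_rng(0)
n = 10 * 60 * 30
w = rng.normal(0.0, 0.3, n)
c = 700.0 + 5.0 * w + rng.normal(0.0, 2.0, n)   # scalar correlated with updrafts
print(eddy_covariance_flux(w, c))               # about 5 * var(w), i.e. ~0.45 mg m^-2 s^-1
```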

  9. Advances and current state of the security and privacy in electronic health records: survey from a social perspective.

    PubMed

    Tejero, Antonio; de la Torre, Isabel

    2012-10-01

    E-health systems have gained momentum in recent years, as many medical agencies have begun to incorporate digital solutions into their platforms. Electronic Health Records (EHRs) are one of the most important improvements, being for the most part a patient-oriented tool. To achieve a completely operational EHR platform, security and privacy problems have to be resolved, given the sensitivity of the data included within these records. Despite the many different methods proposed to address security and privacy, in most cases they remain an open issue. This paper studies existing and proposed solutions in different scenarios, in order to offer an overview of the current state of EHR systems. Bibliographic material has been obtained mainly from MEDLINE and SCOPUS sources, and over 30 publications have been analyzed. Many EHR platforms are being developed, but most of them present weaknesses when they are opened to the public. These architectures gain significance when they cover all the requisites related to security and privacy.

  10. Gas turbine vane platform element

    DOEpatents

    Campbell, Christian X [Oviedo, FL; Schiavo, Anthony L [Oviedo, FL; Morrison, Jay A [Oviedo, FL

    2012-08-28

    A gas turbine CMC shroud plate (48A) with a vane-receiving opening (79) that matches a cross-section profile of a turbine vane airfoil (22). The shroud plate (48A) has first and second curved circumferential sides (73A, 74A) that generally follow the curves of respective first and second curved sides (81, 82) of the vane-receiving opening. Walls (75A, 76A, 77A, 78A, 80, 88) extend perpendicularly from the shroud plate forming a cross-bracing structure for the shroud plate. A vane (22) may be attached to the shroud plate by pins (83) or by hoop-tension rings (106) that clamp tabs (103) of the shroud plate against bosses (105) of the vane. A circular array (20) of shroud plates (48A) may be assembled to form a vane shroud ring in which adjacent shroud plates are separated by compressible ceramic seals (93).

  11. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
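
    The computational core being accelerated is, in essence, a large dense matrix multiplication of daylight coefficients with hourly sky vectors. The NumPy sketch below shows the shape of that calculation for a 146-patch sky; the values are random placeholders rather than Radiance output, and the GPU/OpenCL offload itself is not reproduced here.

```python
# Annual daylighting as one large matrix product: illuminance = DC @ sky,
# where DC holds daylight coefficients (sensor x sky patch) and sky holds the
# sky vector for every hour of the year. Placeholder values only.
import numpy as np

n_sensors, n_patches, n_hours = 1000, 146, 8760   # 146-patch sky, as in the study

rng = np.random.default_rng(1)
dc = rng.random((n_sensors, n_patches))    # daylight-coefficient matrix
sky = rng.random((n_patches, n_hours))     # hourly sky vectors stacked as columns

illuminance = dc @ sky                     # (n_sensors x n_hours); this GEMM is the
print(illuminance.shape)                   # operation offloaded to the GPU via OpenCL
```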

  12. Lattice QCD simulations using the OpenACC platform

    NASA Astrophysics Data System (ADS)

    Majumdar, Pushan

    2016-10-01

    In this article we explore the OpenACC platform for programming Graphics Processing Units (GPUs). The OpenACC platform offers a directive-based programming model for GPUs which avoids the detailed data-flow control and memory management necessary in a CUDA programming environment. In the OpenACC model, programs can be written in high-level languages with OpenMP-like directives. We present some examples of QCD simulation codes using OpenACC and discuss their performance on the Fermi and Kepler GPUs.

  13. Query Health: standards-based, cross-platform population health surveillance

    PubMed Central

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Objective Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Materials and methods Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. Results We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. Discussion This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Conclusions Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. PMID:24699371

  14. Query Health: standards-based, cross-platform population health surveillance.

    PubMed

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites.

  15. Easy research data handling with an OpenEarth DataLab for geo-monitoring research

    NASA Astrophysics Data System (ADS)

    Vanderfeesten, Maurice; van der Kuil, Annemiek; Prinčič, Alenka; den Heijer, Kees; Rombouts, Jeroen

    2015-04-01

    OpenEarth DataLab is an open-source collaboration and processing platform that enables streamlined research data management, from raw data ingest and transformation to interoperable distribution. It enables geoscientists to easily synchronise, share, compute and visualise the dynamic and most up-to-date research data, scripts and models in multi-stakeholder geo-monitoring programmes. The DataLab is developed by the Research Data Services team of TU Delft Library and 3TU.Datacentrum together with coastal engineers of Delft University of Technology and Deltares. Based on the OpenEarth software stack, an environment has been developed to orchestrate numerous geo-related open-source software components that empower researchers and increase overall research quality by managing research data; enabling automatic and interoperable data workflows between all the components with track & trace; hit & run data transformation processing in cloud infrastructure using MATLAB and Python; synchronisation of data and scripts (SVN); and much more. Transformed interoperable data products (KML, NetCDF, PostGIS) can be used by ready-made OpenEarth tools for further analysis and visualisation, and can be distributed via interoperable channels such as THREDDS (OPeNDAP) and GeoServer. An example of a successful application of the OpenEarth DataLab is the Sand Motor, an innovative method for coastal protection in the Netherlands. The Sand Motor is a huge volume of sand that has been applied along the coast to be spread naturally by wind, waves and currents. Different research disciplines are involved, concerned with weather, waves and currents, sand distribution, water table and water quality, flora and fauna, and recreation and management. Researchers share and transform their data in the OpenEarth DataLab, which makes it possible to combine their data and to see the influence of different aspects of the coastal protection in their models. During the project the data are available only to the researchers involved. After the project, a large part of the data and scripts will be published with DOIs in the data archive of 3TU.Datacentrum for reuse in new research. For the 83 project members of the Sand Motor, the OpenEarth DataLab is available at www.zandmotordata.nl. The OpenEarth DataLab not only saves time and increases quality, but has the potential to open new frontiers for cross-domain analysis and visualisation, revealing new scientific insights.
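
    For consumers of the interoperable products mentioned above, access can be as simple as lazily opening a THREDDS/OPeNDAP endpoint with xarray; the URL and variable name in the sketch below are hypothetical placeholders, not a real DataLab endpoint.

```python
# Hedged sketch: open a NetCDF data product served over OPeNDAP and take a
# subset without downloading the full dataset.
import xarray as xr

OPENDAP_URL = "https://example.org/thredds/dodsC/sandmotor/bathymetry.nc"  # hypothetical

ds = xr.open_dataset(OPENDAP_URL)          # lazy access; data is fetched on demand
subset = ds["depth"].sel(time="2015-01")   # placeholder variable name and time slice
print(subset.mean().values)
```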

  16. PRESTO-Tango as an open-source resource for interrogation of the druggable human GPCRome.

    PubMed

    Kroeze, Wesley K; Sassano, Maria F; Huang, Xi-Ping; Lansu, Katherine; McCorvy, John D; Giguère, Patrick M; Sciaky, Noah; Roth, Bryan L

    2015-05-01

    G protein-coupled receptors (GPCRs) are essential mediators of cellular signaling and are important targets of drug action. Of the approximately 350 nonolfactory human GPCRs, more than 100 are still considered to be 'orphans' because their endogenous ligands remain unknown. Here, we describe a unique open-source resource that allows interrogation of the druggable human GPCRome via a G protein-independent β-arrestin-recruitment assay. We validate this unique platform at more than 120 nonorphan human GPCR targets, demonstrate its utility for discovering new ligands for orphan human GPCRs and describe a method (parallel receptorome expression and screening via transcriptional output, with transcriptional activation following arrestin translocation (PRESTO-Tango)) for the simultaneous and parallel interrogation of the entire human nonolfactory GPCRome.

  17. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    PubMed

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.

  18. Data Mining Meets HCI: Making Sense of Large Graphs

    DTIC Science & Technology

    2012-07-01

    …graph algorithms, won the Open Source Software World Challenge, Silver Award. We have released Pegasus as free, open-source software, downloaded by… METIS [77], spectral clustering [108], and the parameter-free "Cross-associations" (CA) [26]. Belief Propagation can also be used for clustering, as… number of tools have been developed to support "landscape" views of information. These include WebBook and WebForager [23], which use a book metaphor.

  19. Dynamic VM Provisioning for TORQUE in a Cloud Environment

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.

    2014-06-01

    Cloud computing in the form of Infrastructure-as-a-Service (IaaS) is attracting more interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high throughput computing clusters.
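
    A minimal sketch of the dynamic-provisioning idea is shown below: when TORQUE reports queued jobs, an additional worker VM is requested through the OpenStack SDK. The image, flavor and network IDs, the cloud name, and the omission of node registration with pbs_server/Maui are all simplifying assumptions; this is not the paper's implementation.

```python
# Hedged sketch: boot an extra TORQUE worker VM via OpenStack when jobs are queued.
import subprocess

import openstack  # openstacksdk


def queued_job_count() -> int:
    """Count jobs in the 'Q' (queued) state reported by TORQUE's qstat."""
    out = subprocess.run(["qstat"], capture_output=True, text=True, check=True).stdout
    return sum(1 for line in out.splitlines() if line.split()[-2:-1] == ["Q"])


def boot_worker(conn, index: int):
    """Request a new compute instance; IDs below are placeholders."""
    return conn.compute.create_server(
        name=f"torque-worker-{index}",
        image_id="IMAGE_UUID",        # placeholder
        flavor_id="FLAVOR_UUID",      # placeholder
        networks=[{"uuid": "NETWORK_UUID"}],
    )


if __name__ == "__main__":
    conn = openstack.connect(cloud="mycloud")   # reads credentials from clouds.yaml
    if queued_job_count() > 0:
        server = boot_worker(conn, 1)
        print("requested", server.name)
```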

  20. BioImageXD: an open, general-purpose and high-throughput image-processing platform.

    PubMed

    Kankaanpää, Pasi; Paavolainen, Lassi; Tiitta, Silja; Karjalainen, Mikko; Päivärinne, Joacim; Nieminen, Jonna; Marjomäki, Varpu; Heino, Jyrki; White, Daniel J

    2012-06-28

    BioImageXD puts open-source computer science tools for three-dimensional visualization and analysis into the hands of all researchers, through a user-friendly graphical interface tuned to the needs of biologists. BioImageXD has no restrictive licenses or undisclosed algorithms and enables publication of precise, reproducible and modifiable workflows. It allows simple construction of processing pipelines and should enable biologists to perform challenging analyses of complex processes. We demonstrate its performance in a study of integrin clustering in response to selected inhibitors.
