Sample records for open source content

  1. Cyberscience and the Knowledge-Based Economy. Open Access and Trade Publishing: From Contradiction to Compatibility with Non-Exclusive Copyright Licensing

    ERIC Educational Resources Information Center

    Armbruster, Chris

    2008-01-01

    Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge…

  2. A Clinician-Centered Evaluation of the Usability of AHLTA and Automated Clinical Practice Guidelines at TAMC

    DTIC Science & Technology

    2011-03-31

…evidence based medicine into clinical practice. It will decrease costs and enable multiple stakeholders to work in an open content/source environment to exchange clinical content, develop and test technology and explore processes in applied CDS. Design: Comparative study between the KMR infrastructure and capabilities developed as an open source, vendor agnostic solution for aCPG execution within AHLTA and the current DoD/MHS standard evaluating: H1: An open source, open standard KMR and Clinical Decision Support Engine can enable organizations to share domain…

  3. Creating Open Source Conversation

    ERIC Educational Resources Information Center

    Sheehan, Kate

    2009-01-01

    Darien Library, where the author serves as head of knowledge and learning services, launched a new website on September 1, 2008. The website is built with Drupal, an open source content management system (CMS). In this article, the author describes how she and her colleagues overhauled the library's website to provide an open source content…

  4. Technical Communications in OSS Content Management Systems: An Academic Institutional Case Study

    ERIC Educational Resources Information Center

    Cripps, Michael J.

    2011-01-01

    Single sourcing through a content management system (CMS) is altering technical communication practices in many organizations, including institutions of higher education. Open source software (OSS) solutions are currently among the most popular content management platforms adopted by colleges and universities in the United States and abroad. The…

  5. Development of an open-source web-based intervention for Brazilian smokers - Viva sem Tabaco.

    PubMed

    Gomide, H P; Bernardino, H S; Richter, K; Martins, L F; Ronzani, T M

    2016-08-02

Web-based interventions for smoking cessation available in Portuguese do not adhere to evidence-based treatment guidelines. Moreover, all existing web-based interventions are built on proprietary platforms that developing countries often cannot afford. We aimed to describe the development of "Viva sem Tabaco", an open-source web-based intervention. The development of the intervention included the selection of content from evidence-based guidelines for smoking cessation, the design of the first layout, the conduct of 2 focus groups to identify potential features, refinement of the layout based on the focus groups, and correction of content based on feedback provided by specialists on smoking cessation. Finally, we released the source code and the intervention on the Internet and translated it into Spanish and English. The intervention fills gaps in the information available in Portuguese and addresses the lack of open-source interventions for smoking cessation. The open-source licensing format and its translation system may help researchers from different countries deploy evidence-based interventions for smoking cessation.

  6. An emerging role: the nurse content curator.

    PubMed

    Brooks, Beth A

    2015-01-01

    A new phenomenon, the inverted or "flipped" classroom, assumes that students are no longer acquiring knowledge exclusively through textbooks or lectures. Instead, they are seeking out the vast amount of free information available to them online (the very essence of open source) to supplement learning gleaned in textbooks and lectures. With so much open-source content available to nursing faculty, it benefits the faculty to use readily available, technologically advanced content. The nurse content curator supports nursing faculty in its use of such content. Even more importantly, the highly paid, time-strapped faculty is not spending an inordinate amount of effort surfing for and evaluating content. The nurse content curator does that work, while the faculty uses its time more effectively to help students vet the truth, make meaning of the content, and learn to problem-solve. Brooks. © 2014 Wiley Periodicals, Inc.

  7. Open Source Software and the Intellectual Commons.

    ERIC Educational Resources Information Center

    Dorman, David

    2002-01-01

    Discusses the Open Source Software method of software development and its relationship to control over information content. Topics include digital library resources; reference services; preservation; the legal and economic status of information; technical standards; access to digital data; control of information use; and copyright and patent laws.…

  8. Open Source, Crowd Source: Harnessing the Power of the People behind Our Libraries

    ERIC Educational Resources Information Center

    Trainor, Cindi

    2009-01-01

    Purpose: The purpose of this paper is to provide an insight into the use of Web 2.0 and Library 2.0 technologies so that librarians can combine open source software with user-generated content to create a richer discovery experience for their users. Design/methodology/approach: Following a description of the current state of integrated library…

  9. What Can OpenEI Do For You?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-12-10

Open Energy Information (OpenEI) is an open source web platform—similar to the one used by Wikipedia—developed by the US Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to make the large amounts of energy-related data and information more easily searched, accessed, and used both by people and automated machine processes. Built utilizing the standards and practices of the Linked Open Data community, the OpenEI platform is much more robust and powerful than typical web sites and databases. As an open platform, all users can search, edit, add, and access data in OpenEI for free. The user community contributes the content and ensures its accuracy and relevance; as the community expands, so does the content's comprehensiveness and quality. The data are structured and tagged with descriptors to enable cross-linking among related data sets, advanced search functionality, and consistent, usable formatting. Data input protocols and quality standards help ensure the content is structured and described properly and derived from a credible source. Although DOE/NREL is developing OpenEI and seeding it with initial data, it is designed to become a true community model with millions of users, a large core of active contributors, and numerous sponsors.
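The tagging and cross-linking described above can be illustrated with a minimal in-memory sketch: records carry descriptor tags, an index maps each tag to record ids, and "related" records are those sharing at least one descriptor. All names and record ids here are hypothetical, not OpenEI's actual schema.

```python
# Minimal sketch of descriptor tagging and cross-linking between records.
# Record ids and tags are made up for illustration; this is not OpenEI's data model.

def build_index(records):
    """Map each descriptor tag to the set of record ids carrying it."""
    index = {}
    for rec in records:
        for tag in rec["tags"]:
            index.setdefault(tag, set()).add(rec["id"])
    return index

def related(records, index, rec_id):
    """Records sharing at least one descriptor with rec_id (cross-links)."""
    rec = next(r for r in records if r["id"] == rec_id)
    linked = set()
    for tag in rec["tags"]:
        linked |= index[tag]
    linked.discard(rec_id)          # a record is not related to itself
    return sorted(linked)

records = [
    {"id": "wind-capacity-2010", "tags": ["wind", "capacity", "us"]},
    {"id": "solar-capacity-2010", "tags": ["solar", "capacity", "us"]},
    {"id": "wind-speed-map", "tags": ["wind", "resource"]},
]
index = build_index(records)
print(related(records, index, "wind-capacity-2010"))
# → ['solar-capacity-2010', 'wind-speed-map']
```

The same shared-descriptor lookup also powers faceted search: filtering by a tag is just a read of `index[tag]`.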

  10. What Can OpenEI Do For You?

    ScienceCinema

    None

    2018-02-06

    Open Energy Information (OpenEI) is an open source web platform—similar to the one used by Wikipedia—developed by the US Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to make the large amounts of energy-related data and information more easily searched, accessed, and used both by people and automated machine processes. Built utilizing the standards and practices of the Linked Open Data community, the OpenEI platform is much more robust and powerful than typical web sites and databases. As an open platform, all users can search, edit, add, and access data in OpenEI for free. The user community contributes the content and ensures its accuracy and relevance; as the community expands, so does the content's comprehensiveness and quality. The data are structured and tagged with descriptors to enable cross-linking among related data sets, advanced search functionality, and consistent, usable formatting. Data input protocols and quality standards help ensure the content is structured and described properly and derived from a credible source. Although DOE/NREL is developing OpenEI and seeding it with initial data, it is designed to become a true community model with millions of users, a large core of active contributors, and numerous sponsors.

  11. Online Textbooks Deliver Timely, Real-World Content

    ERIC Educational Resources Information Center

    Seidel, Kim

    2009-01-01

    Faced with the challenge of keeping up with the rapidly changing field of information systems, author and teacher John Gallaugher opted to write an open source textbook with a new online company, Flat World Knowledge (FWK). Gallaugher's open source textbook, "Information Systems: A Manager's Guide to Harnessing Technology", has an expected…

  12. Sustaining Composition: Studying Content-Based, Ecological, and Economical Sustainability of Open-Source Textbooks through "Writing Spaces: Readings on Writing"

    ERIC Educational Resources Information Center

    Munson, Margaret

    2013-01-01

    Writing programs in institutions of higher education work to prepare students for real-world writing within any field of study. The composition of "Writing Spaces: Readings on Writing" offers an open-source text for students, teachers, and policy-makers at all levels. Exposure to an open space for learning encourages access to information,…

  13. Content Is King: An Analysis of How the Twitter Discourse Surrounding Open Education Unfolded from 2009 to 2016

    ERIC Educational Resources Information Center

    Paskevicius, Michael; Veletsianos, George; Kimmons, Royce

    2018-01-01

    Inspired by open educational resources, open pedagogy, and open source software, the openness movement in education has different meanings for different people. In this study, we use Twitter data to examine the discourses surrounding openness as well as the people who participate in discourse around openness. By targeting hashtags related to open…

  14. Digital Preservation in Open-Source Digital Library Software

    ERIC Educational Resources Information Center

    Madalli, Devika P.; Barve, Sunita; Amin, Saiful

    2012-01-01

    Digital archives and digital library projects are being initiated all over the world for materials of different formats and domains. To organize, store, and retrieve digital content, many libraries as well as archiving centers are using either proprietary or open-source software. While it is accepted that print media can survive for centuries with…

  15. Sustainable Multilingual Communication: Managing Multilingual Content Using Free and Open Source Content Management Systems

    ERIC Educational Resources Information Center

    Kelsey, Todd

    2011-01-01

    It is often too complicated or expensive for most educators, non-profits and individuals to create and maintain a multilingual Web site, because of the technological hurdles, and the logistics of working with content in different languages. But multilingual content management systems, combined with streamlined processes and inexpensive…

  16. Generation of open biomedical datasets through ontology-driven transformation and integration processes.

    PubMed

    Carmen Legaz-García, María Del; Miñarro-Giménez, José Antonio; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2016-06-03

Biomedical research usually requires combining large volumes of data from multiple heterogeneous sources, which makes the integrated exploitation of such data difficult. The Semantic Web paradigm offers a natural technological space for data integration and exploitation by generating content readable by machines. Linked Open Data is a Semantic Web initiative that promotes the publication and sharing of data in machine-readable semantic formats. We present an approach for the transformation and integration of heterogeneous biomedical data with the objective of generating open biomedical datasets in Semantic Web formats. The transformation of the data is based on the mappings between the entities of the data schema and the ontological infrastructure that provides meaning to the content. Our approach permits different types of mappings and includes the possibility of defining complex transformation patterns. Once the mappings are defined, they can be automatically applied to datasets to generate logically consistent content, and the mappings can be reused in further transformation processes. The results of our research are (1) a common transformation and integration process for heterogeneous biomedical data; (2) the application of Linked Open Data principles to generate interoperable, open, biomedical datasets; (3) a software tool, called SWIT, that implements the approach. In this paper we also describe how we have applied SWIT in different biomedical scenarios and some lessons learned. We have presented an approach that is able to generate open biomedical repositories in Semantic Web formats. SWIT is able to apply the Linked Open Data principles in the generation of the datasets, allowing their content to be linked to external repositories and creating linked open datasets. SWIT datasets may contain data from multiple sources and schemas, thus becoming integrated datasets.
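The core idea of the mapping-driven transformation can be sketched in a few lines: a declarative mapping from source-schema fields to ontology properties is applied uniformly to every row, emitting subject-predicate-object triples. The property IRIs and field names below are hypothetical; this is an illustration of the general technique, not SWIT's actual mapping format.

```python
# Toy sketch of ontology-driven transformation: a field-to-property mapping
# is applied to tabular rows to produce RDF-style triples.
# All IRIs and field names are hypothetical examples.

MAPPING = {
    "gene_symbol": "ex:hasSymbol",
    "organism":    "ex:foundInOrganism",
}

def transform(rows, mapping, subject_prefix="ex:gene/"):
    """Apply the same mapping to every row; reuse it across datasets."""
    triples = []
    for row in rows:
        subject = subject_prefix + row["id"]
        for field, prop in mapping.items():
            if field in row:                      # tolerate sparse rows
                triples.append((subject, prop, row[field]))
    return triples

rows = [{"id": "brca1", "gene_symbol": "BRCA1", "organism": "Homo sapiens"}]
for triple in transform(rows, MAPPING):
    print(triple)
```

Because the mapping is data rather than code, the same `MAPPING` can be reapplied to a new dataset with the same schema, which is the reuse property the abstract highlights.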

  17. CellProfiler and KNIME: open source tools for high content screening.

    PubMed

    Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc

    2013-01-01

High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. Two obstacles to the establishment of HCS in academia are flexibility and cost. Flexibility is important in order to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source packages CellProfiler for image analysis and KNIME for data analysis and data mining, which provide software solutions that increase flexibility and keep costs low.

  18. Unclassified Information Sharing and Coordination in Security, Stabilization, Transition and Reconstruction Efforts

    DTIC Science & Technology

    2008-03-01

…is implemented using the Drupal (2007) content management system (CMS) and many of the baseline information sharing and collaboration tools have...been contributed through the Drupal open source community. Drupal is a very modular open source software written in PHP hypertext preprocessor...needed to suit the particular problem domain. While other frameworks have the potential to provide similar advantages ("Ruby," 2007), Drupal was…

  19. Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy.

    PubMed

    Görlitz, Frederik; Kelly, Douglas J; Warren, Sean C; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J; Stuhmeier, Frank; Neil, Mark A A; Tate, Edward W; Dunsby, Christopher; French, Paul M W

    2017-01-18

    We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set.

  20. Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy

    PubMed Central

    Warren, Sean C.; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A.; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J.; Stuhmeier, Frank; Neil, Mark A. A.; Tate, Edward W.; Dunsby, Christopher; French, Paul M. W.

    2017-01-01

    We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set. PMID:28190060

  1. "Open-Sourcing" Personal Learning

    ERIC Educational Resources Information Center

    Fiedler, Sebastian H.D.

    2014-01-01

    This article offers a critical reflection on the contemporary Open Educational Resource (OER) movement, its unquestioned investment in a collective "content fetish" and an educational "problem description" that focuses on issues of scarcity, access, and availability of quality materials. It also argues that OER proponents fail…

  2. CoSN K12 Open Technologies Implementation Study #3. Moodle: An Open Learning Content Management System for Schools

    ERIC Educational Resources Information Center

    Consortium for School Networking (NJ1), 2008

    2008-01-01

    This report introduces educators to Moodle, an open-source software program for managing courses online. It briefly defines what Moodle is, what it can do, and gives specific examples of how it is being implemented. An appendix contains brief profiles of five school organizations that are using Moodle.

  3. 40 CFR Table 8 to Subpart Wwww of... - Initial Compliance With Organic HAP Emissions Limits

    Code of Federal Regulations, 2010 CFR

    2010-07-01

... SOURCE CATEGORIES National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic... organic HAP emissions limit . . . You have demonstrated initial compliance if . . . 1. open molding and... contents. 2. open molding centrifugal casting, continuous lamination/casting, SMC and BMC manufacturing...

  4. Genotype x environment interactions in eggplant for fruit phenolic acid content

    USDA-ARS?s Scientific Manuscript database

    Eggplant fruit are a rich source of phenolic acids that contribute to fruit nutritive value and influence culinary quality. We evaluated the influence of production environment on eggplant fruit phenolic acid content. Ten Solanum melongena accessions including five F1 hybrid cultivars, three open-...

  5. Google Sky as an Interactive Content Delivery System

    NASA Astrophysics Data System (ADS)

    Parrish, Michael

    2009-05-01

In support of the International Year of Astronomy New Media Task Group's mission to create online astronomy content, several existing technologies are being leveraged. With this undertaking in mind, Google Sky provides an immersive contextual environment for both exploration and content presentation. As such, it affords opportunities for new methods of interactive media delivery. Traditional astronomy news sources and blogs are able to literally set a story at the location of their topic. Furthermore, audio-based material can be complemented by a series of locations in the form of a guided tour. In order to provide automated generation and management of this content, an open source software suite has been developed.

  6. Terror: Social Media and Extremism

    DTIC Science & Technology

    2014-05-01

…rely on user reporting to police content, the Chinese government proactively monitors and removes content from websites it finds inappropriate. As...interpretation of the law left users unsure as to what constituted content that could harm the society. Unabridged laws intent on controlling social media can...what is going on in social media. The internet, especially with the advent of Web 2.0 and user-driven content, has become a goldmine for open source…

  7. Designing a Virtual-Reality-Based, Gamelike Math Learning Environment

    ERIC Educational Resources Information Center

    Xu, Xinhao; Ke, Fengfeng

    2016-01-01

    This exploratory study examined the design issues related to a virtual-reality-based, gamelike learning environment (VRGLE) developed via OpenSimulator, an open-source virtual reality server. The researchers collected qualitative data to examine the VRGLE's usability, playability, and content integration for math learning. They found it important…

  8. OER Use in Intermediate Language Instruction: A Case Study

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2017-01-01

    This paper reports on a case study in the experimental use of Open Educational Resources (OERs) in intermediate level language instruction. The resources come from three sources: the instructor, the students, and open content repositories. The objective of this action research project was to provide student-centered learning materials, enhance…

  9. Open Drug Discovery Teams: A Chemistry Mobile App for Collaboration.

    PubMed

    Ekins, Sean; Clark, Alex M; Williams, Antony J

    2012-08-01

    The Open Drug Discovery Teams (ODDT) project provides a mobile app primarily intended as a research topic aggregator of predominantly open science data collected from various sources on the internet. It exists to facilitate interdisciplinary teamwork and to relieve the user from data overload, delivering access to information that is highly relevant and focused on their topic areas of interest. Research topics include areas of chemistry and adjacent molecule-oriented biomedical sciences, with an emphasis on those which are most amenable to open research at present. These include rare and neglected diseases, and precompetitive and public-good initiatives such as green chemistry. The ODDT project uses a free mobile app as user entry point. The app has a magazine-like interface, and server-side infrastructure for hosting chemistry-related data as well as value added services. The project is open to participation from anyone and provides the ability for users to make annotations and assertions, thereby contributing to the collective value of the data to the engaged community. Much of the content is derived from public sources, but the platform is also amenable to commercial data input. The technology could also be readily used in-house by organizations as a research aggregator that could integrate internal and external science and discussion. The infrastructure for the app is currently based upon the Twitter API as a useful proof of concept for a real time source of publicly generated content. This could be extended further by accessing other APIs providing news and data feeds of relevance to a particular area of interest. As the project evolves, social networking features will be developed for organizing participants into teams, with various forms of communication and content management possible.

  10. Open Drug Discovery Teams: A Chemistry Mobile App for Collaboration

    PubMed Central

    Ekins, Sean; Clark, Alex M; Williams, Antony J

    2012-01-01

    Abstract The Open Drug Discovery Teams (ODDT) project provides a mobile app primarily intended as a research topic aggregator of predominantly open science data collected from various sources on the internet. It exists to facilitate interdisciplinary teamwork and to relieve the user from data overload, delivering access to information that is highly relevant and focused on their topic areas of interest. Research topics include areas of chemistry and adjacent molecule-oriented biomedical sciences, with an emphasis on those which are most amenable to open research at present. These include rare and neglected diseases, and precompetitive and public-good initiatives such as green chemistry. The ODDT project uses a free mobile app as user entry point. The app has a magazine-like interface, and server-side infrastructure for hosting chemistry-related data as well as value added services. The project is open to participation from anyone and provides the ability for users to make annotations and assertions, thereby contributing to the collective value of the data to the engaged community. Much of the content is derived from public sources, but the platform is also amenable to commercial data input. The technology could also be readily used in-house by organizations as a research aggregator that could integrate internal and external science and discussion. The infrastructure for the app is currently based upon the Twitter API as a useful proof of concept for a real time source of publicly generated content. This could be extended further by accessing other APIs providing news and data feeds of relevance to a particular area of interest. As the project evolves, social networking features will be developed for organizing participants into teams, with various forms of communication and content management possible. PMID:23198003

  11. Phytoecdysteroids and flavonoid glycosides among Chilean and commercial sources of Chenopodium quinoa: variation and correlation to physicochemical characteristics

    PubMed Central

    Graf, Brittany; Rojo, Leonel E.; Delatorre-Herrera, Jose; Poulev, Alexander; Calfio, Camila; Raskin, Ilya

    2015-01-01

    BACKGROUND Little is known about varietal differences in the content of bioactive phytoecdysteroids (PE) and flavonoid glycosides (FG) from quinoa (Chenopodium quinoa Willd.). The aim of this study was to determine the variation in PE and FG content among seventeen distinct quinoa sources and identify correlations to genotypic (highland vs. lowland) and physicochemical characteristics (seed color, 100-seed weight, protein content, oil content). RESULTS PE and FG concentrations exhibited over 4-fold differences across quinoa sources, ranging from 138 ± 11 μg/g to 570 ± 124 μg/g total PE content and 192 ± 24 μg/g to 804 ± 91 μg/g total FG content. Mean FG content was significantly higher in highland Chilean varieties (583.6 ± 148.9 μg/g) versus lowland varieties (228.2 ± 63.1 μg/g) grown under the same environmental conditions (P = 0.0046; t-test). Meanwhile, PE content was positively and significantly correlated with oil content across all quinoa sources (r = 0.707, P = 0.002; Pearson correlation). CONCLUSION FG content may be genotypically regulated in quinoa. PE content may be increased via enhancement of oil content. These findings may open new avenues for the improvement and development of quinoa as a functional food. PMID:25683633
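The correlation reported above (r = 0.707 between PE and oil content) is a standard Pearson coefficient, cov(x, y) / (σx · σy). A minimal sketch on made-up numbers shows the computation; the values below are hypothetical, not the study's data.

```python
# Pearson correlation from first principles: cov(x, y) / (sigma_x * sigma_y).
# The oil/PE values are invented for illustration, not the quinoa study's data.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))   # unnormalized covariance
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))          # unnormalized std devs
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)          # normalization factors cancel

oil = [5.0, 5.5, 6.0, 6.5, 7.0]     # hypothetical oil content (%)
pe = [150, 220, 310, 380, 560]      # hypothetical total PE content (ug/g)
r = pearson(oil, pe)
print(round(r, 3))
```

The n (or n-1) normalizations of covariance and standard deviation cancel, which is why the unnormalized sums suffice.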

  12. A Technology Enhanced Learning Model for Quality Education

    NASA Astrophysics Data System (ADS)

    Sherly, Elizabeth; Uddin, Md. Meraj

The Technology Enhanced Learning and Teaching (TELT) Model provides learning through collaboration and interaction, with a framework for content development and a collaborative knowledge-sharing system that supplements learning to improve the quality of the education system. TELT offers a unique pedagogy model for a technology-enhanced learning system that includes a course management system, a digital library, multimedia-enriched content and video lectures, an open content management system, and collaboration and knowledge-sharing systems. Open source tools such as Moodle and wikis for content development, a video-on-demand solution running on a low-cost mid-range system, and an exhaustive digital library are provided in a portal system. The paper presents a case study of e-learning initiatives built on the TELT model at IIITM-K and how they were effectively implemented.

  13. ERMes: Open Source Simplicity for Your E-Resource Management

    ERIC Educational Resources Information Center

    Doering, William; Chilton, Galadriel

    2009-01-01

    ERMes, the latest version of electronic resource management system (ERM), is a relational database; content in different tables connects to, and works with, content in other tables. ERMes requires Access 2007 (Windows) or Access 2008 (Mac) to operate as the database utilizes functionality not available in previous versions of Microsoft Access. The…

  14. Integrating personalized medical test contents with XML and XSL-FO.

    PubMed

    Toddenroth, Dennis; Dugas, Martin; Frankewitsch, Thomas

    2011-03-01

    In 2004 the adoption of a modular curriculum at the medical faculty in Muenster led to the introduction of centralized examinations based on multiple-choice questions (MCQs). We report on how organizational challenges of realizing faculty-wide personalized tests were addressed by implementation of a specialized software module to automatically generate test sheets from individual test registrations and MCQ contents. Key steps of the presented method for preparing personalized test sheets are (1) the compilation of relevant item contents and graphical media from a relational database with database queries, (2) the creation of Extensible Markup Language (XML) intermediates, and (3) the transformation into paginated documents. The software module by use of an open source print formatter consistently produced high-quality test sheets, while the blending of vectorized textual contents and pixel graphics resulted in efficient output file sizes. Concomitantly the module permitted an individual randomization of item sequences to prevent illicit collusion. The automatic generation of personalized MCQ test sheets is feasible using freely available open source software libraries, and can be efficiently deployed on a faculty-wide scale.

  15. Do Open Source LMSs Support Personalization? A Comparative Evaluation

    NASA Astrophysics Data System (ADS)

    Kerkiri, Tania; Paleologou, Angela-Maria

    A number of parameters that capture LMS capabilities for content personalization are presented and substantiated. These parameters constitute critical criteria for an exhaustive investigation of the personalization capabilities of the most popular open source LMSs. Results are comparatively shown and commented upon, thus highlighting a course of conduct for implementing new personalization methodologies for these LMSs, aligned with their existing infrastructure, to maintain support for the numerous educational institutions entrusting a major part of their curricula to them. Meanwhile, new capabilities arise from a more efficient description of the existing resources (especially when organized into widely available repositories) that lead to qualitatively advanced learner-oriented courses which would ideally meet the challenge of combining personification of demand and personalization of thematic content at once.

  16. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    PubMed

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
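    The short-term feature extraction the library implements can be illustrated with a minimal, dependency-free sketch: frame the signal, then compute a feature per frame. This is a generic illustration of the technique, not pyAudioAnalysis's actual API; the function name and parameters are ours.

```python
import math

def short_term_features(signal, frame_len, hop):
    """Split `signal` into overlapping frames and compute two classic
    short-term features per frame: energy and zero-crossing rate."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energy = sum(x * x for x in frame) / frame_len
        zcr = sum(
            1 for a, b in zip(frame, frame[1:]) if a * b < 0
        ) / (frame_len - 1)
        feats.append((energy, zcr))
    return feats

# A 100 Hz unit sine sampled at 8 kHz, framed into 25 ms windows, 10 ms hop.
fs = 8000
sine = [math.sin(2 * math.pi * 100 * n / fs) for n in range(fs)]
features = short_term_features(sine, frame_len=200, hop=80)
```

    For a stationary unit sine the per-frame energy stays near 0.5; speech or music would show the frame-to-frame variation that classifiers and segmenters exploit.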

  17. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis

    PubMed Central

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library. PMID:26656189

  18. 40 CFR Table 2 to Subpart Vvvv of... - Alternative Organic HAP Content Requirements for Open Molding Resin and Gel Coat Operations

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Requirements for Open Molding Resin and Gel Coat Operations 2 Table 2 to Subpart VVVV of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES National Emission Standards for Hazardous Air...

  19. An Open Source Agenda for Research Linking Text and Image Content Features.

    ERIC Educational Resources Information Center

    Goodrum, Abby A.; Rorvig, Mark E.; Jeong, Ki-Tai; Suresh, Chitturi

    2001-01-01

    Proposes methods to utilize image primitives to support term assignment for image classification. Proposes to release code for image analysis in a common tool set for other researchers to use. Of particular focus is the expansion of work by researchers in image indexing to include image content-based feature extraction capabilities in their work.…

  20. 12 CFR 344.5 - Content and time of notification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... transaction; or (C) In the case of open end investment company securities, the bank has provided the customer... remuneration from the customer or any other source in connection with the transaction, a statement of the source and amount of any remuneration to be received if such would be required under paragraph (b)(6) of...

  1. SuML: A Survey Markup Language for Generalized Survey Encoding

    PubMed Central

    Barclay, MW; Lober, WB; Karras, BT

    2002-01-01

    There is a need in clinical and research settings for a sophisticated, generalized, web based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline aware language and an implementation architecture using open source standards.
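    The idea of logic richer than "goto" statements, with content separated from presentation, can be sketched with a small, entirely hypothetical survey fragment (SuML's actual schema is not shown in the abstract; the element and attribute names below are invented for illustration).

```python
import xml.etree.ElementTree as ET

# A hypothetical survey fragment in the spirit described: each item is
# pure content, and an item may declare a condition governing whether
# it is asked -- richer than a bare "goto".
SURVEY = """
<survey>
  <item id="smoker" type="yesno">Do you smoke?</item>
  <item id="packYears" type="number" askIf="smoker == 'yes'">
    Pack-years?
  </item>
</survey>
"""

def items_to_ask(xml_text, answers):
    """Return ids of items whose (hypothetical) askIf condition holds."""
    asked = []
    for item in ET.fromstring(xml_text).iter("item"):
        cond = item.get("askIf")
        # eval is used only on trusted, author-controlled condition strings.
        if cond is None or eval(cond, {}, dict(answers)):
            asked.append(item.get("id"))
    return asked
```

    Presentation (web form layout) would be handled separately, which is the separation of content and presentation the abstract calls for.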

  2. Open source and open content: A framework for global collaboration in social-ecological research

    Treesearch

    Charles Schweik; Tom Evans; J. Morgan Grove

    2005-01-01

    This paper discusses opportunities for alternative collaborative approaches for social-ecological research in general and, in this context, for modeling land-use/land-cover change. In this field, the rate of progress in academic research is steady but perhaps not as rapid or efficient as might be possible with alternative organizational frameworks. The convergence of...

  3. Physics 30 Program Machine-Scorable Open-Ended Questions: Unit 2: Electric and Magnetic Forces. Diploma Examinations Program.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton.

    This document outlines the use of machine-scorable open-ended questions for the evaluation of Physics 30 in Alberta. Contents include: (1) an introduction to the questions; (2) sample instruction sheet; (3) fifteen sample items; (4) item information including the key, difficulty, and source of each item; (5) solutions to items having multiple…

  4. All Rights Reversed: A Study of Copyleft, Open-Source, and Open-Content Licensing

    ERIC Educational Resources Information Center

    Frantsvog, Dean A.

    2012-01-01

    In the United States and much of the world, the current framework of intellectual property laws revolves around protecting others from tampering with an author's work. The copyright holder decides who can use it, who can change it, and who can share. There is a growing school of thought, though, that holds that intellectual creations should be…

  5. Department Involvement in Instructional Materials Development for ODL Study at the Zimbabwe Open University (ZOU)

    ERIC Educational Resources Information Center

    Tanyanyiwa, Vincent Itai; Mutambanengwe, Betty

    2015-01-01

    The teaching and designing of modules at Zimbabwe Open University (ZOU) is the principal responsibility of a single body of teaching staff, although some authors and content reviewers could be sourced from elsewhere if they are not available in ZOU. This survey, through a case study, examines the involvement of lecturers and staff in the…

  6. Rhetorical Savvy as Social Skill: Modeling Entrepreneur Identity Construction within Educational Content Management Systems

    ERIC Educational Resources Information Center

    Spartz, John M.

    2010-01-01

    This article focuses on one aspect of rhetorical training that writing instructors have an opportunity--if not an obligation--to inculcate (or at least introduce) in students studying to be entrepreneurs and taking their writing classes. Specifically, through the use of an open source Content Management System (CMS) (e.g., Drupal or Moodle),…

  7. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  8. Biomass production of multipopulation microalgae in open air pond for biofuel potential.

    PubMed

    Selvakumar, P; Umadevi, K

    2016-04-01

    Biodiesel gains attention as it is made from renewable resources and has considerable environmental benefits. The present investigation focused on large-scale cultivation of multipopulation microalgae in an open air pond using natural sea water without any additional nutritive supplements, for low-cost biomass production as a possible large-scale source of biofuel. The open air algal pond attained an average chlorophyll concentration of 11.01 µg/L with a maximum of 43.65 µg/L, a higher lipid concentration of 18% (w/w) with a lipid content of 9.3 mg/L on the 10th day of the culture, and a maximum biomass of 0.36 g/L on the 7th day of the culture. Composition analysis of fatty acid methyl esters (FAME) was performed by gas chromatography and mass spectrometry (GC-MS). The multipopulation algal biomass had 18% total lipid content comprising 55% saturated fatty acids (SFA), 35.3% monounsaturated fatty acids (MUFA) and 9.7% polyunsaturated fatty acids (PUFA), revealing a potential source of biofuel production at low cost.

  9. The Study of Soil Protection in the System of the Cultivated Lands of Kemerovo Region

    NASA Astrophysics Data System (ADS)

    Yakovchenko, M. A.; Konstantinova, O. B.; Kosolapova, A. A.

    2015-09-01

    The heavy metal content of surface soils reflects the input of these metals over a given period of time. Sources of heavy metals in the soil include precipitation, seeds, dust, organic and mineral fertilizers, and others. The paper studies the heavy metal content in the soils of the waste dumps of open-pit coal mines.

  10. Deepening Pre-Service Teachers' Knowledge of Technology, Pedagogy, and Content (TPACK) in an Elementary School Mathematics Methods Course

    ERIC Educational Resources Information Center

    Polly, Drew

    2014-01-01

    This paper presents the findings of a study that examined pre-service teachers' development of knowledge about technology, pedagogy and content (TPACK) during a mathematics pedagogy course focused on elementary school mathematics in the United States. Data sources included work samples from pre-service teachers as well as an open-ended survey…

  11. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arenas as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry, and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways.
An online survey about the relevance of Open Source was conducted and evaluated with 105 respondents worldwide, and 15 interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with eleven sub-categories in total: "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", and "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and methods of examination. Examinations can be complemented by evidence of professional career paths and achievements, which requires peer evaluation. After a couple of years recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support from a group of geospatial scientific institutions, to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.

  12. Collaborate!

    ERIC Educational Resources Information Center

    Villano, Matt

    2007-01-01

    This article explores different approaches that facilitate online collaboration. The newest efforts in collaboration revolve around wikis. These websites allow visitors to add, remove, edit, and change content directly online. Another fairly affordable approach involves open source, a programming language that is, in many ways, collaborative…

  13. Criteria for Public Open Space Enhancement to Achieve Social Interaction: a Review Paper

    NASA Astrophysics Data System (ADS)

    Salih, S. A.; Ismail, S.

    2017-12-01

    This paper reviews literature, studies, transcripts, and papers to provide an overview of theories and existing research on the significance of natural environments and green open spaces for social interaction and outdoor recreation. The main objective is to identify the factors affecting social interaction in green open spaces, by showing that appropriate open spaces are important for enhancing social interaction and community. The study employs a qualitative summarizing content analysis, collecting and summarizing documentation such as transcripts, articles, papers, and books from more than 25 sources on the importance of public open spaces for the community. This summarizing content analysis forms the basis of a qualitatively oriented procedure of text interpretation used to analyse the information gathered. Results confirm that sound social interaction needs an appropriate physical space meeting criteria of design, activities, access and linkage, administration and maintenance, place attachment, and users’ characteristics; previous studies in this area have also taken a health perspective, with measures of physical activity in open spaces in general.

  14. A review on the sources and spatial-temporal distributions of Pb in Jiaozhou Bay

    NASA Astrophysics Data System (ADS)

    Yang, Dongfang; Zhang, Jie; Wang, Ming; Zhu, Sixi; Wu, Yunjie

    2017-12-01

    This paper provides a review of the sources, spatial distribution, and temporal variation of Pb in Jiaozhou Bay, based on investigations of Pb in surface and bottom waters in different seasons during 1979-1983. The strengths of Pb sources in Jiaozhou Bay showed increasing trends, and the pollution level of Pb in the bay was slight to moderate in the early stage of reform and opening-up. Pb contents in the marine bay were mainly determined by the strength and frequency of Pb inputs from human activities, and Pb could move from high-content areas to low-content areas in the ocean interior. Surface waters in the bay were polluted by human activities, while bottom waters were polluted through vertical water exchange. The spatial distribution of Pb in the waters developed in three steps: 1) Pb was transferred to surface waters in the bay, 2) Pb was transported within the surface waters, and 3) Pb was transferred to and accumulated in bottom waters.

  15. osni.info-Using free/libre/open source software to build a virtual international community for open source nursing informatics.

    PubMed

    Oyri, Karl; Murray, Peter J

    2005-12-01

    Many health informatics organizations seem slow to take up the advantages of dynamic, web-based technologies for providing services to, and interaction with, their members; these are often the very technologies they promote for use within healthcare environments. This paper aims to introduce some of the many free/libre/open source (FLOSS) applications that are now available for developing interactive websites and dynamic online communities as part of the structure of health informatics organizations, and to show how the Open Source Nursing Informatics Working Group (OSNI) of the special interest group in nursing informatics of the International Medical Informatics Association (IMIA-NI) is using some of these tools to develop an online community of nurse informaticians through their website. Some background on FLOSS applications is provided for the benefit of those less familiar with such tools, and examples of some of the FLOSS content management systems (CMS) being used by OSNI are described. The experiences of the OSNI will facilitate a knowledgeable nursing contribution to the wider discussions on the applications of FLOSS within health and healthcare, and provide a model that many other groups could adopt.

  16. Open and Crowd-Sourced Data for Treaty Verification

    DTIC Science & Technology

    2014-10-01

    the case of user-generated Internet content, such as Wikipedia, or Amazon reviews. Another example is the Zooniverse citizen science project, which... number of potential observations can in many instances make up for relatively crude measurements made by the public. Users are motivated to

  17. Comparison of petroleum generation kinetics by isothermal hydrous and nonisothermal open-system pyrolysis

    USGS Publications Warehouse

    Lewan, M.D.; Ruble, T.E.

    2002-01-01

    This study compares kinetic parameters determined by open-system pyrolysis and hydrous pyrolysis using aliquots of source rocks containing different kerogen types. Kinetic parameters derived from these two pyrolysis methods not only differ in the conditions employed and products generated, but also in the derivation of the kinetic parameters (i.e., isothermal linear regression and non-isothermal nonlinear regression). Results of this comparative study show that there is no correlation between kinetic parameters derived from hydrous pyrolysis and open-system pyrolysis. Hydrous-pyrolysis kinetic parameters determine narrow oil windows that occur over a wide range of temperatures and depths depending in part on the organic-sulfur content of the original kerogen. Conversely, open-system kinetic parameters determine broad oil windows that show no significant differences with kerogen types or their organic-sulfur contents. Comparisons of the kinetic parameters in a hypothetical thermal-burial history (2.5 °C/my) show open-system kinetic parameters significantly underestimate the extent and timing of oil generation for Type-IIS kerogen and significantly overestimate the extent and timing of petroleum formation for Type-I kerogen compared to hydrous pyrolysis kinetic parameters. These hypothetical differences determined by the kinetic parameters are supported by natural thermal-burial histories for the Naokelekan source rock (Type-IIS kerogen) in the Zagros basin of Iraq and for the Green River Formation (Type-I kerogen) in the Uinta basin of Utah. Differences in extent and timing of oil generation determined by open-system pyrolysis and hydrous pyrolysis can be attributed to the former not adequately simulating natural oil generation conditions, products, and mechanisms.
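    The timing differences described above can be reproduced qualitatively by integrating a single first-order Arrhenius reaction along a linear burial heating path. The activation energies and frequency factors below are illustrative only, not the paper's fitted parameters; the point is merely that different (Ea, A) pairs shift the onset of conversion along the same 2.5 °C/my heating history.

```python
import math

R = 8.314            # gas constant, J/(mol*K)
SEC_PER_MY = 3.156e13  # seconds per million years

def conversion_history(ea_kj, a_per_sec, t0_c=50.0, rate_c_per_my=2.5,
                       steps=20000, t_end_c=250.0):
    """Integrate first-order kinetics dX/dt = A*exp(-Ea/RT)*(1-X)
    along a linear heating path (forward-Euler; illustrative only)."""
    x = 0.0
    dT = (t_end_c - t0_c) / steps            # deg C per step
    dt = (dT / rate_c_per_my) * SEC_PER_MY   # seconds per step
    history = []
    temp = t0_c
    for _ in range(steps):
        k = a_per_sec * math.exp(-ea_kj * 1e3 / (R * (temp + 273.15)))
        x = min(x + k * (1.0 - x) * dt, 1.0)
        temp += dT
        history.append((temp, x))
    return history

# Two hypothetical parameter sets: conversion for the low-Ea/low-A pair
# starts and finishes at lower temperature than for the high-Ea pair.
high_ea = conversion_history(ea_kj=220.0, a_per_sec=1e14)
low_ea = conversion_history(ea_kj=180.0, a_per_sec=1e10)
```

    Running kerogen-specific parameter sets through the same burial history is how the extent-and-timing comparisons in the abstract are made.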

  18. Windows Memory Forensic Data Visualization

    DTIC Science & Technology

    2014-06-12

    clustering characteristics (Bastian, et al., 2009). The software is written in Java and utilizes the OpenGL library for rendering graphical content...

  19. Assessing the contribution of wetlands and subsided islands to dissolved organic matter and disinfection byproduct precursors in the Sacramento-San Joaquin River Delta: A geochemical approach

    USGS Publications Warehouse

    Kraus, T.E.C.; Bergamaschi, B.A.; Hernes, P.J.; Spencer, R.G.M.; Stepanauskas, R.; Kendall, C.; Losee, R.F.; Fujii, R.

    2008-01-01

    This study assesses how rivers, wetlands, island drains and open water habitats within the Sacramento-San Joaquin River Delta affect dissolved organic matter (DOM) content and composition, and disinfection byproduct (DBP) formation. Eleven sites representative of these habitats were sampled on six dates to encompass seasonal variability. Using a suite of qualitative analyses, including specific DBP formation potential, absorbance, fluorescence, lignin content and composition, C and N stable isotopic compositions, and structural groupings determined using CPMAS (cross polarization, magic angle spinning) 13C NMR, we applied a geochemical fingerprinting approach to characterize the DOM from different Delta habitats, and infer DOM and DBP precursor sources and estimate the relative contribution from different sources. Although river input was the predominant source of dissolved organic carbon (DOC), we observed that 13-49% of the DOC exported from the Delta originated from sources within the Delta, depending on season. Interaction with shallow wetlands and subsided islands significantly increased DOC and DBP precursor concentrations and affected DOM composition, while deep open water habitats had little discernable effect. Shallow wetlands contributed the greatest amounts of DOM and DBP precursors in the spring and summer, in contrast to island drains which appeared to be an important source during winter months. The DOM derived from wetlands and island drains had greater haloacetic acid precursor content relative to incoming river water, while two wetlands contributed DOM with greater propensity to form trihalomethanes. These results are pertinent to restoration of the Delta. Large scale introduction of shallow wetlands, a proposed restoration strategy, could alter existing DOC and DBP precursor concentrations, depending on their hydrologic connection to Delta channels. © 2008 Elsevier Ltd.
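    The source-apportionment logic behind such geochemical fingerprinting can be illustrated with a conservative two-end-member mixing model. The tracer values below are hypothetical, not the study's measurements; real apportionment would use several tracers and more end members.

```python
def two_end_member_fraction(sample, source_a, source_b):
    """Fraction of a sample's tracer signal attributable to source A under
    conservative two-end-member mixing:
        tracer_sample = f * tracer_A + (1 - f) * tracer_B
    """
    f = (sample - source_b) / (source_a - source_b)
    if not 0.0 <= f <= 1.0:
        raise ValueError("sample lies outside the mixing line")
    return f

# Hypothetical delta-13C end members (per mil): wetland DOM vs river DOM.
wetland, river = -28.0, -25.0
f_wetland = two_end_member_fraction(sample=-26.2, source_a=wetland,
                                    source_b=river)
```

    With these invented values, 40% of the sampled DOC would be attributed to the wetland end member.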

  20. PyPedia: using the wiki paradigm as crowd sourcing environment for bioinformatics protocols.

    PubMed

    Kanterakis, Alexandros; Kuiper, Joël; Potamias, George; Swertz, Morris A

    2015-01-01

    Today researchers can choose from many bioinformatics protocols for all types of life sciences research, computational environments and coding languages. Although the majority of these are open source, few of them possess all virtues to maximize reuse and promote reproducible science. Wikipedia has proven a great tool to disseminate information and enhance collaboration between users with varying expertise and background to author qualitative content via crowdsourcing. However, it remains an open question whether the wiki paradigm can be applied to bioinformatics protocols. We piloted PyPedia, a wiki where each article is both implementation and documentation of a bioinformatics computational protocol in the python language. Hyperlinks within the wiki can be used to compose complex workflows and induce reuse. A RESTful API enables code execution outside the wiki. Initial content of PyPedia contains articles for population statistics, bioinformatics format conversions and genotype imputation. Use of the easy to learn wiki syntax effectively lowers the barriers to bring expert programmers and less computer savvy researchers on the same page. PyPedia demonstrates how wiki can provide a collaborative development, sharing and even execution environment for biologists and bioinformaticians that complement existing resources, useful for local and multi-center research teams. PyPedia is available online at: http://www.pypedia.com. The source code and installation instructions are available at: https://github.com/kantale/PyPedia_server. The PyPedia python library is available at: https://github.com/kantale/pypedia. PyPedia is open-source, available under the BSD 2-Clause License.

  1. Digital beacon receiver for ionospheric TEC measurement developed with GNU Radio

    NASA Astrophysics Data System (ADS)

    Yamamoto, M.

    2008-11-01

    A simple digital receiver named GNU Radio Beacon Receiver (GRBR) was developed for satellite-ground beacon experiments measuring ionospheric total electron content (TEC). The open-source software-defined-radio toolkit GNU Radio is utilized to realize the basic functions of the receiver and perform fast signal processing. The software is written in Python for a Linux PC. The open-source hardware called Universal Software Radio Peripheral (USRP), which best matches GNU Radio, is used as a front-end to acquire the satellite beacon signals at 150 and 400 MHz. The first experiment was successful: results from GRBR showed very good agreement with those from the co-located analog beacon receiver. Detailed design information and software code are available at http://www.rish.kyoto-u.ac.jp/digitalbeacon/.
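    The quantity a dual-frequency beacon receiver like GRBR extracts is the differential phase of the two coherent downlinks, which relates to slant TEC through the standard first-order ionospheric formula. The sketch below assumes the 400 MHz phase is scaled by f1/f2 before differencing; function and variable names are ours, and because of the cycle ambiguity only relative TEC is recoverable.

```python
import math

C = 2.998e8             # speed of light, m/s
K = 40.3                # first-order ionospheric constant, m^3/s^2
F1, F2 = 150e6, 400e6   # beacon frequencies, Hz

def relative_tec(delta_phi_rad):
    """Relative slant TEC (electrons/m^2) from the differential phase
    delta_phi = phi(f1) - (f1/f2)*phi(f2) of two coherent beacons,
    using phi(f) = 2*pi*K*TEC / (c*f) for the first-order ionosphere."""
    return delta_phi_rad * C * F1 * F2**2 / (2 * math.pi * K * (F2**2 - F1**2))

# Differential phase corresponding to one TEC unit (1 TECU = 1e16 el/m^2):
one_tecu_phase = 1e16 * (2 * math.pi * K / C) * (F2**2 - F1**2) / (F1 * F2**2)
```

    For the 150/400 MHz pair, one TECU corresponds to roughly 48 rad of differential phase, which is why phase tracking gives such a sensitive TEC measurement.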

  2. A novel low-cost open-hardware platform for monitoring soil water content and multiple soil-air-vegetation parameters.

    PubMed

    Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana

    2014-10-21

    Monitoring soil water content at high spatio-temporal resolution and coupled to other sensor data is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and number of sensors. The objective of this work was to design a low cost "open hardware" platform for multi-sensor measurements including water content at different depths, air and soil temperatures. The system is based on an open-source ARDUINO microcontroller-board, programmed in a simple integrated development environment (IDE). Low cost high-frequency dielectric probes were used in the platform and lab tested on three non-saline soils (ECe1: 2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method), and normalized root mean square error (NRMSE) were respectively 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R2 = 0.89) showing a high stability, being able to generate very similar RMSEs during training and validation (RMSE(training) = 2.63; RMSE(validation) = 2.61). Data recorded on the card were automatically sent to a remote server allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low cost platform, consistent with the open source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies.
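    The leave-one-out validation of the calibration curves can be sketched as follows. A simple linear calibration and hypothetical probe readings stand in for the paper's empirical curves, and NRMSE is normalised here by the observed range of the reference values (one common convention; the paper does not state which it used).

```python
def loo_nrmse(xs, ys):
    """Leave-one-out cross-validation of a linear calibration y = a*x + b:
    fit on all-but-one point, predict the held-out point, and report the
    RMSE of the held-out errors normalised by the range of y."""
    errors = []
    for i in range(len(xs)):
        xt = [x for j, x in enumerate(xs) if j != i]
        yt = [y for j, y in enumerate(ys) if j != i]
        n = len(xt)
        mx, my = sum(xt) / n, sum(yt) / n
        a = sum((x - mx) * (y - my) for x, y in zip(xt, yt)) / \
            sum((x - mx) ** 2 for x in xt)
        b = my - a * mx
        errors.append(ys[i] - (a * xs[i] + b))
    rmse = (sum(e * e for e in errors) / len(errors)) ** 0.5
    return rmse / (max(ys) - min(ys))

# Hypothetical probe readings (raw counts) vs reference water content (%).
raw = [120, 180, 260, 330, 410, 470]
vwc = [5.0, 10.2, 14.8, 20.1, 25.3, 29.9]
score = loo_nrmse(raw, vwc)
```

    An NRMSE near 0.07-0.09, as reported per soil, indicates held-out prediction errors of under a tenth of the measured water-content range.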

  3. A Novel Low-Cost Open-Hardware Platform for Monitoring Soil Water Content and Multiple Soil-Air-Vegetation Parameters

    PubMed Central

    Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana

    2014-01-01

    Monitoring soil water content at high spatio-temporal resolution, coupled to other sensor data, is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and the number of sensors. The objective of this work was to design a low-cost "open hardware" platform for multi-sensor measurements including water content at different depths and air and soil temperatures. The system is based on an open-source Arduino microcontroller board, programmed in a simple integrated development environment (IDE). Low-cost high-frequency dielectric probes were used in the platform and lab-tested on three non-saline soils (EC 1:2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method); the normalized root mean square errors (NRMSE) were 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam, and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R² = 0.89) and showed high stability, generating very similar RMSEs during training and validation (RMSE_training = 2.63; RMSE_validation = 2.61). Data recorded on the card were automatically sent to a remote server, allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low-cost platform, consistent with the open-source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies. PMID:25337742

  4. The Case for Open Source Software: The Interactional Discourse Lab

    ERIC Educational Resources Information Center

    Choi, Seongsook

    2016-01-01

    Computational techniques and software applications for the quantitative content analysis of texts are now well established, and many qualitative data software applications enable the manipulation of input variables and the visualization of complex relations between them via interactive and informative graphical interfaces. Although advances in…

  5. Audiovisual heritage preservation in Earth and Space Science Informatics: Videos from Free and Open Source Software for Geospatial (FOSS4G) conferences in the TIB|AV-Portal.

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Marín Arraiza, Paloma; Plank, Margret

    2016-04-01

    The influence of Free and Open Source Software (FOSS) projects on Earth and Space Science Informatics (ESSI) continues to grow, particularly in the emerging context of Data Science and Open Science. The scientific significance and heritage of FOSS projects is only partially covered by traditional journal articles: audiovisual conference recordings contain significant information for analysis, reference, and citation. In the context of data-driven research, this audiovisual content needs to be accessible through effective search capabilities, enabling the content to be searched in depth and retrieved; this also ensures that content producers receive credit for their efforts within their respective communities. For Geoinformatics and ESSI, one distinguished driver is the OSGeo Foundation (OSGeo), founded in 2006 to support and promote the interdisciplinary, collaborative development of open geospatial technologies and data. Its organisational structure is based on software projects that have successfully passed the OSGeo incubation process, proving their compliance with FOSS licence models. This quality assurance is crucial for transparent and unhindered application in (Open) Science. The main communication channels within and between the OSGeo-hosted community projects are face-to-face conferences on national, regional, and global scales. Video recordings have complemented the scientific proceedings since 2006. During the last decade, the growing body of OSGeo videos has been negatively affected by content loss, the obsolescence of video technology, and dependence on commercial video portals. Worse, distributed storage and a lack of metadata do not guarantee concise and efficient access to the content. This limits retrospective analysis of video content from past conferences. It also indicates a need for reliable, standardized, comparable audiovisual repositories for the future, as the number of OSGeo projects continues to grow, and with it the number of topics addressed at conferences. Up to now, commercial Web 2.0 platforms like YouTube and Vimeo have been used. However, these platforms lack capabilities for long-term archiving and scientific citation, such as persistent identifiers that permit the citation of specific intervals of the overall content. To address these issues, the scientific library community has started to implement improved multimedia archiving and retrieval services for scientific audiovisual content which fulfil these requirements. Using the reference case of the OSGeo conference video recordings, this paper gives an overview of the new and growing collection activities of the German National Library of Science and Technology (TIB) for audiovisual content in Geoinformatics/ESSI in the TIB|AV-Portal. Following a successful start in 2014 and positive response from the OSGeo community, the TIB acquisition strategy for OSGeo video material was extended to include German, European, North American, and global conference content. The collection grows steadily through new conference content and through the harvesting of past conference videos from commercial Web 2.0 platforms like YouTube and Vimeo. This positions the TIB|AV-Portal as a reliable and concise long-term resource for innovation mining, education, and scholarly research within the ESSI context, both in academia and industry.

  6. ggCyto: Next Generation Open-Source Visualization Software for Cytometry.

    PubMed

    Van, Phu; Jiang, Wenxin; Gottardo, Raphael; Finak, Greg

    2018-06-01

    Open-source software for computational cytometry has gained in popularity over the past few years. Efforts such as FlowCAP and the Lyoplate and Euroflow projects have highlighted the importance of standardizing both the experimental and computational aspects of cytometry data analysis. The R/Bioconductor platform hosts the largest collection of open-source cytometry software, covering all aspects of data analysis and providing infrastructure to represent and analyze cytometry data with all relevant experimental, gating, and cell population annotations, enabling fully reproducible data analysis. Data visualization frameworks to support this infrastructure have lagged behind. ggCyto is a new open-source Bioconductor package for cytometry data visualization built on ggplot2 that enables ggplot-like functionality with the core Bioconductor flow cytometry data structures. Among its features are the ability to transform data and axes on the fly using cytometry-specific transformations, plot faceting by experimental metadata variables, and partial matching of channel, marker, and cell population names to the contents of the Bioconductor cytometry data structures. We demonstrate the salient features of the package using publicly available cytometry data, with complete reproducible examples in a supplementary vignette. Availability: https://bioconductor.org/packages/devel/bioc/html/ggcyto.html. Contact: gfinak@fredhutch.org. Supplementary data are available at Bioinformatics online and at http://rglab.org/ggcyto/.

  7. On providing the fault-tolerant operation of information systems based on open content management systems

    NASA Astrophysics Data System (ADS)

    Kratov, Sergey

    2018-01-01

    Modern information systems designed to serve a wide range of users, regardless of their subject area, are increasingly based on Web technologies and are available to users via the Internet. The article discusses the issues of providing fault-tolerant operation for such information systems built on free and open-source content management systems. The toolkit available to administrators of such systems is shown, and scenarios for using these tools are described. Options for organizing backups and restoring the operability of systems after failures are suggested. Applying the proposed methods and approaches allows continuous monitoring of system state, timely response to the emergence of possible problems, and their prompt solution.
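
    One of the recovery tools such an article typically covers is scheduled backups with a retention policy. As a minimal illustration (the naming scheme and the retention rule are our assumptions, not taken from the article), a function that decides which timestamped CMS database dumps fall outside the retention window:

    ```python
    def snapshots_to_delete(snapshot_names, keep=3):
        """snapshot_names: file names carrying a sortable date stamp, e.g.
        'cms-backup-20180115.sql.gz'. Returns the names falling outside the
        retention window (everything but the `keep` newest snapshots)."""
        ordered = sorted(snapshot_names, reverse=True)  # newest first
        return ordered[keep:]
    ```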

  8. Engineer. The Professional Bulletin of Army Engineers. Volume 42. May-August 2012

    DTIC Science & Technology

    2012-08-01

    reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of...better than it is. I hope that you can see the intent to include good ideas from all sources in the contents of this professional bulletin; in the...openness displayed during visits around the world by the commandant, the regimental command sergeant major, and regimental chief warrant officer; and

  9. Participatory Development Strategies for Open Source Content Management Systems

    ERIC Educational Resources Information Center

    Panke, Stefanie; Kohls, Christian; Gaiser, Birgit

    2007-01-01

    Stefanie Panke, Christian Kohls, and Birgit Gaiser maintain that effective strategies for the development of educational technology can only arise when the process is understood, analyzed, and assessed as a social phenomenon, and when the experience of users is integrated within the design process. To illustrate, they describe the early…

  10. Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks

    NASA Astrophysics Data System (ADS)

    Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.

    2010-12-01

    Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked, internally and externally, on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. In particular, it is increasingly hard to deliver relevant metadata and data-processing lineage information along with the actual content in a consistent way. Readme files, data quality information, production provenance, and other descriptive metadata are often separated from the data, both at the storage level and in the search and retrieval interfaces available to a user. Critical archival metadata, such as audit trails and integrity checks, are often even harder for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges: the Fedora Commons framework and its digital object abstraction as the repository, the Drupal CMS as the user interface, and the Islandora module as the connector from Drupal to the Fedora repository. With the digital object model, metadata describing the data and its provenance can be formally associated with the data content, as can external references and other auxiliary information. Changes to an object are formally audited, and digital contents are versioned, with checksums computed automatically. Further, relationships among objects are formally expressed as RDF triples. Data replication, recovery, and metadata export are supported via standard protocols such as OAI-PMH.
We provide a tentative comparative analysis of the chosen software stack with the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA’s ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
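
    The digital object abstraction described above can be miniaturized to show the moving parts: content and metadata bundled in one object, automatically computed checksums, versioning, and RDF-style relationship triples. A toy stdlib-Python sketch; the class, method, and predicate names are ours for illustration, not Fedora's API.

    ```python
    import hashlib

    class DigitalObject:
        def __init__(self, pid):
            self.pid = pid
            self.versions = []   # list of (content bytes, sha256 hex digest)
            self.metadata = {}   # descriptive / provenance metadata
            self.triples = []    # (subject, predicate, object) relations

        def save_content(self, data):
            """Append a new immutable version with its checksum."""
            digest = hashlib.sha256(data).hexdigest()
            self.versions.append((data, digest))
            return digest

        def verify(self, version=-1):
            """Fixity check: does the stored digest still match the bytes?"""
            data, digest = self.versions[version]
            return hashlib.sha256(data).hexdigest() == digest

        def relate(self, predicate, other_pid):
            """Record an RDF-like relationship to another object."""
            self.triples.append((self.pid, predicate, other_pid))
    ```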

  11. XaNSoNS: GPU-accelerated simulator of diffraction patterns of nanoparticles

    NASA Astrophysics Data System (ADS)

    Neverov, V. S.

    XaNSoNS is open-source software with GPU support that simulates X-ray and neutron 1D (or 2D) diffraction patterns and pair-distribution functions (PDF) for amorphous or crystalline nanoparticles (up to ~10⁷ atoms) of heterogeneous structural content. Among the multiple parameters of the structure, the user may specify atomic displacements, site occupancies, molecular displacements, and molecular rotations. The software uses general equations not specific to crystalline structures to calculate the scattering intensity. It supports four major standards of parallel computing (MPI, OpenMP, Nvidia CUDA, and OpenCL), enabling it to run on various architectures, from CPU-based HPCs to consumer-level GPUs.

  12. Improving Writing with Wiki Discussion Forums

    ERIC Educational Resources Information Center

    Corrigan, John

    2010-01-01

    Wikis are open-source sites, meaning that users may add, remove, or edit most content quickly. Because they are a public venue, students became more engaged and invested in what they wrote, wrote more frequently, edited their work more carefully, collaborated, and became accustomed to frequent peer and adult feedback. Writing practice and in-depth…

  13. Affordable Open-Source Data Loggers for Distributed Measurements of Sap-Flux, Stem Growth, Relative Humidity, Temperature, and Soil Water Content

    NASA Astrophysics Data System (ADS)

    Anderson, T.; Jencso, K. G.; Hoylman, Z. H.; Hu, J.

    2015-12-01

    Characterizing the mechanisms that lead to differences in forest ecosystem productivity across complex terrain remains a challenge. This difficulty can be partially attributed to the cost of installing networks of proprietary data loggers that monitor differences in the biophysical factors contributing to tree growth. Here, we describe the development and initial application of a network of open source data loggers. These data loggers are based on the Arduino platform, but were refined into a custom printed circuit board (PCB). This reduced the cost and complexity of the data loggers, which made them cheap to reproduce and reliable enough to withstand the harsh environmental conditions experienced in Ecohydrology studies. We demonstrate the utility of these loggers for high frequency, spatially-distributed measurements of sap-flux, stem growth, relative humidity, temperature, and soil water content across 36 landscape positions in the Lubrecht Experimental Forest, MT, USA. This new data logging technology made it possible to develop a spatially distributed monitoring network within the constraints of our research budget and may provide new insights into factors affecting forest productivity across complex terrain.

  14. Applying representational state transfer (REST) architecture to archetype-based electronic health record systems

    PubMed Central

    2013-01-01

    Background: The openEHR project and the closely related ISO 13606 standard have defined structures supporting the content of Electronic Health Records (EHRs). However, there is not yet any finalized openEHR specification of a service interface to aid application developers in creating, accessing, and storing the EHR content. The aim of this paper is to explore how the Representational State Transfer (REST) architectural style can be used as a basis for a platform-independent, HTTP-based openEHR service interface. Associated benefits and tradeoffs of such a design are also explored. Results: The main contribution is the formalization of the openEHR storage, retrieval, and version-handling semantics and related services into an implementable HTTP-based service interface. The modular design makes it possible to prototype, test, replicate, distribute, cache, and load-balance the system using ordinary web technology. Other contributions are approaches to query and retrieval of the EHR content that take caching, logging, and distribution into account. Triggering on EHR change events is also explored. A final contribution is an open source openEHR implementation using the above-mentioned approaches to create LiU EEE, an educational EHR environment intended to help newcomers and developers experiment with and learn about the archetype-based EHR approach and enable rapid prototyping. Conclusions: Using REST addressed many architectural concerns in a successful way, but an additional messaging component was needed to address some architectural aspects. Many of our approaches are likely of value to other archetype-based EHR implementations and may contribute to associated service model specifications. PMID:23656624

  15. Applying representational state transfer (REST) architecture to archetype-based electronic health record systems.

    PubMed

    Sundvall, Erik; Nyström, Mikael; Karlsson, Daniel; Eneling, Martin; Chen, Rong; Örman, Håkan

    2013-05-09

    The openEHR project and the closely related ISO 13606 standard have defined structures supporting the content of Electronic Health Records (EHRs). However, there is not yet any finalized openEHR specification of a service interface to aid application developers in creating, accessing, and storing the EHR content. The aim of this paper is to explore how the Representational State Transfer (REST) architectural style can be used as a basis for a platform-independent, HTTP-based openEHR service interface. Associated benefits and tradeoffs of such a design are also explored. The main contribution is the formalization of the openEHR storage, retrieval, and version-handling semantics and related services into an implementable HTTP-based service interface. The modular design makes it possible to prototype, test, replicate, distribute, cache, and load-balance the system using ordinary web technology. Other contributions are approaches to query and retrieval of the EHR content that take caching, logging, and distribution into account. Triggering on EHR change events is also explored. A final contribution is an open source openEHR implementation using the above-mentioned approaches to create LiU EEE, an educational EHR environment intended to help newcomers and developers experiment with and learn about the archetype-based EHR approach and enable rapid prototyping. Using REST addressed many architectural concerns in a successful way, but an additional messaging component was needed to address some architectural aspects. Many of our approaches are likely of value to other archetype-based EHR implementations and may contribute to associated service model specifications.
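
    The version-handling semantics the paper formalizes reduce to a simple invariant: every update appends an immutable version, and both the latest and any historical version remain addressable (by URI, in the REST mapping). An illustrative in-memory model with names of our choosing; this is not the LiU EEE code.

    ```python
    class VersionedEhrStore:
        def __init__(self):
            self._store = {}   # ehr_id -> list of versions (index+1 = version no.)

        def put(self, ehr_id, composition):
            """Like POST/PUT: append a new version, return its version number."""
            versions = self._store.setdefault(ehr_id, [])
            versions.append(composition)
            return len(versions)

        def get(self, ehr_id, version=None):
            """Like GET: latest version by default, or a specific
            historical one when a version number is given."""
            versions = self._store[ehr_id]
            return versions[-1] if version is None else versions[version - 1]
    ```

    Because versions are append-only, responses for historical versions never change, which is exactly what makes them cache-friendly in a REST design.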

  16. Emerging adults' perceptions of messages about physical appearance.

    PubMed

    Gillen, Meghan M; Lefkowitz, Eva S

    2009-06-01

    Emerging adults receive messages about physical appearance from a range of sources, but few studies have examined the content of these messages. Undergraduates (N=154) who identified as African American, Latino American, and European American answered 4 open-ended questions about messages they perceived about physical appearance from family, peers, school, and media. Raters coded responses for content and affect. The most common messages perceived were the importance/non-importance of appearance, positive comments about appearance, and the link between attractiveness and success. The perception of these messages frequently differed by gender and source, but rarely by ethnicity. Women perceived more frequent and more negative messages than did men. Individuals perceived the media as transmitting more negative messages and the family more healthful and positive ones.

  17. Crawling The Web for Libre: Selecting, Integrating, Extending and Releasing Open Source Software

    NASA Astrophysics Data System (ADS)

    Truslove, I.; Duerr, R. E.; Wilcox, H.; Savoie, M.; Lopez, L.; Brandt, M.

    2012-12-01

    Libre is a project developed by the National Snow and Ice Data Center (NSIDC), devoted to liberating science data from its traditional constraints of publication, location, and findability. Libre embraces and builds on the notion of making knowledge freely available, and both Creative Commons-licensed content and Open Source Software are crucial building blocks for, as well as required deliverable outcomes of, the project. One important aspect of the Libre project is to discover cryospheric data published on the internet without prior knowledge of the location or even the existence of those data. Inspired by well-known search engines and their underlying web-crawling technologies, Libre has explored the tools and technologies required to build a search engine tailored to let users easily discover geospatial data related to the polar regions. After careful consideration, the Libre team decided to base its web-crawling work on the Apache Nutch project (http://nutch.apache.org). Nutch is "an open source web-search software project" written in Java, with good documentation, a significant user base, and an active development community. Nutch was installed and configured to search for the types of data of interest, and the team created plugins to customize the default Nutch behavior to better find and categorize these data feeds. This presentation recounts the Libre team's experiences selecting, using, and extending Nutch, and working with the Nutch user and developer community. We will outline the technical and organizational challenges faced in releasing the project's software as Open Source, and detail the steps actually taken. We distill these experiences into a set of heuristics and recommendations for using, contributing to, and releasing Open Source Software.
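
    Stripped of the network layer, the Nutch customization described above amounts to crawling a frontier of links while a plugin decides which documents count as relevant data feeds. A toy stand-in over an in-memory link graph, where the predicate plays the role of a Nutch plugin; all names and data here are ours, for illustration only.

    ```python
    from collections import deque

    def crawl(graph, seeds, is_relevant):
        """graph: url -> (outgoing links, page text). Crawls breadth-first
        from the seeds and returns the relevant urls in visit order."""
        seen, hits = set(seeds), []
        frontier = deque(seeds)
        while frontier:
            url = frontier.popleft()
            links, text = graph.get(url, ([], ""))
            if is_relevant(text):          # the "plugin" decision
                hits.append(url)
            for nxt in links:              # grow the frontier
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return hits
    ```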

  18. High fluoride water in Bondo-Rarieda area of Siaya County, Kenya: a hydro-geological implication on public health in the Lake Victoria Basin.

    PubMed

    Wambu, Enos W; Agong, Stephen G; Anyango, Beatrice; Akuno, Walter; Akenga, Teresa

    2014-05-17

    Only a few studies to evaluate groundwater fluoride in Eastern Africa have been undertaken outside the volcanic belt of the Great Eastern Africa Rift Valley. The extent and impact of water fluoride outside these regions therefore remain unclear. The current study evaluated fluoride levels in household water sources in the Bondo-Rarieda area in the Kenyan part of the Lake Victoria Basin (LVB) and highlighted the risk posed by water fluoride to the resident communities. The results, it was anticipated, would contribute to an in-depth understanding of the fluoride problem in the region. A total of 128 water samples were collected from different water sources across the study area and analyzed for fluoride content using ion-selective electrodes. Lake Victoria was the main water source in the area, but dams and open pans (39.5%), boreholes and shallow wells (23.5%), and streams (18.5%) were the principal water sources beyond walking distance from the lake. The overall mean fluoride content of the water exceeded recommended limits for drinking water. The mean water fluoride was highest in Uyoma (1.39±0.84 ppm), Nyang'oma (1.00±0.59 ppm), and Asembo (0.92±0.46 ppm) and lowest in Maranda Division (0.69±0.42 ppm). Ponds (1.41±0.82 ppm), springs (1.25±0.43 ppm), dams and open pans (0.96±0.79 ppm), and streams (0.95±0.41 ppm) had the highest fluoride levels, but lake and river water did not have elevated fluoride levels. Groundwater fluoride decreased with increasing distance from the lake, indicating that fluoride may have been translocated hydro-geologically into the region from geochemical sources outside the area. Lake Victoria was the main water source for the residents of the Bondo-Rarieda area. The majority of inland residents, however, used water from dams, open pans, boreholes, shallow wells, ponds, and streams, which was generally saline and fluoridated. It was estimated that 36% of children living in this area who consume water from ground sources could be at risk of dental fluorosis.

  19. Increasing Interoperability of E-Learning Content in Moodle within a Franco-Arabo Educative Context

    ERIC Educational Resources Information Center

    El Harrassi, Souad; Labour, Michel

    2010-01-01

    This article examines how Moodle, as an open-source Learning Management System, can be made more interoperable. The authors tested two software standards, LAMS and RELOAD, compatible with socio-constructivism norms. The analysis showed that pedagogic activities created with the LAMS-IMS Learning Design Level A Format are useable with Moodle but…

  20. An Annotated Bibliography of Articles in the "Journal of Speech and Language Pathology-Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Esch, Barbara E.; Forbes, Heather J.

    2017-01-01

    The open-source "Journal of Speech and Language Pathology-Applied Behavior Analysis" ("JSLP-ABA") was published online from 2006 to 2010. We present an annotated bibliography of 80 articles published in the now-defunct journal with the aim of representing its scholarly content to readers of "The Analysis of Verbal…

  1. Land cover change map comparisons using open source web mapping technologies

    Treesearch

    Erik Lindblom; Ian Housman; Tony Guay; Mark Finco; Kevin Megown

    2015-01-01

    The USDA Forest Service is evaluating the status of current landscape change maps and assessing gaps in their information content. These activities have been occurring under the auspices of the Landscape Change Monitoring System (LCMS) project, which is a joint effort between USFS Research, USFS Remote Sensing Applications Center (RSAC), USGS Earth Resources...

  2. Flux Of Carbon from an Airborne Laboratory (FOCAL): Synergy of airborne and surface measures of carbon emission and isotopologue content from tundra landscape in Alaska

    NASA Astrophysics Data System (ADS)

    Dobosy, R.; Dumas, E.; Sayres, D. S.; Kochendorfer, J.

    2013-12-01

    Arctic tundra, recognized as a potential major source of new atmospheric carbon, is characterized by low topographic relief and small-scale heterogeneity consisting of small lakes and intervening tundra vegetation. This fits the flux-fragment method (FFM) of analyzing data from low-flying aircraft well. The FFM draws on 1) airborne eddy-covariance flux measurements, 2) a classified surface-characteristics map (e.g., open water vs. tundra), 3) a footprint model, and 4) companion surface-based eddy-covariance flux measurements. FOCAL, a collaboration among Harvard University's Anderson Group, NOAA's Atmospheric Turbulence and Diffusion Division (ATDD), and Aurora Flight Sciences, Inc., made coordinated flights in August 2013 with a collaborating surface site. FOCAL gathers not only flux data for CH4 and CO2 but also the corresponding carbon-isotopologue content of these gases. The surface site provides a continuous record of carbon flux from interstitial tundra throughout the campaign. The FFM draws samples from the aircraft data over many instances of tundra and open water. From this we will determine how representative the surface site is of the larger area (100 km linear scale), and how much the open water differs from the tundra as a carbon source.
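
    The core FFM bookkeeping described above, in miniature: each airborne flux fragment is attributed to the surface class dominating its footprint, and fragments are then averaged per class (open water vs. tundra). A stdlib-Python illustration with invented class names and numbers, not the FOCAL analysis code.

    ```python
    def class_means(fragments):
        """fragments: list of (surface_class, flux). Returns a dict mapping
        each surface class to its mean flux over all fragments."""
        sums, counts = {}, {}
        for cls, flux in fragments:
            sums[cls] = sums.get(cls, 0.0) + flux
            counts[cls] = counts.get(cls, 0) + 1
        return {cls: sums[cls] / counts[cls] for cls in sums}
    ```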

  3. Open source marketing: Camel cigarette brand marketing in the "Web 2.0" world.

    PubMed

    Freeman, B; Chapman, S

    2009-06-01

    The international trend towards comprehensive bans on tobacco advertising has seen the tobacco industry become increasingly innovative in its approach to marketing. Further fuelling this innovation is the rapid evolution and accessibility of web-based technology. The internet, as a relatively unregulated marketing environment, provides many opportunities for tobacco companies to pursue their promotional ambitions. In this paper, "open source marketing" is considered as a vehicle that has been appropriated by the tobacco industry, through a case study of efforts to design the packaging for the Camel Signature Blends range of cigarettes. Four sources are used to explore this case study including a marketing literature search, a web-based content search via the Google search engine, interviews with advertising trade informants and an analysis of the Camel brand website. RJ Reynolds (RJR) has proven to be particularly innovative in designing cigarette packaging. RJR engaged with thousands of consumers through their Camel brand website to design four new cigarette flavours and packages. While the Camel Signature Blends packaging designs were subsequently modified for the retail market due to problems arising with their cartoon-like imagery, important lessons arise on how the internet blurs the line between marketing and market research. Open source marketing has the potential to exploit advertising ban loopholes and stretch legal definitions in order to generate positive word of mouth about tobacco products. There are also lessons in the open source marketing movement for more effective tobacco control measures including interactive social marketing campaigns and requiring plain packaging of tobacco products.

  4. Nowcasting influenza outbreaks using open-source media report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ray, Jaideep; Brownstein, John S.

    We construct and verify a statistical method to nowcast influenza activity from a time series of the frequency of reports concerning influenza-related topics. Such reports are published electronically by both public health organizations and newspapers/media sources, and thus can be harvested easily via web crawlers. Since media reports are timely, whereas reports from public health organizations are delayed by at least two weeks, using timely, open-source data to compensate for the lag in "official" reports can be useful. We use morbidity data from networks of sentinel physicians (both the Centers for Disease Control's ILINet and France's Sentinelles network) as the gold standard of influenza-like illness (ILI) activity. The time series of media reports is obtained from HealthMap (http://healthmap.org). We find that the time series of media reports shows some correlation (~0.5) with ILI activity; further, this can be leveraged into an autoregressive moving average model with exogenous inputs (ARMAX model) to nowcast ILI activity. We find that the ARMAX models have more predictive skill than autoregressive (AR) models fitted to ILI data, i.e., it is possible to exploit the information content in the open-source data. We also find that when the open-source data are non-informative, the ARMAX models reproduce the performance of AR models. The statistical models are tested on data from the 2009 swine-flu outbreak as well as the mild 2011-2012 influenza season in the U.S.A.
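
    The nowcasting idea reduces to regressing current ILI activity on its own past value plus the timely media-report count as an exogenous input, e.g. y_t = a*y_{t-1} + b*x_t, fitted by least squares. A pure-stdlib sketch on synthetic series; this is a simplified stand-in for the paper's ARMAX model (no moving-average term), not its actual code or data.

    ```python
    def fit_arx(y, x):
        """Least-squares fit of y[t] = a*y[t-1] + b*x[t] (t >= 1),
        solved via the 2x2 normal equations."""
        s11 = s12 = s22 = r1 = r2 = 0.0
        for t in range(1, len(y)):
            u, v = y[t - 1], x[t]            # regressors: lagged y, exogenous x
            s11 += u * u; s12 += u * v; s22 += v * v
            r1 += u * y[t]; r2 += v * y[t]
        det = s11 * s22 - s12 * s12
        a = (r1 * s22 - r2 * s12) / det
        b = (s11 * r2 - s12 * r1) / det
        return a, b

    def nowcast(y_prev, x_now, a, b):
        """One-step nowcast from last week's ILI and this week's reports."""
        return a * y_prev + b * x_now
    ```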

  5. Tracking Clouds with low cost GNSS chips aided by the Arduino platform

    NASA Astrophysics Data System (ADS)

    Hameed, Saji; Realini, Eugenio; Ishida, Shinya

    2016-04-01

    The Global Navigation Satellite System (GNSS) is a constellation of satellites used to provide geo-positioning services. Beyond this, GNSS is important for a wide range of scientific and civilian applications: GNSS systems are routinely used in civilian applications such as surveying and in scientific applications such as the study of crustal deformation. Another important scientific application is meteorological research, where GNSS is mainly used to determine the total water vapour content of the troposphere, hereafter Precipitable Water Vapor (PWV). However, both GNSS receivers and software carry prohibitively high prices for a variety of reasons. To overcome this somewhat artificial barrier, we are exploring the use of low-cost GNSS receivers along with open-source GNSS software for scientific research, in particular GNSS meteorology. To this end, we have developed a custom Arduino-compatible data-logging board that operates with a specific low-cost single-frequency GNSS receiver chip from NVS Technologies AG. We have also developed an open-source software bundle that includes a new Arduino core for the ATmega324P chip, the main processor used in our custom logger, together with software that enables data collection, logging, and parsing of the GNSS data stream. Additionally, we have comprehensively evaluated the low-power characteristics of the GNSS receiver and logger boards. Currently we are exploring several open-source or freely available (for research use) software packages to map GNSS delays to PWV, including the open-source goGPS (http://www.gogps-project.org/) and gLAB (http://gage.upc.edu/gLAB) and the openly available GAMIT software from the Massachusetts Institute of Technology (MIT). All firmware and software developed as part of this project is available under an open-source license.
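
    The delay-to-PWV mapping this project delegates to goGPS/gLAB/GAMIT uses the standard GNSS-meteorology relation PWV = Pi * ZWD, where the zenith wet delay (ZWD) is scaled by a factor Pi built from the weighted mean atmospheric temperature Tm. The sketch below uses the commonly cited Bevis et al. constants and Tm approximation; it is textbook material, not code from this project, and the exact constant values vary slightly across the literature.

    ```python
    RHO_W = 1000.0    # density of liquid water, kg/m^3
    R_V = 461.5       # specific gas constant of water vapour, J/(kg K)
    K2P = 0.221       # k2' refractivity constant, K/Pa  (= 22.1 K/hPa)
    K3 = 3.739e3      # k3 refractivity constant, K^2/Pa (= 3.739e5 K^2/hPa)

    def mean_temp(ts_kelvin):
        """Bevis approximation of the weighted mean temperature Tm of the
        atmosphere from the surface temperature Ts."""
        return 70.2 + 0.72 * ts_kelvin

    def pwv_from_zwd(zwd_m, ts_kelvin):
        """Convert zenith wet delay (m) to precipitable water vapour (m).
        The 1e-6 accounts for refractivity being expressed in N-units."""
        tm = mean_temp(ts_kelvin)
        pi = 1.0 / (1.0e-6 * RHO_W * R_V * (K3 / tm + K2P))
        return pi * zwd_m
    ```

    The dimensionless factor Pi comes out near 0.15, i.e. roughly 6.5 mm of wet delay per mm of precipitable water.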

  6. Lessons learned in the generation of biomedical research datasets using Semantic Open Data technologies.

    PubMed

    Legaz-García, María del Carmen; Miñarro-Giménez, José Antonio; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2015-01-01

Biomedical research usually requires combining large volumes of data from multiple heterogeneous sources. Such heterogeneity makes difficult not only the generation of research-oriented datasets but also their exploitation. In recent years, the Open Data paradigm has proposed new ways of making data available so that sharing and integration are facilitated. Open Data approaches may produce content readable only by humans or readable by both humans and machines; the latter is the focus of our work. The Semantic Web provides a natural technological space for data integration and exploitation and offers a range of technologies for generating not only Open Datasets but also Linked Datasets, that is, open datasets linked to other open datasets. According to Berners-Lee's classification, each open dataset can be given a rating of between one and five stars according to its format and its links to other datasets. In recent years, we have developed and applied our SWIT tool, which automates the generation of semantic datasets from heterogeneous data sources. SWIT produces four-star datasets; the fifth star is obtained when the dataset is linked from external ones. In this paper, we describe how we have applied the tool in two projects related to health care records and orthology data, as well as the major lessons learned from such efforts.
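
To make the machine-readable output such a tool produces concrete, here is a minimal sketch, independent of SWIT itself, of serializing one record from a heterogeneous source as RDF N-Triples; the subject and vocabulary URIs are hypothetical:

```python
# Minimal sketch of generating RDF N-Triples from one row of tabular
# source data. This is NOT the SWIT tool; all URIs are invented for
# illustration.

def row_to_ntriples(subject_uri, row, predicate_base):
    """Serialize a dict of column -> value as N-Triples lines."""
    lines = []
    for column, value in sorted(row.items()):
        lines.append(f'<{subject_uri}> <{predicate_base}{column}> "{value}" .')
    return lines

record = {"diagnosis": "E11", "age": 63}
for line in row_to_ntriples("http://example.org/patient/1",
                            record,
                            "http://example.org/vocab#"):
    print(line)
```

Linking such subjects to URIs in external open datasets (for example, a shared terminology) is what would earn the fifth star in the Berners-Lee scheme.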

  7. Three edible wild mushrooms from Nigeria: their proximate and mineral composition.

    PubMed

    Alofe, F V; Odeyemi, O; Oke, O L

    1996-01-01

The pilei (caps) and the stipes (stalks) of the button and early open-cap (cup) stages of Lentinus subnudus, Psathyrella atroumbonata and Termitomyces striatus were assayed separately for their proximate and mineral composition. The differences observed in the contents of some of the proximate components seem to be related to species and mushroom parts. P. atroumbonata was richest in crude and true protein, while L. subnudus was richest in crude fiber, ash and carbohydrates. Mineral contents appeared to depend on the type and parts of the mushrooms analysed. The three mushrooms were good sources of magnesium, zinc and iron: L. subnudus contained between 14.83 and 20.00 ppm of iron, P. atroumbonata between 20.01 and 22.09 ppm, and T. striatus between 17.13 and 22.93 ppm. The pilei of P. atroumbonata and T. striatus were very good sources of zinc, with contents of 63.81 and 64.94 ppm for P. atroumbonata and 90.45 and 92.49 ppm for T. striatus at the button and open-cap stages, respectively.

  8. Implementation of Web-Based Education in Egypt through Cloud Computing Technologies and Its Effect on Higher Education

    ERIC Educational Resources Information Center

    El-Seoud, M. Samir Abou; El-Sofany, Hosam F.; Taj-Eddin, Islam A. T. F.; Nosseir, Ann; El-Khouly, Mahmoud M.

    2013-01-01

    The information technology educational programs at most universities in Egypt face many obstacles that can be overcome using technology enhanced learning. An open source Moodle eLearning platform has been implemented at many public and private universities in Egypt, as an aid to deliver e-content and to provide the institution with various…

  9. The Use of Distance Learning in the Educational Process Content and Structure of Educational Platforms--Analysis of the Platform Educans

    ERIC Educational Resources Information Center

    Igado, Manuel Fandos

    2010-01-01

This work offers considerations that help address the scarcity of research in the field of e-learning as it relates to secondary education. Distance training programmes (both open source and proprietary) are becoming increasingly popular, especially in higher education. However, there are very few cases of…

  10. Google earth as a source of ancillary material in a history of psychology class.

    PubMed

    Stevison, Blake K; Biggs, Patrick T; Abramson, Charles I

    2010-06-01

    This article discusses the use of Google Earth to visit significant geographical locations associated with events in the history of psychology. The process of opening files, viewing content, adding placemarks, and saving customized virtual tours on Google Earth are explained. Suggestions for incorporating Google Earth into a history of psychology course are also described.

  11. Texas' Influence over Textbook Content Could Shift with Changes in the Market

    ERIC Educational Resources Information Center

    Robelen, Erik W.

    2010-01-01

    As the Texas board of education prepares to adopt controversial new standards for social studies in May, many observers and news outlets have emphasized that the action may have ripple effects that reach classrooms far beyond the Lone Star State. Yet the extent of Texas' reach is a matter of debate, and recent legislation opens up new sources of…

  12. Transitioning from Marketing-Oriented Design to User-Oriented Design: A Case Study

    ERIC Educational Resources Information Center

    Laster, Shari; Stitz, Tammy; Bove, Frank J.; Wise, Casey

    2011-01-01

    The transition to a new architecture and design for an academic library Web site does not always proceed smoothly. In this case study, a library at a large research university hired an outside Web development contractor to create a new architecture and design for the university's Web site using dotCMS, an open-source content management system. The…

  13. An open ecosystem engagement strategy through the lens of global food safety

    PubMed Central

    Stacey, Paul; Fons, Garin; Bernardo, Theresa M

    2015-01-01

The Global Food Safety Partnership (GFSP) is a public/private partnership established through the World Bank to improve food safety systems through a globally coordinated and locally driven approach. This concept paper aims to establish a framework to help GFSP fully leverage the potential of open models. In preparing this paper the authors spoke to many different GFSP stakeholders, who asked questions about open models such as: What is it? What's in it for me? Why use an open rather than a proprietary model? How will open models generate equivalent or greater sustainable revenue streams compared to the current “traditional” approaches? This last question came up many times, with assertions that traditional service providers need to see an opportunity for equivalent or greater revenue before they will buy in. This paper identifies open value propositions for GFSP stakeholders and proposes a framework for creating and structuring that value. Open Educational Resources (OER) were the primary open practice GFSP partners spoke to us about, as they provide a logical entry point for collaboration. Going forward, funders should consider requiring that educational resources and concomitant data resulting from their sponsorship be open, as a public good. There are, however, many other forms of open practice that bring value to the GFSP. Nine different open strategies and tactics (Appendix A) are described, including: open content (including OER and open courseware), open data, open access (research), open government, open source software, open standards, open policy, open licensing and open hardware. It is recommended that all stakeholders proactively pursue "openness" as an operating principle. This paper presents an overall GFSP Open Ecosystem Engagement Strategy within which specific local case examples can be situated.
Two different case examples, China and Colombia, are presented to show both project-based and crowd-sourced, direct-to-public paths through this ecosystem. PMID:26213614

  14. Metadata management for high content screening in OMERO

    PubMed Central

    Li, Simon; Besson, Sébastien; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Lindner, Dominik; Linkert, Melissa; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Allan, Chris; Burel, Jean-Marie; Moore, Josh; Swedlow, Jason R.

    2016-01-01

    High content screening (HCS) experiments create a classic data management challenge—multiple, large sets of heterogeneous structured and unstructured data that must be integrated and linked to produce a set of “final” results. These different data include images, reagents, protocols, analytic output, and phenotypes, all of which must be stored, linked and made accessible for users, scientists, collaborators and where appropriate the wider community. The OME Consortium has built several open source tools for managing, linking and sharing these different types of data. The OME Data Model is a metadata specification that supports the image data and metadata recorded in HCS experiments. Bio-Formats is a Java library that reads recorded image data and metadata and includes support for several HCS screening systems. OMERO is an enterprise data management application that integrates image data, experimental and analytic metadata and makes them accessible for visualization, mining, sharing and downstream analysis. We discuss how Bio-Formats and OMERO handle these different data types, and how they can be used to integrate, link and share HCS experiments in facilities and public data repositories. OME specifications and software are open source and are available at https://www.openmicroscopy.org. PMID:26476368

  15. Metadata management for high content screening in OMERO.

    PubMed

    Li, Simon; Besson, Sébastien; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Lindner, Dominik; Linkert, Melissa; Moore, William J; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Allan, Chris; Burel, Jean-Marie; Moore, Josh; Swedlow, Jason R

    2016-03-01

    High content screening (HCS) experiments create a classic data management challenge: multiple, large sets of heterogeneous structured and unstructured data that must be integrated and linked to produce a set of "final" results. These different data include images, reagents, protocols, analytic output, and phenotypes, all of which must be stored, linked and made accessible for users, scientists, collaborators and where appropriate the wider community. The OME Consortium has built several open source tools for managing, linking and sharing these different types of data. The OME Data Model is a metadata specification that supports the image data and metadata recorded in HCS experiments. Bio-Formats is a Java library that reads recorded image data and metadata and includes support for several HCS screening systems. OMERO is an enterprise data management application that integrates image data, experimental and analytic metadata and makes them accessible for visualization, mining, sharing and downstream analysis. We discuss how Bio-Formats and OMERO handle these different data types, and how they can be used to integrate, link and share HCS experiments in facilities and public data repositories. OME specifications and software are open source and are available at https://www.openmicroscopy.org. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  16. ION SOURCE

    DOEpatents

    Brobeck, W.M.

    1959-04-14

This patent deals with calutrons and more particularly to an arrangement therein whereby charge bottles in a calutron source unit may be replaced without admitting atmospheric air to the calutron vacuum chamber. As described, an ion unit is disposed within a vacuum tank and has a reservoir open toward a wall of the tank. A spike projects from the source into the reservoir. When a charge bottle is placed in the reservoir, the spike breaks a frangible seal on the bottle. After the contents of the bottle are expended the bottle may be withdrawn and replaced with another charge bottle by a vacuum lock arrangement in conjunction with an arm for manipulating the bottle.

  17. Ion source

    DOEpatents

    Brobeck, W. M.

    1959-04-14

    This patent deals with calutrons and more particularly to an arrangement therein whereby charged bottles in a calutron source unit may be replaced without admitting atmospheric air to the calutron vacuum chamber. As described, an ion unit is disposed within a vacuum tank and has a reservoir open toward a wall of the tank. A spike projects from the source into the reservoir. When a charge bottle is placed in the reservoir, the spike breaks a frangible seal on the bottle. After the contents of the bottle are expended the bottle may be withdrawn and replaced with another charge bottle by a vacuum lock arrangement in conjunction with an arm for manipulating the bottle.

  18. Water-vapor pressure control in a volume

    NASA Technical Reports Server (NTRS)

    Scialdone, J. J.

    1978-01-01

    The variation with time of the partial pressure of water in a volume that has openings to the outside environment and includes vapor sources was evaluated as a function of the purging flow and its vapor content. Experimental tests to estimate the diffusion of ambient humidity through openings and to validate calculated results were included. The purging flows required to produce and maintain a certain humidity in shipping containers, storage rooms, and clean rooms can be estimated with the relationship developed here. These purging flows are necessary to prevent the contamination, degradation, and other effects of water vapor on the systems inside these volumes.

  19. Concierge: Personal Database Software for Managing Digital Research Resources

    PubMed Central

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800

  20. Contra-Rotating Open Rotor Tone Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane

    2014-01-01

    Reliable prediction of contra-rotating open rotor (CROR) noise is an essential element of any strategy for the development of low-noise open rotor propulsion systems that can meet both the community noise regulations and cabin noise limits. Since CROR noise spectra exhibit a preponderance of tones, significant efforts have been directed towards predicting their tone content. To that end, there has been an ongoing effort at NASA to assess various in-house open rotor tone noise prediction tools using a benchmark CROR blade set for which significant aerodynamic and acoustic data have been acquired in wind tunnel tests. In the work presented here, the focus is on the nearfield noise of the benchmark open rotor blade set at the cruise condition. Using an analytical CROR tone noise model with input from high-fidelity aerodynamic simulations, tone noise spectra have been predicted and compared with the experimental data. Comparisons indicate that the theoretical predictions are in good agreement with the data, especially for the dominant tones and for the overall sound pressure level of tones. The results also indicate that, whereas the individual rotor tones are well predicted by the combination of the thickness and loading sources, for the interaction tones it is essential that the quadrupole source is also included in the analysis.

  1. Influence of cold stress on contents of soluble sugars, vitamin C and free amino acids including gamma-aminobutyric acid (GABA) in spinach (Spinacia oleracea).

    PubMed

    Yoon, Young-Eun; Kuppusamy, Saranya; Cho, Kye Man; Kim, Pil Joo; Kwack, Yong-Bum; Lee, Yong Bok

    2017-01-15

The contents of soluble sugars (sucrose, fructose, glucose, maltose and raffinose), vitamin C and free amino acids (34 compounds, essential and non-essential) were quantified in open-field and greenhouse-grown spinach in response to cold stress using liquid chromatography. In general, greenhouse cultivation produced nutritionally high-value spinach in a shorter growing period, with soluble sugars, vitamin C and total amino acids, including essential amino acids, present in larger amounts than in spinach grown in open-field conditions. Further, exposing spinach to low temperature during a shorter growth period resulted in spinach with high sucrose, ascorbate, proline, gamma-aminobutyric acid, valine and leucine content, and these constitute the most important energy/nutrient sources. In conclusion, cultivation of spinach in a greenhouse at a low temperature (4-7°C), with exposure for a shorter period (7-21 days) before harvest, is recommended. This strategy will produce a high-quality product for consumption. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. A one-stage cultivation process for lipid- and carbohydrate-rich biomass of Scenedesmus obtusiusculus based on artificial and natural water sources.

    PubMed

    Schulze, Christian; Reinhardt, Jakob; Wurster, Martina; Ortiz-Tena, José Guillermo; Sieber, Volker; Mundt, Sabine

    2016-10-01

A one-stage cultivation process for the microalga Scenedesmus obtusiusculus, with medium based on natural water sources, was developed to enhance lipid and carbohydrate content. A medium based on artificial sea water, Baltic Sea water and river water, with nutrient concentrations optimized relative to standard BG-11 for nitrate (-75%), phosphate and iron (-90%), was used for cultivation. Although nitrate was exhausted during cultivation, resulting in nitrate limitation, growth of the microalgae was not reduced. The lipid content increased from 6.0% to 19.9%, and an increase in oleic and stearic acid was observed. The unsaponifiable matter of the lipid fraction was reduced from 19.5% to 11.4%. The carbohydrate yield rose from 45% to 50% and the protein content decreased from 32.4% to 15.9%. Using natural water sources with optimized nutrient concentrations could open the opportunity to modulate biomass composition and to reduce cultivation costs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. High fluoride water in Bondo-Rarieda area of Siaya County, Kenya: a hydro-geological implication on public health in the Lake Victoria Basin

    PubMed Central

    2014-01-01

    Background Only a few studies to evaluate groundwater fluoride in Eastern Africa have been undertaken outside the volcanic belt of the Great Eastern Africa Rift Valley. The extent and impact of water fluoride outside these regions therefore remain unclear. The current study evaluated fluoride levels in household water sources in the Bondo-Rarieda Area in the Kenyan part of the Lake Victoria Basin (LVB) and highlighted the risk posed by water fluoride to the resident communities. It was anticipated that the results would contribute to an in-depth understanding of the fluoride problem in the region. Methods A total of 128 water samples were collected from different water sources across the entire study area and analyzed for fluoride content using ion-selective electrodes. Results Lake Victoria was the main water source in the area, but dams and open pans (39.5%), boreholes and shallow wells (23.5%), and streams (18.5%) were the principal water sources beyond walking distance from the lake. The overall mean fluoride content of the water exceeded recommended limits for drinking water. The mean water fluoride was highest in Uyoma (1.39±0.84 ppm), Nyang’oma (1.00±0.59 ppm) and Asembo (0.92±0.46 ppm) and lowest in Maranda Division (0.69±0.42 ppm). Ponds (1.41±0.82 ppm), springs (1.25±0.43 ppm), dams and open pans (0.96±0.79 ppm), and streams (0.95±0.41 ppm) had the highest fluoride levels, but lake and river water did not have elevated fluoride levels. Groundwater fluoride decreased with increasing distance from the lake, indicating that water fluoride may have been translocated hydro-geologically into the region from geochemical sources outside the area. Conclusions Lake Victoria was the main water source for the residents of the Bondo-Rarieda Area. The majority of inland residents, however, used water from dams, open pans, boreholes, shallow wells, ponds and streams, which was generally saline and fluoridated.
It was estimated that 36% of children living in this area who consume water from ground sources in the area could be at risk of dental fluorosis. PMID:24884434

  4. Retrospective checking of compliance with practice guidelines for acute stroke care: a novel experiment using openEHR’s Guideline Definition Language

    PubMed Central

    2014-01-01

    Background Providing scalable clinical decision support (CDS) across institutions that use different electronic health record (EHR) systems has been a challenge for medical informatics researchers. The lack of commonly shared EHR models and terminology bindings has been recognised as a major barrier to sharing CDS content among different organisations. The openEHR Guideline Definition Language (GDL) expresses CDS content based on openEHR archetypes and can support any clinical terminologies or natural languages. Our aim was to explore in an experimental setting the practicability of GDL and its underlying archetype formalism. A further aim was to report on the artefacts produced by this new technological approach in this particular experiment. We modelled and automatically executed compliance checking rules from clinical practice guidelines for acute stroke care. Methods We extracted rules from the European clinical practice guidelines as well as from treatment contraindications for acute stroke care and represented them using GDL. Then we executed the rules retrospectively on 49 mock patient cases to check the cases’ compliance with the guidelines, and manually validated the execution results. We used openEHR archetypes, GDL rules, the openEHR reference information model, reference terminologies and the Data Archetype Definition Language. We utilised the open-sourced GDL Editor for authoring GDL rules, the international archetype repository for reusing archetypes, the open-sourced Ocean Archetype Editor for authoring or modifying archetypes and the CDS Workbench for executing GDL rules on patient data. Results We successfully represented clinical rules about 14 out of 19 contraindications for thrombolysis and other aspects of acute stroke care with 80 GDL rules. These rules are based on 14 reused international archetypes (one of which was modified), 2 newly created archetypes and 51 terminology bindings (to three terminologies). 
Our manual compliance checks for 49 mock patients were a complete match versus the automated compliance results. Conclusions Shareable guideline knowledge for use in automated retrospective checking of guideline compliance may be achievable using GDL. Whether the same GDL rules can be used for at-the-point-of-care CDS remains unknown. PMID:24886468
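
The kind of retrospective compliance check described above can be pictured, outside of GDL syntax, as rules evaluated over patient records. A toy sketch in plain Python follows; the contraindication rule, thresholds and field names are invented for illustration and are not taken from the guideline:

```python
# Toy sketch of retrospective guideline-compliance checking, written in
# plain Python rather than GDL. Rule content and field names are
# hypothetical.

def contraindicated(patient):
    """A mock thrombolysis contraindication rule."""
    return (patient["systolic_bp_mmhg"] > 185
            or patient["on_anticoagulants"])

def flag_violations(patients):
    """Return ids of records where thrombolysis was given despite a
    contraindication."""
    return [p["id"] for p in patients
            if p["received_thrombolysis"] and contraindicated(p)]

mock_cases = [
    {"id": 1, "systolic_bp_mmhg": 190, "on_anticoagulants": False,
     "received_thrombolysis": True},   # violates the mock rule
    {"id": 2, "systolic_bp_mmhg": 150, "on_anticoagulants": False,
     "received_thrombolysis": True},   # compliant
]
print(flag_violations(mock_cases))  # [1]
```

In the study itself the rules are expressed in GDL against openEHR archetypes with terminology bindings, which is what makes them shareable across EHR systems rather than hard-coded as here.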

  5. Big Data Meets Physics Education Research: From MOOCs to University-Led High School Programs

    NASA Astrophysics Data System (ADS)

    Seaton, Daniel

    2017-01-01

    The Massive Open Online Course (MOOC) movement has catalyzed discussions of digital learning on campuses around the world and highlighted the increasingly large, complex datasets related to learning. Physics Education Research can and should play a key role in measuring outcomes of this most recent wave of digital education. In this talk, I will discuss big data and learning analytics through multiple modes of teaching and learning enabled by the open-source edX platform: open-online, flipped, and blended. Open-Online learning will be described through analysis of MOOC offerings from Harvard and MIT, where 2.5 million unique users have led to 9 million enrollments across nearly 300 courses. Flipped instruction will be discussed through an Advanced Placement program at Davidson College that empowers high school teachers to use AP aligned, MOOC content directly in their classrooms with only their students. Analysis of this program will be highlighted, including results from a pilot study showing a positive correlation between content usage and externally validated AP exam scores. Lastly, blended learning will be discussed through specific residential use cases at Davidson College and MIT, highlighting unique course models that blend open-online and residential experiences. My hope for this talk is that listeners will better understand the current wave of digital education and the opportunities it provides for data-driven teaching and learning.

  6. The assessment of online heath videos for surgery in Crohn's Disease.

    PubMed

    Marshall, J H; Baker, D M; Lee, M J; Jones, G L; Lobo, A J; Brown, S R

    2018-02-10

YouTube™ is an open-access, non-peer-reviewed video-hosting site and is used as a source of publicly available healthcare information. This study aimed to assess the thematic content of the most viewed videos relating to surgery and Crohn's Disease and to explore viewer interactions with these videos. A search of YouTube™ was carried out using one search string. The 50 most viewed videos were identified, categorised by source and content themes, and assessed for viewer interactions. Video comments were used to describe the usefulness of the video content to viewers. The majority of videos were uploaded by patients (n=21). The remainder were uploaded by individual health care professionals (n=9), hospital/speciality associations (n=18) and industry (n=2). The median number of likes for patient videos was significantly higher than for hospital/speciality association videos (p<0.001). Patient videos received more comments praising the video content (n=27) and more comments asking for further information (n=14). The median numbers of likes for 'experience of surgery' (p<0.001) and 'experience of disease' (p=0.0015) themed videos were significantly higher than for 'disease management' themed videos. Crohn's disease patients use YouTube™ as a surgical information source. The content of patient-sourced videos focused on surgical and disease experience, suggesting these themes are important to patients. Current patient-developed videos provide limited information, as reflected by viewers requesting further information. Storytelling patient-centred videos combined with clinical evidence may be a good model for future videos. This article is protected by copyright. All rights reserved.

  7. Free open access medical education can help rural clinicians deliver 'quality care, out there'.

    PubMed

    Leeuwenburg, Tim J; Parker, Casey

    2015-01-01

    Rural clinicians require expertise across a broad range of specialties, presenting difficulty in maintaining currency of knowledge and application of best practice. Free open access medical education is a new paradigm in continuing professional education. Use of the internet and social media allows a globally accessible crowd-sourced adjunct, providing inline (contextual) and offline (asynchronous) content to augment traditional educational principles and the availability of relevant resources for life-long learning. This markedly reduces knowledge translation (the delay from inception of a new idea to bedside implementation) and allows rural clinicians to further expertise by engaging in discussion of cutting edge concepts with peers worldwide.

  8. CognitionMaster: an object-based image analysis framework

    PubMed Central

    2013-01-01

    Background Automated image analysis methods are becoming more and more important to extract and quantify image features in microscopy-based biomedical studies and several commercial or open-source tools are available. However, most of the approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and if user-interactivity on the object-level is desired. Results In this paper we present an open-source software that facilitates the analysis of content features and object relationships by using objects as basic processing unit instead of individual pixels. Our approach enables also users without programming knowledge to compose "analysis pipelines" that exploit the object-level approach. We demonstrate the design and use of example pipelines for the immunohistochemistry-based cell proliferation quantification in breast cancer and two-photon fluorescence microscopy data about bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open source software system that offers object-based image analysis. The object-based concept allows for a straightforward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542
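
The object-based concept the abstract describes, using objects rather than individual pixels as the processing unit, can be sketched as a connected-component labelling pass followed by per-object feature extraction. This sketch is illustrative only and is unrelated to CognitionMaster's actual implementation:

```python
# Sketch: group foreground pixels into objects (4-connected components),
# then compute a per-object feature (area). Illustrative only.

def label_objects(grid):
    """Label a binary 2D list; return {label: [(row, col), ...]}."""
    rows, cols = len(grid), len(grid[0])
    seen, labels, next_label = set(), {}, 1
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and (r, c) not in seen:
                # Flood-fill this component with an explicit stack.
                stack, pixels = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                labels[next_label] = pixels
                next_label += 1
    return labels

def object_areas(grid):
    """Pixel count per object: a feature of the object, not of a pixel."""
    return {label: len(px) for label, px in label_objects(grid).items()}

grid = [[1, 1, 0],
        [0, 0, 0],
        [0, 1, 1]]
print(object_areas(grid))  # {1: 2, 2: 2}
```

Once pixels are grouped into objects, higher-level features (shape, neighbour relationships, counts per region) and interactive object-level corrections become natural, which is the advantage the abstract highlights over purely pixel-wise pipelines.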

  9. Detecting a signal in the noise: monitoring the global spread of novel psychoactive substances using media and other open-source information†

    PubMed Central

    Young, Matthew M; Dubeau, Chad; Corazza, Ornella

    2015-01-01

    Objective To determine the feasibility and utility of using media reports and other open-source information collected by the Global Public Health Intelligence Network (GPHIN), an event-based surveillance system operated by the Public Health Agency of Canada, to rapidly detect clusters of adverse drug events associated with ‘novel psychoactive substances’ (NPS) at the international level. Methods and Results Researchers searched English-language media reports collected by the GPHIN between 1997 and 2013 for references to synthetic cannabinoids. They screened the resulting reports for relevance and content (i.e., reports of morbidity and arrest), then plotted the results and compared them with other available indicators (e.g., US poison control center exposures). The pattern of results from the analysis of GPHIN reports resembled the pattern seen in the other indicators. Conclusions The results of this study indicate that using media and other open-source information can help monitor the presence, usage, local policy, law enforcement responses, and spread of NPS in a rapid, effective way. Further, modifying GPHIN to actively track NPS would be relatively inexpensive to implement and would be highly complementary to current national and international monitoring efforts. © 2015 The Authors. Human Psychopharmacology: Clinical and Experimental published by John Wiley & Sons, Ltd. PMID:26216568

  10. Baobab Laboratory Information Management System: Development of an Open-Source Laboratory Information Management System for Biobanking

    PubMed Central

    Bendou, Hocine; Sizani, Lunga; Reid, Tim; Swanepoel, Carmen; Ademuyiwa, Toluwaleke; Merino-Martinez, Roxana; Meuller, Heimo; Abayomi, Akin

    2017-01-01

    A laboratory information management system (LIMS) is central to the informatics infrastructure that underlies biobanking activities. To date, a wide range of commercial and open-source LIMSs are available and the decision to opt for one LIMS over another is often influenced by the needs of the biobank clients and researchers, as well as available financial resources. The Baobab LIMS was developed by customizing the Bika LIMS software (www.bikalims.org) to meet the requirements of biobanking best practices. The need to implement biobank standard operation procedures as well as stimulate the use of standards for biobank data representation motivated the implementation of Baobab LIMS, an open-source LIMS for Biobanking. Baobab LIMS comprises modules for biospecimen kit assembly, shipping of biospecimen kits, storage management, analysis requests, reporting, and invoicing. The Baobab LIMS is based on the Plone web-content management framework. All the system requirements for Plone are applicable to Baobab LIMS, including the need for a server with at least 8 GB RAM and 120 GB hard disk space. Baobab LIMS is a server–client-based system, whereby the end user is able to access the system securely through the internet on a standard web browser, thereby eliminating the need for standalone installations on all machines. PMID:28375759

  11. Baobab Laboratory Information Management System: Development of an Open-Source Laboratory Information Management System for Biobanking.

    PubMed

    Bendou, Hocine; Sizani, Lunga; Reid, Tim; Swanepoel, Carmen; Ademuyiwa, Toluwaleke; Merino-Martinez, Roxana; Meuller, Heimo; Abayomi, Akin; Christoffels, Alan

    2017-04-01

    A laboratory information management system (LIMS) is central to the informatics infrastructure that underlies biobanking activities. To date, a wide range of commercial and open-source LIMSs are available and the decision to opt for one LIMS over another is often influenced by the needs of the biobank clients and researchers, as well as available financial resources. The Baobab LIMS was developed by customizing the Bika LIMS software (www.bikalims.org) to meet the requirements of biobanking best practices. The need to implement biobank standard operation procedures as well as stimulate the use of standards for biobank data representation motivated the implementation of Baobab LIMS, an open-source LIMS for Biobanking. Baobab LIMS comprises modules for biospecimen kit assembly, shipping of biospecimen kits, storage management, analysis requests, reporting, and invoicing. The Baobab LIMS is based on the Plone web-content management framework. All the system requirements for Plone are applicable to Baobab LIMS, including the need for a server with at least 8 GB RAM and 120 GB hard disk space. Baobab LIMS is a server-client-based system, whereby the end user is able to access the system securely through the internet on a standard web browser, thereby eliminating the need for standalone installations on all machines.

  12. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    PubMed

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
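Jenkins-CI itself is configured through jobs and plugins rather than through code like the following; the underlying idea the abstract describes, composing individual processing steps into a reusable, shareable pipeline, can be sketched in plain Python. The step names are hypothetical, not CellProfiler's API.

```python
from functools import reduce

def make_pipeline(*steps):
    """Compose processing steps into one callable, applied left to right."""
    def run(data):
        return reduce(lambda d, step: step(d), steps, data)
    return run

# Hypothetical HCS steps: each takes and returns a dict of image metadata.
def normalize(d):
    d["normalized"] = True
    return d

def segment(d):
    d["objects"] = d.get("cells", 0)
    return d

def report(d):
    d["summary"] = f"{d['objects']} objects, normalized={d['normalized']}"
    return d

pipeline = make_pipeline(normalize, segment, report)
result = pipeline({"cells": 42})
```

In Jenkins-CI the equivalent composition happens at the job level: each stage is a build step, and the platform adds scheduling, logging, and cluster dispatch around the same chain-of-steps idea.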

  13. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    PubMed Central

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  14. Remote Monitoring of Soil Water Content, Temperature, and Heat Flow Using Low-Cost Cellular (3G) IoT Technology

    NASA Astrophysics Data System (ADS)

    Ham, J. M.

    2016-12-01

    New microprocessor boards, open-source sensors, and cloud infrastructure developed for the Internet of Things (IoT) can be used to create low-cost monitoring systems for environmental research. This project describes two applications in soil science and hydrology: 1) remote monitoring of the soil temperature regime near oil and gas operations to detect the thermal signature associated with the natural source zone degradation of hydrocarbon contaminants in the vadose zone, and 2) remote monitoring of soil water content near the surface as part of a global citizen science network. In both cases, prototype data collection systems were built around the cellular (2G/3G) "Electron" microcontroller (www.particle.io). This device allows connectivity to the cloud using a low-cost global SIM and data plan. The systems have cellular connectivity in over 100 countries and data can be logged to the cloud for storage. Users can view data in real time over any internet connection or via their smart phone. For both projects, data logging, storage, and visualization were done using IoT services like Thingspeak (thingspeak.com). The soil thermal monitoring system was tested on experimental plots in Colorado USA to evaluate the accuracy and reliability of different temperature sensors and 3D printed housings. The soil water experiment included a comparison of open-source capacitance-based sensors to commercial versions. Results demonstrate the power of leveraging IoT technology for field research.
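Logging a reading to ThingSpeak, as described above, amounts to a single HTTP request against its documented channel-update endpoint (`api.thingspeak.com/update`, with an `api_key` and `field1`..`field8` parameters). A minimal sketch of building such a request URL, with a hypothetical write key and field assignments:

```python
from urllib.parse import urlencode

THINGSPEAK_UPDATE = "https://api.thingspeak.com/update"

def update_url(api_key, **fields):
    """Build a ThingSpeak channel-update URL.

    Keyword arguments should be field1..field8, matching the channel's
    configured fields.
    """
    params = {"api_key": api_key}
    params.update(fields)
    return THINGSPEAK_UPDATE + "?" + urlencode(params)

# Hypothetical mapping: field1 = soil temperature (degC), field2 = VWC (m3/m3).
url = update_url("DEMOKEY", field1=23.5, field2=0.31)
```

On the microcontroller side the same URL would be issued as an HTTP GET over the cellular link; here only the URL construction is shown so the sketch runs without network access.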

  15. Educators Assess "Open Content" Movement

    ERIC Educational Resources Information Center

    Trotter, Andrew

    2009-01-01

    This article discusses the open-content movement in education. A small but growing movement of K-12 educators is latching on to educational resources that are "open," or free for others to use, change, and republish on web sites that promote sharing. The open-content movement is fueled partly by digital creation tools that make it easy…

  16. The Virtual Collaboration Environment: New Media for Crisis Response

    DTIC Science & Technology

    2011-05-01

    openvce.net/forum-alternative-platforms and http://openvce.net/more), the open-source Drupal®-based Proceedings of the 8th International ISCRAM... Drupal is a widely used modular content management system, with an active development community of its own. It provides a user management system and...authoring text documents (a facility felt to be lacking at the time in Drupal). This wiki feature has itself been supplemented with experimental

  17. Apparatus for unloading pressurized fluid

    DOEpatents

    Rehberger, Kevin M.

    1994-01-01

    An apparatus for unloading fluid, preferably pressurized gas, from containers in a controlled manner that protects the immediate area from exposure to the container contents. The device consists of an unloading housing, which is enclosed within at least one protective structure, for receiving the dispensed contents of the steel container, and a laser light source, located external to the protective structure, for opening the steel container instantaneously. The neck or stem of the fluid container is placed within the sealed interior environment of the unloading housing. The laser light passes through both the protective structure and the unloading housing to instantaneously pierce a small hole within the stem of the container. Both the protective structure and the unloading housing are specially designed to allow laser light passage without compromising the light's energy level. Also, the unloading housing allows controlled flow of the gas once it has been dispensed from the container. The external light source permits remote operation of the unloading device.

  18. Project on Elite Athlete Commitment (PEAK): IV. identification of new candidate commitment sources in the sport commitment model.

    PubMed

    Scanlan, Tara K; Russell, David G; Scanlan, Larry A; Klunchoo, Tatiana J; Chow, Graig M

    2013-10-01

    Following a thorough review of the current updated Sport Commitment Model, new candidate commitment sources for possible future inclusion in the model are presented. They were derived from data obtained using the Scanlan Collaborative Interview Method. Three elite New Zealand teams participated: amateur All Black rugby players, amateur Silver Fern netball players, and professional All Black rugby players. An inductive content analysis of these players' open-ended descriptions of their sources of commitment identified four unique new candidate commitment sources: Desire to Excel, Team Tradition, Elite Team Membership, and Worthy of Team Membership. A detailed definition of each candidate source is included along with example quotes from participants. Using a mixed-methods approach, these candidate sources provide a basis for future investigations to test their viability and generalizability for possible expansion of the Sport Commitment Model.

  19. Distinct Crater and Conduit Infrasound Reveal an Open Vent Volcano Running Out of Gas

    NASA Astrophysics Data System (ADS)

    Lyons, J. J.; Fee, D.; Haney, M. M.; Diefenbach, A. K.; Carn, S. A.

    2017-12-01

    Open-vent degassing dominated activity at Mount Pagan, Mariana Islands, from at least 2013, when ground-based sensors were installed, until mid-2015, when degassing fell below detection limits. Gas sampling indicated shallow magma was the source, and an analysis of LP seismicity showed that repeated pressurization and venting of a shallow crack controlled degassing. Open-vent degassing also produced abundant infrasound, recorded on two 6-element arrays. Two main infrasound features are the focus of this study: 1) a 0.3 Hz iVLP and 2) a 1.7 Hz iLP. Tens of thousands of iVLPs and iLPs were recorded over the 22-month study period, and correlation and cluster analyses show little change in both waveform and frequency content, suggesting a non-destructive, repeating source. An interesting upper conduit-crater geometry was discovered in helicopter overflights of the summit crater, and to test the effects of the crater and conduit shape and size on the infrasound signals, a high-resolution (<1 meter) DEM of the crater was produced by structure-from-motion using video captured during helicopter orbits. We perform full-waveform inversion of the infrasound data using the 3D topography, and show that a synthetic monopole source induces distinct resonance in the crater and upper conduit that mostly reproduces the iVLP and iLP signals, respectively. Further investigation of the infrasound catalogue shows that while the frequency content and waveforms remained stable through time, the amplitude of the iVLP events began decreasing months prior to the cessation of degassing. Initially, the iLP amplitudes remained unaffected while the iVLP amplitudes dropped, but in the final months before degassing ended iLP amplitudes also began decreasing.
We interpret this pattern as a progressive decline in gas overpressure, initially resulting in a decreased ability to trigger resonance in the large crater volume, but eventually affecting the ability of the monopole source to induce resonance in the smaller upper conduit volume. We compare the infrasound amplitudes to passive SO2 degassing measured by the OMI sensor on NASA's Aura satellite during the study period and find a remarkable similarity in the datasets, confirming that the subtle waning of infrasound amplitudes was a harbinger of an open-vent volcano running out of gas.

  20. Awareness, Attitudes and Participation of Teaching Staff towards the Open Content Movement in One University

    ERIC Educational Resources Information Center

    Reed, Peter

    2012-01-01

    This research investigates the current awareness of, and participation in, the open content movement at one UK institution for higher education. The open content movement and the open educational resources can be seen as potential methods for reducing time and cost of technology-enhanced learning developments; however, its sustainability and, to…

  1. Stochastic production phase design for an open pit mining complex with multiple processing streams

    NASA Astrophysics Data System (ADS)

    Asad, Mohammad Waqar Ali; Dimitrakopoulos, Roussos; van Eldert, Jeroen

    2014-08-01

    In a mining complex, the mine is a source of supply of valuable material (ore) to a number of processes that convert the raw ore to a saleable product or a metal concentrate for production of the refined metal. In this context, expected variation in metal content throughout the extent of the orebody defines the inherent uncertainty in the supply of ore, which impacts the subsequent ore and metal production targets. Traditional optimization methods for designing production phases and ultimate pit limit of an open pit mine not only ignore the uncertainty in metal content, but, in addition, commonly assume that the mine delivers ore to a single processing facility. A stochastic network flow approach is proposed that jointly integrates uncertainty in supply of ore and multiple ore destinations into the development of production phase design and ultimate pit limit. An application at a copper mine demonstrates the intricacies of the new approach. The case study shows a 14% higher discounted cash flow when compared to the traditional approach.

  2. OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale

    NASA Astrophysics Data System (ADS)

    Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason

    2015-03-01

    The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based tool for converting proprietary file formats, and OMERO, an enterprise data management platform, under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy that routinely create large datasets that must be managed and analyzed.

  3. Stirling cycle engine

    DOEpatents

    Lundholm, Gunnar

    1983-01-01

    In a Stirling cycle engine having a plurality of working gas charges separated by pistons reciprocating in cylinders, the total gas content is minimized and the mean pressure equalization among the serial cylinders is improved by using two piston rings axially spaced at least as much as the piston stroke and by providing a duct in the cylinder wall opening in the space between the two piston rings and leading to a source of minimum or maximum working gas pressure.

  4. Study of Adversarial and Defensive Components in an Experimental Machinery Control Systems Laboratory Environment

    DTIC Science & Technology

    2014-09-01

    prevention system (IPS), capable of performing real-time traffic analysis and packet logging on IP networks [25]. Snort’s features include protocol... analysis and content searching/matching. Snort can detect a variety of attacks and network probes, such as buffer overflows, port scans and OS...www.digitalbond.com/tools/the- rack/jtr-s7-password-cracking/ Kismet Mike Kershaw Cross-platform Open source wireless network detector and wireless sniffer

  5. 40 CFR Table 3 to Subpart Wwww of... - Organic HAP Emissions Limits for Existing Open Molding Sources, New Open Molding Sources Emitting...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and... CATEGORIES National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and...

  6. Sources of fatty acids in Lake Michigan surface microlayers and subsurface waters

    NASA Astrophysics Data System (ADS)

    Meyers, Philip A.; Owen, Robert M.

    1980-11-01

    Fatty acid and organic carbon contents have been measured in the particulate and dissolved phases of surface microlayer and subsurface water samples collected from Lake Michigan. Concentrations are highest close to fluvial sources and lowest in offshore areas, yet surface/subsurface fractionation is lowest near river mouths and highest in open lake locations. These gradients plus accompanying fatty acid compositional changes indicate that river-borne organic materials are important constituents of coastal Lake Michigan microlayers and that sinking and turbulent resuspension of particulates affect surface film characteristics. Lake neuston and plankton contribute organic components which partially replace potamic materials removed by sinking.

  7. Future-saving audiovisual content for Data Science: Preservation of geoinformatics video heritage with the TIB|AV-Portal

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Plank, Margret; Ziedorn, Frauke

    2015-04-01

    In data driven research, access to, citation of, and preservation of the full triad consisting of journal article, research data, and software has started to become good scientific practice. To foster the adoption of this practice, the significance of software tools that enable scientists to harness auxiliary audiovisual content in their research work has to be acknowledged. The advent of ubiquitous computer-based audiovisual recording and corresponding Web 2.0 hosting platforms like Youtube, Slideshare and GitHub has created new ecosystems for contextual information related to scientific software and data, which continue to grow both in size and variety of content. The current Web 2.0 platforms lack capabilities for long term archiving and scientific citation, such as persistent identifiers that allow referencing specific intervals of the overall content. The audiovisual content currently shared by scientists ranges from commented how-to demonstrations of software handling, installation and data-processing, to aggregated visual analytics of the evolution of software projects over time. Such content is a crucial addition to the scientific message, as it ensures that software-based data-processing workflows can be assessed, understood and reused in the future. In the context of data driven research, such content needs to be accessible through effective search capabilities, enabling the content to be retrieved and ensuring that the content producers receive credit for their efforts within the scientific community. Improved multimedia archiving and retrieval services for scientific audiovisual content which meet these requirements are currently being implemented by the scientific library community.
This paper exemplifies the existing challenges, requirements, benefits and the potential of the preservation, accessibility and citability of such audiovisual content for the Open Source communities, based on the new audiovisual web service TIB|AV Portal of the German National Library of Science and Technology. The web-based portal allows for extended search capabilities based on enhanced metadata derived by automated video analysis. By combining state-of-the-art multimedia retrieval techniques such as speech-, text-, and image recognition with semantic analysis, content-based access to videos at the segment level is provided. Further, by using the open standard Media Fragment Identifier (MFID), a citable Digital Object Identifier is displayed for each video segment. In addition to the continuously growing footprint of contemporary content, the importance of vintage audiovisual information needs to be considered: This paper showcases the successful application of the TIB|AV-Portal in the preservation and provision of a newly discovered version of a GRASS GIS promotional video produced by the US Army Corps of Engineers Construction Engineering Research Laboratory (US-CERL) in 1987. The video provides insight into the constraints of the very early days of the GRASS GIS project, the oldest Free and Open Source Software (FOSS) GIS project, which has been active for over thirty years. GRASS itself has turned into a collaborative scientific platform and a repository of scientific peer-reviewed code and algorithm/knowledge hub for future generations of scientists [1]. This is a reference case for future preservation activities regarding semantic-enhanced Web 2.0 content from geospatial software projects within Academia and beyond. References: [1] Chemin, Y., Petras V., Petrasova, A., Landa, M., Gebbert, S., Zambelli, P., Neteler, M., Löwe, P.: GRASS GIS: a peer-reviewed scientific platform and future research Repository, Geophysical Research Abstracts, Vol. 17, EGU2015-8314-1, 2015 (submitted)
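The segment-level citation mechanism described above follows the W3C Media Fragments recommendation, where a time interval is addressed by appending a `#t=start,end` fragment to the media URL. A minimal sketch (the portal URL shown is hypothetical):

```python
def temporal_fragment(url, start, end=None):
    """Append a W3C Media Fragments temporal fragment (#t=start[,end]) to a media URL.

    start/end are offsets in seconds (normal play time).
    """
    frag = f"#t={start}" if end is None else f"#t={start},{end}"
    return url + frag

# Hypothetical video landing page; cite seconds 90-120 of the recording.
cite_url = temporal_fragment("https://example.org/media/12345", 90, 120)
```

A persistent identifier resolving to such a fragment URL is what makes a specific interval of a video citable rather than only the video as a whole.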

  8. WPBMB Entrez: An interface to NCBI Entrez for Wordpress.

    PubMed

    Gohara, David W

    2018-03-01

    Research-oriented websites are an important means for the timely communication of information. These websites fall under a number of categories including: research laboratories, training grant and program projects, and online service portals. Invariably there is content on a site, such as publication listings, that requires frequent updating. A number of content management systems exist to aid in the task of developing and managing a website, each with its strengths and weaknesses. One popular choice is Wordpress, a free, open source and actively developed application for the creation of web content. During a recent site redesign for our department, the need arose to ensure publications were up to date for each of the research labs and the department as a whole. Several plugins for Wordpress offer this type of functionality, but in many cases the plugins are either no longer maintained, are missing features that would require the use of several, possibly incompatible, plugins or lack features for layout on a webpage. WPBMB Entrez was developed to address these needs. WPBMB Entrez utilizes a subset of NCBI Entrez and RCSB databases to maintain up to date records of publications, and publication-related information on Wordpress-based websites. The core functionality uses the same search query syntax as on the NCBI Entrez site, including advanced query syntaxes. The plugin is extensible, allowing for rapid development and addition of new data sources as the need arises. WPBMB Entrez was designed to be easy to use, yet flexible enough to address more complex usage scenarios. Features of the plugin include: an easy-to-use interface, design customization, multiple templates for displaying publication results, a caching mechanism to reduce page load times, support for multiple distinct queries and retrieval modes, and the ability to aggregate multiple queries into unified lists. Additionally, developer documentation is provided to aid in customization of the plugin. 
WPBMB Entrez is available at no cost, is open source and works with all recent versions of Wordpress. Copyright © 2017 Elsevier B.V. All rights reserved.
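The query syntax the plugin reuses is NCBI's Entrez `esearch` interface. A minimal sketch of constructing such a query URL against the documented E-utilities endpoint; `db`, `term`, `retmax`, and `retmode` are standard `esearch` parameters, while the example query string itself is illustrative:

```python
from urllib.parse import urlencode

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def esearch_url(term, db="pubmed", retmax=20):
    """Build an NCBI Entrez esearch URL for the given query term.

    The term accepts the same syntax as the NCBI Entrez web search,
    including fielded queries like Author/PDAT tags.
    """
    return ESEARCH + "?" + urlencode(
        {"db": db, "term": term, "retmax": retmax, "retmode": "json"}
    )

# Illustrative fielded query; only the URL is built here (no network call).
url = esearch_url("Gohara DW[Author] AND 2018[PDAT]")
```

Fetching that URL returns a JSON list of PubMed IDs, which a plugin like this would then pass to `esummary`/`efetch` to render the publication list.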

  9. Open Content in Open Context

    ERIC Educational Resources Information Center

    Kansa, Sarah Whitcher; Kansa, Eric C.

    2007-01-01

    This article presents the challenges and rewards of sharing research content through a discussion of Open Context, a new open access data publication system for field sciences and museum collections. Open Context is the first data repository of its kind, allowing self-publication of research data, community commentary through tagging, and clear…

  10. Developing Online Communities with LAMP (Linux, Apache, MySQL, PHP) - the IMIA OSNI and CHIRAD Experiences.

    PubMed

    Murray, Peter J; Oyri, Karl

    2005-01-01

    Many health informatics organisations do not seem to make practical use, for the benefit of their activities and interaction with their members, of the very technologies that they often promote for use within healthcare environments. In particular, many organisations seem to be slow to take up the benefits of interactive web technologies. This paper presents an introduction to some of the many free/libre and open source (FLOSS) applications currently available that use the LAMP (Linux, Apache, MySQL, PHP) architecture as a way of cheaply deploying reliable, scalable, and secure web applications. The experience of moving to applications using the LAMP architecture, in particular that of the Open Source Nursing Informatics (OSNI) Working Group of the Special Interest Group in Nursing Informatics of the International Medical Informatics Association (IMIA-NI), in using PostNuke, a FLOSS Content Management System (CMS), illustrates many of the benefits of such applications. The experiences of the authors in installing and maintaining a large number of websites using FLOSS CMSs to develop dynamic, interactive websites that facilitate real engagement with the members of IMIA-NI OSNI, the IMIA Open Source Working Group, and the Centre for Health Informatics Research and Development (CHIRAD), as well as other organisations, is used as the basis for discussing the potential benefits that could be realised by others within the health informatics community.

  11. Open Data and Open Science for better Research in the Geo and Space Domain

    NASA Astrophysics Data System (ADS)

    Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.

    2015-12-01

    Main open data principles had been worked out in the run-up to, and finally adopted in, the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland in June 2013. Important principles are also valid for science data, such as Open Data by Default, Quality and Quantity, Useable by All, Releasing Data for Improved Governance, Releasing Data for Innovation. There is also an explicit relationship to such high-value areas as earth observation, education and geospatial data. The European Union implementation plan of the Open Data Charter identifies among other things objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and, where appropriate, reconciliation across different data sources, implementing software solutions allowing easy management, publication or visualization of datasets, and simplifying clearance of intellectual property rights. Open Science is not just a list of long-established principles but stands for a range of initiatives and projects around better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and collection of data, availability and reuse of scientific data, public accessibility to scientific communication and the use of social media to facilitate scientific collaboration. Some projects concentrate on the open sharing of free and open source software, and even of hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and about an open peer review process are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects will be presented here. 
The semantic Web based approach for the mashup focuses on the design and implementation of a common but still distributed data catalog based on semantic interoperability, including transparent access to data in relational databases. References: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/207772/Open_Data_Charter.pdf http://www.openscience.org/blog/wp-content/uploads/2013/06/OpenSciencePoster.pdf

  12. Content management systems and E-commerce: a comparative case study

    NASA Astrophysics Data System (ADS)

    Al Rasheed, Amal A.; El-Masri, Samir D.

    2011-12-01

    The need for content management systems (CMSs) to create and edit e-commerce websites has increased with the growing importance of e-commerce. In this paper, the features essential for e-commerce CMSs are explored. The aim of the paper was to find the CMS solution for e-commerce that best combines content management and store management. Accordingly, we conducted a study of three popular open source CMSs for e-commerce: VirtueMart from Joomla!, Ubercart from Drupal, and Magento. We took into account features such as hosting and installation, performance, support/community, content management, add-on modules and functional features. We concluded with improvements that could be made in order to alleviate the problems identified.

  13. The Role of Semantics in Open-World, Integrative, Collaborative Science Data Platforms

    NASA Astrophysics Data System (ADS)

    Fox, Peter; Chen, Yanning; Wang, Han; West, Patrick; Erickson, John; Ma, Marshall

    2014-05-01

    As collaborative science spreads into more and more Earth and space science fields, both participants and funders are expressing stronger needs for highly functional data and information capabilities. Characteristics include a) easy to use, b) highly integrated, c) leverage investments, d) accommodate rapid technical change, and e) do not incur undue expense or time to build or maintain - these are not a small set of requirements. Based on our accumulated experience over the last ~ decade and several key technical approaches, we adapt, extend, and integrate several open source applications and frameworks to handle major portions of functionality for these platforms. This includes: an object-type repository, collaboration tools, identity management, all within a portal managing diverse content and applications. In this contribution, we present our methods and results of information models, adaptation, integration and evolution of a networked data science architecture based on several open source technologies (Drupal, VIVO, the Comprehensive Knowledge Archive Network; CKAN, and the Global Handle System; GHS). In particular we present the Deep Carbon Observatory - a platform for international science collaboration. We present and discuss key functional and non-functional attributes, and discuss the general applicability of the platform.

  14. Heliospotlight: An Information Resource for Heliophysics

    NASA Astrophysics Data System (ADS)

    Young, C.; Wawro, M.; Schenk, L. C.

    2013-12-01

    The NASA Goddard Heliophysics Science Division (HSD) EPO and mission websites are rich with content covering the broad subject of heliophysics. This includes detailed information for many age groups, a large range of descriptive imagery, dynamic video, and interactive material. The weakness of all this content is that it is scattered across many websites rather than organized and focused in one user-friendly location. The website heliospotlight.org is being developed to address these concerns, leveraging the vast content already developed while using state-of-the-art web technologies. This will provide a rich user experience while tailoring content to the needs of a broad audience of students, educators, scientists, journalists and the general public. The website will use well-supported, open source technologies enabling future flexibility and expansion. HSD EPO will support the development of this information resource.

  15. The Emergence of Open-Source Software in China

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    The open-source software movement is gaining increasing momentum in China. Of the limited numbers of open-source software in China, "Red Flag Linux" stands out most strikingly, commanding 30 percent share of Chinese software market. Unlike the spontaneity of open-source movement in North America, open-source software development in…

  16. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  17. A spatial information crawler for OpenGIS WFS

    NASA Astrophysics Data System (ADS)

    Jiang, Jun; Yang, Chong-jun; Ren, Ying-chao

    2008-10-01

    The growth of the internet makes it non-trivial to search for accurate information efficiently. A topical crawler, which targets a certain area, is attracting more and more attention because it can help people find what they need. Furthermore, with the OpenGIS WFS (Web Feature Service) Specification developed by the OGC (Open GIS Consortium), more and more geospatial data providers adopt this protocol to publish their data on the internet. In this case, a crawler that targets WFS servers can help people find geospatial data on those servers. In this paper, we propose a prototype system for a WFS crawler based on the OpenGIS WFS Specification. The crawler architecture, working principles, and the detailed function of each component are introduced. This crawler is capable of discovering WFS servers dynamically, and of saving and updating the service contents of the servers. The data collected by the crawler can be supplied to a geospatial data search engine as its data source.
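    The first step such a crawler performs against a discovered WFS server is a GetCapabilities request, whose XML response advertises the feature types the server offers. The following is an illustrative sketch, not code from the paper: the endpoint and the canned capabilities document are invented, and a live crawler would fetch the XML over HTTP (e.g. `?SERVICE=WFS&REQUEST=GetCapabilities`) instead of parsing a string.

    ```python
    # Illustrative sketch: extract advertised feature types from a WFS
    # GetCapabilities response, as a WFS-targeting crawler would.
    # The sample XML below is invented for demonstration.
    import xml.etree.ElementTree as ET

    CAPABILITIES_XML = """<?xml version="1.0"?>
    <WFS_Capabilities xmlns="http://www.opengis.net/wfs" version="1.1.0">
      <FeatureTypeList>
        <FeatureType><Name>topp:states</Name><Title>USA states</Title></FeatureType>
        <FeatureType><Name>topp:roads</Name><Title>Major roads</Title></FeatureType>
      </FeatureTypeList>
    </WFS_Capabilities>"""

    def list_feature_types(xml_text):
        """Return (name, title) pairs for every feature type the server advertises."""
        ns = {"wfs": "http://www.opengis.net/wfs"}
        root = ET.fromstring(xml_text)
        out = []
        for ft in root.findall(".//wfs:FeatureType", ns):
            name = ft.findtext("wfs:Name", default="", namespaces=ns)
            title = ft.findtext("wfs:Title", default="", namespaces=ns)
            out.append((name, title))
        return out

    print(list_feature_types(CAPABILITIES_XML))
    ```

    A crawler would store these (name, title) pairs, together with the server URL, as the "service contents" to be indexed by a geospatial search engine.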

  18. Open Access, Open Source and Digital Libraries: A Current Trend in University Libraries around the World

    ERIC Educational Resources Information Center

    Krishnamurthy, M.

    2008-01-01

    Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…

  19. Sensing Slow Mobility and Interesting Locations for Lombardy Region (italy): a Case Study Using Pointwise Geolocated Open Data

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Oxoli, D.; Zurbarán, M. A.

    2016-06-01

    During the past years, Web 2.0 technologies have caused the emergence of platforms where users can share data related to their activities, which in some cases are then publicly released with open licenses. Popular categories include community platforms where users can upload GPS tracks collected during slow travel activities (e.g. hiking, biking and horse riding) and platforms where users share their geolocated photos. However, due to the high heterogeneity of the information available on the Web, the sole use of this user-generated content makes it an ambitious challenge to understand slow mobility flows as well as to detect the most visited locations in a region. Exploiting the data available on community sharing websites allows the collection of near real-time open data streams and enables rigorous spatial-temporal analysis. This work presents an approach for collecting, unifying and analysing pointwise geolocated open data available from different sources with the aim of identifying the main locations and destinations of slow mobility activities. For this purpose, we collected pointwise open data from the Wikiloc platform, Twitter, Flickr and Foursquare. The analysis was confined to data uploaded in the Lombardy Region (Northern Italy), corresponding to millions of pointwise records. The collected data was processed with Free and Open Source Software (FOSS) in order to organize it into a suitable database. This allowed us to run statistical analyses on the data distribution in both time and space, enabling the detection of users' slow mobility preferences as well as places of interest at a regional scale.
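    The detection of most-visited places from pointwise records can be sketched as a simple grid aggregation: bin each (lat, lon) point into a regular cell and rank cells by point count. This is a hypothetical minimal sketch of that idea, not the authors' pipeline; the 0.01-degree cell size and the toy coordinates are invented.

    ```python
    # Minimal sketch: rank grid cells by the number of geolocated points
    # falling in them, to surface candidate "places of interest".
    from collections import Counter

    def top_cells(points, cell_deg=0.01, k=3):
        """points: iterable of (lat, lon) pairs. Returns the k most-visited cells
        as ((row, col), count) pairs, densest first."""
        counts = Counter(
            (int(lat // cell_deg), int(lon // cell_deg)) for lat, lon in points
        )
        return counts.most_common(k)

    # Toy data: two clusters (Milan-like and Como-like coordinates) plus one outlier.
    pts = [(45.464, 9.190)] * 5 + [(45.808, 9.085)] * 3 + [(45.070, 7.686)]
    print(top_cells(pts, k=2))
    ```

    In practice a spatially enabled database (e.g. PostGIS) would do this aggregation server-side, but the ranking logic is the same.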

  20. Massive Open Online Courses in Dental Education: Two Viewpoints: Viewpoint 1: Massive Open Online Courses Offer Transformative Technology for Dental Education and Viewpoint 2: Massive Open Online Courses Are Not Ready for Primetime.

    PubMed

    Kearney, Rachel C; Premaraj, Sundaralingam; Smith, Becky M; Olson, Gregory W; Williamson, Anne E; Romanos, Georgios

    2016-02-01

    This point/counterpoint article discusses the strengths and weaknesses of incorporating Massive Open Online Courses (MOOCs) into dental education, focusing on whether this relatively new educational modality could impact traditional dental curricula. Viewpoint 1 asserts that MOOCs can be useful in dental education because they offer an opportunity for students to learn through content and assessment that is delivered online. While specific research on MOOCs is limited, some evidence shows that online courses may produce similar learning outcomes to those in face-to-face courses. Given that MOOCs are intended to be open source, there could be opportunities for dental schools with faculty shortages and financial constraints to incorporate these courses into their curricula. In addition to saving money, dental schools could use MOOCs as revenue sources in areas such as continuing education. Viewpoint 2 argues that the hype over MOOCs is subsiding due in part to weaker than expected evidence about their value. Because direct contact between students, instructors, and patients is essential to the dental curriculum, MOOCs have yet to demonstrate their usefulness in replacing more than a subset of didactic courses. Additionally, learning professionalism, a key component of health professions education, is best supported by mentorship that provides significant interpersonal interaction. In spite of the potential of early MOOC ideology, MOOCs in their current form require either further development or altered expectations to significantly impact dental education.

  1. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    PubMed

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  2. Resource sharing of online teaching materials: The lon-capa project

    NASA Astrophysics Data System (ADS)

    Bauer, Wolfgang

    2004-03-01

    The use of information technology resources in conventional lecture-based courses, in distance-learning offerings, as well as hybrid courses, is increasing. But this may put additional burden on faculty, who are now asked to deliver this new content. Additionally, it may require the installation of commercial courseware systems, putting the colleges and universities in new financial licensing dependencies. To address exactly these two problems, the lon-capa system was invented to provide an open-source, gnu public license based, courseware system that allows for sharing of educational resources across institutional and disciplinary boundaries. This presentation will focus on both aspects of the system, the courseware capabilities that allow for customized environments for individual students, and the educational resources library that enables teachers to take full advantages of the work of their colleagues. Research results on learning effectiveness, resource and system usage patterns, and customization for different learning styles will be shown. Institutional perceptions of and responses to open source courseware systems will be discussed.

  3. Design of a Community-Engaged Health Informatics Platform with an Architecture of Participation.

    PubMed

    Millery, Mari; Ramos, Wilson; Lien, Chueh; Aguirre, Alejandra N; Kukafka, Rita

    2015-01-01

    Community-engaged health informatics (CEHI) applies information technology and participatory approaches to improve the health of communities. Our objective was to translate the concept of CEHI into a usable and replicable informatics platform that will facilitate community-engaged practice and research. The setting is a diverse urban neighborhood in New York City. The methods included community asset mapping, stakeholder interviews, logic modeling, analysis of affordances in open-source tools, elicitation of use cases and requirements, and a survey of early adopters. Based on synthesis of data collected, GetHealthyHeigths.org (GHH) was developed using open-source LAMP stack and Drupal content management software. Drupal's organic groups module was used for novel participatory functionality, along with detailed user roles and permissions. Future work includes evaluation of GHH and its impact on agency and service networks. We plan to expand GHH with additional functionality to further support CEHI by combining informatics solutions with community engagement to improve health.

  4. Design of a Community-Engaged Health Informatics Platform with an Architecture of Participation

    PubMed Central

    Millery, Mari; Ramos, Wilson; Lien, Chueh; Aguirre, Alejandra N.; Kukafka, Rita

    2015-01-01

    Community-engaged health informatics (CEHI) applies information technology and participatory approaches to improve the health of communities. Our objective was to translate the concept of CEHI into a usable and replicable informatics platform that will facilitate community-engaged practice and research. The setting is a diverse urban neighborhood in New York City. The methods included community asset mapping, stakeholder interviews, logic modeling, analysis of affordances in open-source tools, elicitation of use cases and requirements, and a survey of early adopters. Based on synthesis of data collected, GetHealthyHeigths.org (GHH) was developed using open-source LAMP stack and Drupal content management software. Drupal’s organic groups module was used for novel participatory functionality, along with detailed user roles and permissions. Future work includes evaluation of GHH and its impact on agency and service networks. We plan to expand GHH with additional functionality to further support CEHI by combining informatics solutions with community engagement to improve health. PMID:26958227

  5. A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data

    DOE PAGES

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...

    2016-01-01

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
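    The core Bayesian idea, combining an expert prior on ambient occupancy with noisy observations while retaining uncertainty in the final estimate, can be illustrated with a generic conjugate normal-normal update. This is not the PDT system's code; all numbers (prior, observation noise, survey values) are made up for illustration.

    ```python
    # Generic sketch of conjugate Bayesian updating: a normal prior on ambient
    # occupancy (people per 1000 ft^2) is sequentially updated with survey
    # observations of known variance. Posterior variance shrinks with each
    # observation, so uncertainty is carried into the final estimate.
    def posterior(prior_mean, prior_var, obs, obs_var):
        """Normal prior x normal likelihood (known obs_var) -> (mean, var)."""
        mean, var = prior_mean, prior_var
        for y in obs:
            precision = 1.0 / var + 1.0 / obs_var
            mean = (mean / var + y / obs_var) / precision
            var = 1.0 / precision
        return mean, var

    # Hypothetical expert prior of ~2 people/1000 ft^2 for offices,
    # updated with three survey observations.
    m, v = posterior(2.0, 1.0, [2.6, 2.4, 2.8], obs_var=0.5)
    print(round(m, 3), round(v, 3))
    ```

    The posterior mean lands between the prior and the survey data, weighted by their precisions, which is the behavior a harmonization framework of this kind relies on.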

  6. New Open-Source Version of FLORIS Released | News | NREL

    Science.gov Websites

    New Open-Source Version of FLORIS Released. January 26, 2018. National Renewable Energy Laboratory (NREL) researchers recently released an updated open-source version of FLORIS, which has been simplified and documented. Because of the living, open-source nature of the newly updated utility, NREL…

  7. The Organizational Impact of Open Educational Resources

    NASA Astrophysics Data System (ADS)

    Sclater, Niall

    The open educational resource (OER) movement has been growing rapidly since 2001, stimulated by funding from benefactors such as the Hewlett Foundation and UNESCO, and providing educational content freely to institutions and learners across the world. Individuals and organizations are motivated by a variety of drivers to produce OERs, both altruistic and self-interested. There are parallels with the open source movement, where authors and others combine their efforts to provide a product which they and others can use freely and adapt to their own purposes. There are many different ways in which OER initiatives are organized and an infinite range of possibilities for how the OERs themselves are constituted. If institutions are to develop sustainable OER initiatives, they need to build successful change management initiatives, developing models for the production and quality assurance of OERs, licensing them through appropriate mechanisms such as the Creative Commons, and considering how the resources will be discovered and used by learners.

  8. Stewardship and management challenges within a cloud-based open data ecosystem (Invited Paper 211863)

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2017-12-01

    NOAA's Big Data Project is conducting an experiment in the collaborative distribution of open government data to non-governmental cloud-based systems. Through Cooperative Research and Development Agreements signed in 2015 between NOAA and Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium, NOAA is distributing open government data to a wide community of potential users. There are a number of significant advantages related to the use of open data on commercial cloud platforms, but through this experiment NOAA is also discovering significant challenges for those stewarding and maintaining NOAA's data resources in support of users in the wider open data ecosystem. Among the challenges that will be discussed are: the need to provide effective interpretation of the data content to enable their use by data scientists from other expert communities; effective maintenance of Collaborators' open data stores through coordinated publication of new data and new versions of older data; the provenance and verification of open data as authentic NOAA-sourced data across multiple management boundaries and analytical tools; and keeping pace with the accelerating expectations of users with regard to improved quality control, data latency, availability, and discoverability. Suggested strategies to address these challenges will also be described.
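    One standard way to address the provenance challenge described above, verifying that data obtained from a cloud mirror is authentic, is to compare a file's cryptographic digest against a checksum published by the originating archive. The sketch below is illustrative only and is not NOAA's mechanism; the function names are invented.

    ```python
    # Hedged sketch: verify a downloaded data file against a published SHA-256
    # digest, reading in chunks so large granules need not fit in memory.
    import hashlib

    def sha256_of(path, chunk=1 << 20):
        """Return the hex SHA-256 digest of the file at `path`."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    def is_authentic(path, published_digest):
        """True if the file's digest matches the digest published by the source."""
        return sha256_of(path) == published_digest
    ```

    A mirror-agnostic check like this lets a user confirm the file crossed multiple management boundaries unaltered, provided the digest itself is fetched from the authoritative source.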

  9. Macedonian journal of chemistry and chemical engineering: open journal systems--editor's perspective.

    PubMed

    Zdravkovski, Zoran

    2014-01-01

    The development and availability of personal computers, software and printing techniques in the last twenty years have made a profound change in the publication of scientific journals. Additionally, the Internet in the last decade has revolutionized the publication process to the point of changing the basic paradigm of printed journals. The Macedonian Journal of Chemistry and Chemical Engineering in its 40-year history has adopted and adapted to all these transformations. In order to keep up with the inevitable changes, as editor-in-chief I felt it was my responsibility to introduce electronic editorial management of the journal. The choice was between commercial and open source platforms, and because of the limited funding of the journal we chose the latter. We decided on Open Journal Systems, which provided online submission and management of all content; had flexible configuration of requirements, sections, the review process, etc.; had options for comprehensive indexing; offered various reading tools; had email notification and commenting ability for readers; had an option for thesis abstracts; and was installed locally. However, since there is limited support, it requires moderate computer knowledge and skills, and some effort, to set up. Overall, it is an excellent editorial platform and a convenient solution for journals with a low budget, journals that do not want to spend their resources on commercial platforms, or journals that simply support the idea of open source software.

  10. The successes and challenges of open-source biopharmaceutical innovation.

    PubMed

    Allarakhia, Minna

    2014-05-01

    Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human capacity development challenges associated with skill development around big data usage; and the data-material access challenge associated with data and material access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. It is the author's opinion that assessing when and how value creation will occur through open-source biopharmaceutical innovation is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of today's big data-based open-source initiatives. A continued focus on early-stage value creation alone is not advisable. Instead, it would be more advisable to adopt an approach in which stakeholders transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.

  11. Use of a Machine Learning-Based High Content Analysis Approach to Identify Photoreceptor Neurite Promoting Molecules.

    PubMed

    Fuller, John A; Berlinicke, Cynthia A; Inglese, James; Zack, Donald J

    2016-01-01

    High content analysis (HCA) has become a leading methodology in phenotypic drug discovery efforts. Typical HCA workflows include imaging cells using an automated microscope and analyzing the data using algorithms designed to quantify one or more specific phenotypes of interest. Due to the richness of high content data, unappreciated phenotypic changes may be discovered in existing image sets using interactive machine-learning based software systems. Primary postnatal day four retinal cells from the photoreceptor (PR) labeled QRX-EGFP reporter mice were isolated, seeded, treated with a set of 234 profiled kinase inhibitors and then cultured for 1 week. The cells were imaged with an Acumen plate-based laser cytometer to determine the number and intensity of GFP-expressing, i.e. PR, cells. Wells displaying intensities and counts above threshold values of interest were re-imaged at a higher resolution with an INCell2000 automated microscope. The images were analyzed with an open source HCA analysis tool, PhenoRipper (Rajaram et al., Nat Methods 9:635-637, 2012), to identify the high GFP-inducing treatments that additionally resulted in diverse phenotypes compared to the vehicle control samples. The pyrimidinopyrimidone kinase inhibitor CHEMBL-1766490, a pan kinase inhibitor whose major known targets are p38α and the Src family member lck, was identified as an inducer of photoreceptor neuritogenesis by using the open-source HCA program PhenoRipper. This finding was corroborated using a cell-based method of image analysis that measures quantitative differences in the mean neurite length in GFP expressing cells. Interacting with data using machine learning algorithms may complement traditional HCA approaches by leading to the discovery of small molecule-induced cellular phenotypes in addition to those upon which the investigator is initially focusing.

  12. Apparatus for unloading pressurized fluid

    DOEpatents

    Rehberger, K.M.

    1994-01-04

    An apparatus is described for unloading fluid, preferably pressurized gas, from containers in a controlled manner that protects the immediate area from exposure to the container contents. The device consists of an unloading housing, which is enclosed within at least one protective structure, for receiving the dispensed contents of the steel container, and a laser light source, located external to the protective structure, for opening the steel container instantaneously. The neck or stem of the fluid container is placed within the sealed interior environment of the unloading housing. The laser light passes through both the protective structure and the unloading housing to instantaneously pierce a small hole within the stem of the container. Both the protective structure and the unloading housing are specially designed to allow laser light passage without compromising the light's energy level. Also, the unloading housing allows controlled flow of the gas once it has been dispensed from the container. The external light source permits remote operation of the unloading device. 2 figures.

  13. VisBOL: Web-Based Tools for Synthetic Biology Design Visualization.

    PubMed

    McLaughlin, James Alastair; Pocock, Matthew; Mısırlı, Göksel; Madsen, Curtis; Wipat, Anil

    2016-08-19

    VisBOL is a Web-based application that allows the rendering of genetic circuit designs, enabling synthetic biologists to visually convey designs in SBOL visual format. VisBOL designs can be exported to formats including PNG and SVG images to be embedded in Web pages, presentations and publications. The VisBOL tool enables the automated generation of visualizations from designs specified using the Synthetic Biology Open Language (SBOL) version 2.0, as well as a range of well-known bioinformatics formats including GenBank and Pigeoncad notation. VisBOL is provided both as a user accessible Web site and as an open-source (BSD) JavaScript library that can be used to embed diagrams within other content and software.

  14. Power Balance and Impurity Studies in TCS

    NASA Astrophysics Data System (ADS)

    Grossnickle, J. A.; Pietrzyk, Z. A.; Vlases, G. C.

    2003-10-01

    A "zero-dimension" power balance model was developed based on measurements of absorbed power, radiated power, absolute D_α, temperature, and density for the TCS device. Radiation was determined to be the dominant source of power loss for medium to high density plasmas. The total radiated power was strongly correlated with the Oxygen line radiation. This suggests Oxygen is the dominant radiating species, which was confirmed by doping studies. These also extrapolate to a Carbon content below 1.5%. Determining the source of the impurities is an important question that must be answered for the TCS upgrade. Preliminary indications are that the primary sources of Oxygen are the stainless steel end cones. A Ti gettering system is being installed to reduce this Oxygen source. A field line code has been developed for use in tracking where open field lines terminate on the walls. Output from this code is also used to generate grids for an impurity tracking code.

  15. Open for Business

    ERIC Educational Resources Information Center

    Voyles, Bennett

    2007-01-01

    People know about the Sakai Project (open source course management system); they may even know about Kuali (open source financials). So, what is the next wave in open source software? This article discusses business intelligence (BI) systems. Though open source BI may still be only a rumor in most campus IT departments, some brave early adopters…

  16. Healthy Harlem: empowering health consumers through social networking, tailoring and web 2.0 technologies.

    PubMed

    Khan, Sharib A; McFarlane, Delano J; Li, Jianhua; Ancker, Jessica S; Hutchinson, Carly; Cohall, Alwyn; Kukafka, Rita

    2007-10-11

    Consumer health informatics has emerged as a strategy to inform and empower patients for self-management of their health. The emergence of, and explosion in the use of, user-generated online media (e.g., blogs) has created new opportunities to inform and educate people about healthy living. Under a prevention research project, we are developing a website that combines social content-collaboration media with open-source technologies to create a community-driven resource that provides users with tailored health information.
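    Tailored health information of the kind this abstract describes is commonly implemented as rule-based selection of messages against a user profile. A minimal illustrative sketch of that pattern; the profile fields, rules, and message texts below are invented for illustration and are not from the Healthy Harlem system:

    ```python
    def tailor_messages(profile, rules):
        """Return the health messages whose predicate matches the user profile."""
        return [msg for predicate, msg in rules if predicate(profile)]

    # Hypothetical tailoring rules keyed on simple profile attributes.
    rules = [
        (lambda p: p.get("smoker"), "Resources to help you quit smoking"),
        (lambda p: p.get("age", 0) >= 50, "Colorectal cancer screening reminder"),
        (lambda p: p.get("activity_minutes_per_week", 0) < 150,
         "Tips for fitting more physical activity into your week"),
    ]

    profile = {"age": 54, "smoker": False, "activity_minutes_per_week": 60}
    for msg in tailor_messages(profile, rules):
        print(msg)  # screening reminder, then the physical-activity tip
    ```

    Real tailoring systems typically externalize such rules as editable content rather than code, so that health educators can maintain them.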

  17. Small stones sets Web site apart. Froedtert Hospital updates provide valuable healthcare information.

    PubMed

    Rees, Tom

    2002-01-01

    Froedtert & Medical College, an academic medical center, has adopted a proactive approach to providing consumers with reliable sources of information. The Milwaukee institution has redesigned its Web site, which first went online in 1995. The new version simplifies the navigation process and adds new content. Small Stones, a health resource center that also operates as a brick-and-mortar shop, went online Feb. 1. Online bill paying was launched in May. Pharmacy refill functions are expected to be online this summer.

  18. The Commercial Open Source Business Model

    NASA Astrophysics Data System (ADS)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than is possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  19. A study of innovative features in scholarly open access journals.

    PubMed

    Björk, Bo-Christer

    2011-12-16

    The emergence of the Internet has triggered tremendous changes in the publication of scientific peer-reviewed journals. Today, journals are usually available in parallel electronic versions, but the way the peer-review process works, the look of articles and journals, and the rigid and slow publication schedules have remained largely unchanged, at least for the vast majority of subscription-based journals. Those publishing firms and scholarly publishers who have chosen the more radical option of open access (OA), in which the content of journals is freely accessible to anybody with Internet connectivity, have had a much bigger degree of freedom to experiment with innovations. The objective was to study how open access journals have experimented with innovations concerning ways of organizing the peer review, the format of journals and articles, new interactive and media formats, and novel publishing revenue models. The features of 24 open access journals were studied. The journals were chosen in a nonrandom manner from the approximately 7000 existing OA journals based on available information about interesting journals and include both representative cases and highly innovative outlier cases. Most early OA journals in the 1990s were founded by individual scholars and used a business model based on voluntary work close in spirit to open-source development of software. In the next wave, many long-established journals, in particular society journals and journals from regions such as Latin America, made their articles OA when they started publishing parallel electronic versions. From about 2002 on, newly founded professional OA publishing firms using article-processing charges to fund their operations have emerged. Over the years, there have been several experiments with new forms of peer review, media enhancements, and the inclusion of structured data sets with articles. 
In recent years, the growth of OA publishing has also been facilitated by the availability of open-source software for journal publishing. The case studies illustrate how a new technology and a business model enabled by new technology can be harnessed to find new innovative ways for the organization and content of scholarly publishing. Several recent launches of OA journals by major subscription publishers demonstrate that OA is rapidly gaining acceptance as a sustainable alternative to subscription-based scholarly publishing.

  20. A Study of Innovative Features in Scholarly Open Access Journals

    PubMed Central

    2011-01-01

    Background The emergence of the Internet has triggered tremendous changes in the publication of scientific peer-reviewed journals. Today, journals are usually available in parallel electronic versions, but the way the peer-review process works, the look of articles and journals, and the rigid and slow publication schedules have remained largely unchanged, at least for the vast majority of subscription-based journals. Those publishing firms and scholarly publishers who have chosen the more radical option of open access (OA), in which the content of journals is freely accessible to anybody with Internet connectivity, have had a much bigger degree of freedom to experiment with innovations. Objective The objective was to study how open access journals have experimented with innovations concerning ways of organizing the peer review, the format of journals and articles, new interactive and media formats, and novel publishing revenue models. Methods The features of 24 open access journals were studied. The journals were chosen in a nonrandom manner from the approximately 7000 existing OA journals based on available information about interesting journals and include both representative cases and highly innovative outlier cases. Results Most early OA journals in the 1990s were founded by individual scholars and used a business model based on voluntary work close in spirit to open-source development of software. In the next wave, many long-established journals, in particular society journals and journals from regions such as Latin America, made their articles OA when they started publishing parallel electronic versions. From about 2002 on, newly founded professional OA publishing firms using article-processing charges to fund their operations have emerged. Over the years, there have been several experiments with new forms of peer review, media enhancements, and the inclusion of structured data sets with articles. 
In recent years, the growth of OA publishing has also been facilitated by the availability of open-source software for journal publishing. Conclusions The case studies illustrate how a new technology and a business model enabled by new technology can be harnessed to find new innovative ways for the organization and content of scholarly publishing. Several recent launches of OA journals by major subscription publishers demonstrate that OA is rapidly gaining acceptance as a sustainable alternative to subscription-based scholarly publishing. PMID:22173122

  1. Science Concierge: A Fast Content-Based Recommendation System for Scientific Publications.

    PubMed

    Achakulvisut, Titipat; Acuna, Daniel E; Ruangrong, Tulakan; Kording, Konrad

    2016-01-01

    Finding relevant publications is important for scientists who have to cope with exponentially increasing numbers of scholarly material. Algorithms can help with this task as they help for music, movie, and product recommendations. However, we know little about the performance of these algorithms with scholarly material. Here, we develop an algorithm, and an accompanying Python library, that implements a recommendation system based on the content of articles. Design principles are to adapt to new content, provide near-real time suggestions, and be open source. We tested the library on 15K posters from the Society of Neuroscience Conference 2015. Human curated topics are used to cross validate parameters in the algorithm and produce a similarity metric that maximally correlates with human judgments. We show that our algorithm significantly outperformed suggestions based on keywords. The work presented here promises to make the exploration of scholarly material faster and more accurate.
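    The core of a content-based recommender of this kind is a text-similarity loop: represent each document as a weighted term vector and rank candidates by similarity to a query document. A minimal sketch using TF-IDF weighting and cosine similarity over a toy corpus; the corpus, function names, and scoring details are illustrative and are not the Science Concierge implementation:

    ```python
    import math
    from collections import Counter

    def tfidf_vectors(docs):
        """Compute a TF-IDF weight vector for each tokenized document."""
        n = len(docs)
        df = Counter(t for doc in docs for t in set(doc))  # document frequency
        vecs = []
        for doc in docs:
            tf = Counter(doc)
            vecs.append({t: (c / len(doc)) * math.log(n / df[t])
                         for t, c in tf.items()})
        return vecs

    def cosine(a, b):
        """Cosine similarity between two sparse term-weight vectors."""
        dot = sum(a[t] * b.get(t, 0.0) for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def recommend(query_idx, docs, k=2):
        """Return indices of the k documents most similar to docs[query_idx]."""
        vecs = tfidf_vectors(docs)
        scores = [(cosine(vecs[query_idx], v), i)
                  for i, v in enumerate(vecs) if i != query_idx]
        return [i for _, i in sorted(scores, reverse=True)[:k]]

    posters = [
        "neural decoding of motor cortex spiking activity".split(),
        "deep learning models for neural population decoding".split(),
        "soil moisture retrieval from hyperspectral imagery".split(),
    ]
    print(recommend(0, posters, k=1))  # [1]: the other neuroscience poster
    ```

    A production system, as the abstract notes, would additionally tune the similarity metric against human-curated topic labels and use vectorized linear algebra for near-real-time response.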

  2. Science Concierge: A Fast Content-Based Recommendation System for Scientific Publications

    PubMed Central

    Achakulvisut, Titipat; Acuna, Daniel E.; Ruangrong, Tulakan; Kording, Konrad

    2016-01-01

    Finding relevant publications is important for scientists who have to cope with exponentially increasing numbers of scholarly material. Algorithms can help with this task as they help for music, movie, and product recommendations. However, we know little about the performance of these algorithms with scholarly material. Here, we develop an algorithm, and an accompanying Python library, that implements a recommendation system based on the content of articles. Design principles are to adapt to new content, provide near-real time suggestions, and be open source. We tested the library on 15K posters from the Society of Neuroscience Conference 2015. Human curated topics are used to cross validate parameters in the algorithm and produce a similarity metric that maximally correlates with human judgments. We show that our algorithm significantly outperformed suggestions based on keywords. The work presented here promises to make the exploration of scholarly material faster and more accurate. PMID:27383424

  3. Authentic Astronomical Discovery in Planetariums: Bringing Data to Domes

    NASA Astrophysics Data System (ADS)

    Wyatt, Ryan Jason; Subbarao, Mark; Christensen, Lars; Emmons, Ben; Hurt, Robert

    2018-01-01

    Planetariums offer a unique opportunity to disseminate astronomical discoveries using data visualization at all levels of complexity: the technical infrastructure to display data and a sizeable cohort of enthusiastic educators to interpret results. “Data to Dome” is an initiative of the International Planetarium Society to develop our community’s capacity to integrate data in fulldome planetarium systems—including via open source software platforms such as WorldWide Telescope and OpenSpace. We are cultivating a network of planetarium professionals who integrate data into their presentations and share their content with others. Furthermore, we propose to shorten the delay between discovery and dissemination in planetariums. Currently, the “latest science” is often presented days or weeks after discoveries are announced; we can shorten this to hours or even minutes. The Data2Dome (D2D) initiative, led by the European Southern Observatory, proposes technical infrastructure and data standards that will streamline content flow from research institutions to planetariums, offering audiences access to the latest astronomical data in near real time.

  4. NASA Sea Level Change Portal - It's not just another portal site

    NASA Astrophysics Data System (ADS)

    Huang, T.; Quach, N.; Abercrombie, S. P.; Boening, C.; Brennan, H. P.; Gill, K. M.; Greguska, F. R., III; Jackson, R.; Larour, E. Y.; Shaftel, H.; Tenenbaum, L. F.; Zlotnicki, V.; Moore, B.; Moore, J.; Boeck, A.

    2017-12-01

    The NASA Sea Level Change Portal (https://sealevel.nasa.gov) is designed as a "one-stop" source for current sea level change information, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. With increasing global temperatures warming the ocean and melting ice sheets and glaciers, there is an immediate need both for accelerating sea level change research and for making this research accessible to scientists in disparate disciplines, to the general public, and to policy makers and businesses. The immersive and innovative NASA portal, which debuted at the 2015 AGU Fall Meeting, attracts thousands of daily visitors and over 30K followers on Facebook®. Behind its intuitive interface is an extensible architecture that integrates site contents, data from various sources, visualization, horizontally scalable geospatial data analytic technology (called NEXUS), and an interactive 3D simulation platform (called the Virtual Earth System Laboratory). We will present an overview of the portal and some of our architectural decisions, along with a discussion of our open-source, cloud-based data analytic technology that enables on-the-fly analysis of heterogeneous data.

  5. Space Place Prime

    NASA Technical Reports Server (NTRS)

    Fitzpatrick, Austin J.; Novati, Alexander; Fisher, Diane K.; Leon, Nancy J.; Netting, Ruth

    2013-01-01

    Space Place Prime is public engagement and education software for use on iPad. It targets a multi-generational audience with news, images, videos, and educational articles from the Space Place Web site and other NASA sources. New content is downloaded daily (or whenever the user accesses the app) via the wireless connection. In addition to the Space Place Web site, several NASA RSS feeds are tapped to provide new content. Content is retained for the previous several days, or some number of editions of each feed. All content is controlled on the server side, so features about the latest news, or changes to any content, can be made without updating the app in the Apple Store. It gathers many popular NASA features into one app. The interface is a boundless, slidable-in-any-direction grid of images, unique for each feature, and iconized as image, video, or article. A tap opens the feature. An alternate list mode presents menus of images, videos, and articles separately. Favorites can be tagged for permanent archive. Facebook, Twitter, and e-mail connections make any feature shareable.

  6. Openly Published Environmental Sensing (OPEnS) | Advancing Open-Source Research, Instrumentation, and Dissemination

    NASA Astrophysics Data System (ADS)

    Udell, C.; Selker, J. S.

    2017-12-01

    The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.

  7. Open-source software: not quite endsville.

    PubMed

    Stahl, Matthew T

    2005-02-01

    Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.

  8. Developing an Open Source Option for NASA Software

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Parks, John W. (Technical Monitor)

    2003-01-01

    We present arguments in favor of developing an Open Source option for NASA software; in particular, we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.

  9. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    NASA Astrophysics Data System (ADS)

    Hogan, P.

    2009-12-01

    Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality, to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would serve their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK) that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components, such that each can advance independently of the other elements. World Wind therefore changed its mission from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating a single program, World Wind is a suite of components that can be selectively used in any number of programs. World Wind technology can be a part of any application, or it can be a window in a web page. Or it can be extended with additional functionalities by application and web developers. World Wind makes it possible to include virtual globe visualization and server technology in support of any objective. The world community can continually benefit from advances made in the technology by NASA in concert with the world community. 3. OPEN SOURCE AND OPEN STANDARDS NASA World Wind is NASA Open Source software. This means that the source code is fully accessible for anyone to freely use, even in association with proprietary technology. Imagery and other data provided by the World Wind servers reside in the public domain, including the data server technology itself. This allows others to deliver their own geospatial data and to provide custom solutions based on users' specific needs.

  10. Mantle to surface degassing of alkalic magmas at Erebus volcano, Antarctica

    USGS Publications Warehouse

    Oppenheimer, C.; Moretti, R.; Kyle, P.R.; Eschenbacher, A.; Lowenstern, J. B.; Hervig, R.L.; Dunbar, N.W.

    2011-01-01

    Continental intraplate volcanoes, such as Erebus volcano, Antarctica, are associated with extensional tectonics, mantle upwelling and high heat flow. Typically, erupted magmas are alkaline and rich in volatiles (especially CO2), inherited from low degrees of partial melting of mantle sources. We examine the degassing of the magmatic system at Erebus volcano using melt inclusion data and high temporal resolution open-path Fourier transform infrared (FTIR) spectroscopic measurements of gas emissions from the active lava lake. Remarkably different gas signatures are associated with passive and explosive gas emissions, representative of volatile contents and redox conditions that reveal contrasting shallow and deep degassing sources. We show that this unexpected degassing signature provides a unique probe for magma differentiation and transfer of CO2-rich oxidised fluids from the mantle to the surface, and evaluate how these processes operate in time and space. Extensive crystallisation driven by CO2 fluxing is responsible for isobaric fractionation of parental basanite magmas close to their source depth. Magma deeper than 4 kbar equilibrates under vapour-buffered conditions. At shallower depths, CO2-rich fluids accumulate and are then released either via convection-driven, open-system gas loss or as closed-system slugs that ascend and result in Strombolian eruptions in the lava lake. The open-system gases have a reduced state (below the QFM buffer) whereas the closed-system gases preserve their deep oxidised signatures (close to the NNO buffer). © 2011 Elsevier B.V.

  11. A 868MHz-based wireless sensor network for ground truthing of soil moisture for a hyperspectral remote sensing campaign - design and preliminary results

    NASA Astrophysics Data System (ADS)

    Näthe, Paul; Becker, Rolf

    2014-05-01

    Soil moisture and plant available water are important environmental parameters that affect plant growth and crop yield. Hence, they are significant parameters for vegetation monitoring and precision agriculture. When soil moisture, plant canopy temperature, soil temperature and soil roughness are measured with airborne hyperspectral imaging systems, as in the hyperspectral imaging campaign conducted as part of the INTERREG IV A project SMART INSPECTORS, validation through ground-based soil moisture measurements is necessary. To this end, commercially available sensors for matric potential, plant available water and volumetric water content are used for automated measurements with smart sensor nodes developed on the basis of open-source 868 MHz radio modules. Each node features a full-scale microcontroller unit that allows autarkic, battery-powered operation in the field. The data generated by each of these sensor nodes are transferred wirelessly, using an open-source protocol, to a central node, the so-called "gateway". This gateway collects, interprets and buffers the sensor readings and eventually pushes the time series onto a server-based database. The entire data processing chain, from the sensor reading to the final storage of time series on a server, is realized with open-source hardware and software in such a way that the recorded data can be accessed from anywhere through the Internet. We present how this open-source based wireless sensor network was developed and specified for the application of ground truthing, and point out the system's perspectives and potential with respect to usability and applicability for vegetation monitoring and precision agriculture. Regarding the corresponding hyperspectral imaging campaign, results from ground measurements are discussed in terms of their contribution to the remote sensing system. Finally, the significance of the wireless sensor network for the application of ground truthing is determined.
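    The gateway's role described here (collect, interpret, and buffer per-node readings before pushing them to a database) can be sketched in a few lines. The message format, field names, and class design below are invented for illustration; the actual 868 MHz radio protocol is not specified in the abstract:

    ```python
    from collections import defaultdict

    def parse_reading(message):
        """Parse a message like 'node03;moisture=0.231;temp=18.4' into a dict."""
        node, *fields = message.strip().split(";")
        values = {}
        for field in fields:
            key, raw = field.split("=")
            values[key] = float(raw)
        return node, values

    class Gateway:
        """Buffer per-node time series until they can be pushed to a server."""
        def __init__(self):
            self.series = defaultdict(list)

        def receive(self, timestamp, message):
            node, values = parse_reading(message)
            self.series[node].append((timestamp, values))

        def flush(self):
            """Return buffered readings and clear the buffer (the 'push' step)."""
            out, self.series = dict(self.series), defaultdict(list)
            return out

    gw = Gateway()
    gw.receive(1000, "node03;moisture=0.231;temp=18.4")
    gw.receive(1060, "node03;moisture=0.229;temp=18.6")
    flushed = gw.flush()
    print(len(flushed["node03"]))  # 2 buffered readings for node03
    ```

    Buffering at the gateway, as in this sketch, is what lets the field network keep recording through temporary losses of the uplink to the server-side database.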

  12. Open-Source Data and the Study of Homicide.

    PubMed

    Parkin, William S; Gruenewald, Jeff

    2015-07-20

    To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to serve as a valid and reliable data source for testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open sources could recreate the population of homicides and variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data. In addition, for every variable measured, the open sources captured as much, or more, of the information presented in the official data. Variables not available in official data, but potentially useful for testing theory, were also identified in open sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.
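    The study's two core comparisons, whether open sources identify the same population of cases as official data and whether they capture as much information per variable, reduce to simple set and completeness arithmetic. A sketch with invented toy records (not the study's data; the field names are placeholders):

    ```python
    def coverage(official, open_source):
        """Fraction of official case IDs also identified in the open-source data."""
        official_ids = set(official)
        return len(official_ids & set(open_source)) / len(official_ids)

    def variable_completeness(records, variable):
        """Fraction of records with a non-missing value for a given variable."""
        return sum(1 for r in records if r.get(variable) is not None) / len(records)

    # Toy example: two homicide cases, with a missing weapon value in the
    # official data that the open-source data happens to fill in.
    official = {"H1": {"weapon": "firearm"}, "H2": {"weapon": None}}
    open_src = {"H1": {"weapon": "firearm"}, "H2": {"weapon": "knife"}}

    print(coverage(official, open_src))                              # 1.0
    print(variable_completeness(list(open_src.values()), "weapon"))  # 1.0
    print(variable_completeness(list(official.values()), "weapon"))  # 0.5
    ```

    Computed per variable across the full case population, metrics like these are how one would quantify the article's claim that open sources captured "as much, or more" of the information in official data.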

  13. Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.

    PubMed

    Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas

    2016-01-01

    More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.
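    The data model described, a shared substance registry editable only by the editorial board, plus per-laboratory repertoires that may only attach lab-specific detail to registered substances, can be sketched as follows. The class, method, and field names are invented for illustration and are not the Pharmacology Portal's actual schema:

    ```python
    class PharmacologyPortal:
        """Sketch: a core substance registry plus per-lab repertoires."""
        def __init__(self):
            self.registry = {}      # substance name -> core info (editorial board)
            self.repertoires = {}   # lab name -> {substance: lab-specific details}

        def register_substance(self, name, info):
            """Editorial-board operation: add or update a core registry entry."""
            self.registry[name] = info

        def publish_analysis(self, lab, substance, details):
            """Labs may only attach details to substances already registered."""
            if substance not in self.registry:
                raise KeyError(f"{substance!r} is not in the core registry")
            self.repertoires.setdefault(lab, {})[substance] = details

        def lookup(self, substance):
            """Merge core info with every lab offering this analysis."""
            labs = {lab: rep[substance]
                    for lab, rep in self.repertoires.items() if substance in rep}
            return {"core": self.registry[substance], "labs": labs}

    portal = PharmacologyPortal()
    portal.register_substance("methotrexate", {"class": "antimetabolite"})
    portal.publish_analysis("Lab A", "methotrexate", {"method": "LC-MS/MS"})
    print(sorted(portal.lookup("methotrexate")["labs"]))  # ['Lab A']
    ```

    The guard in `publish_analysis` mirrors the portal's restriction that laboratories cannot edit the database core, only their own repertoire subpages; it is this separation that keeps substance terminology consistent across all participating labs.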

  14. Evaluation of the free, open source software WordPress as electronic portfolio system in undergraduate medical education.

    PubMed

    Avila, Javier; Sostmann, Kai; Breckwoldt, Jan; Peters, Harm

    2016-06-03

    Electronic portfolios (ePortfolios) are used to document and support learning activities. ePortfolios with mobile capabilities allow even more flexibility. However, the development or acquisition of ePortfolio software is often costly, and at the same time, commercially available systems may not sufficiently fit the institution's needs. The aim of this study was to design and evaluate an ePortfolio system with mobile capabilities using a free and open-source software solution. We created an online ePortfolio environment using the blogging software WordPress, selected on the basis of its reported capability features by a qualitative weight and sum method. Technical implementation and usability were evaluated by 25 medical students during their clinical training, by quantitative and qualitative means, using online questionnaires and focus groups. The WordPress ePortfolio environment allowed students a broad spectrum of activities - often documented via mobile devices - like the collection of multimedia evidence, posting reflections, messaging, web publishing, ePortfolio searches, collaborative learning, knowledge management in a content management system including a wiki and RSS feeds, and the use of aid tools for studying. The students' experience with WordPress revealed a few technical problems, and this report provides workarounds. The WordPress ePortfolio was rated positively by the students as a content management system (67 % of the students), for exchange with other students (74 %), as a note pad for reflections (53 %) and for its potential as an information source for assessment (48 %) and exchange with a mentor (68 %). On the negative side, 74 % of the students in this pilot study did not find it easy to get started with the system, and 63 % rated the ePortfolio as not being user-friendly. Qualitative analysis indicated a need for more introductory information and training.
It is possible to build an advanced ePortfolio system with mobile capabilities with the free and open source software WordPress. This allows institutions without proprietary software to build a sophisticated ePortfolio system adapted to their needs with relatively few resources. The implementation of WordPress should be accompanied by introductory courses in the use of the software and its apps in order to facilitate its usability.

  15. How Is Open Source Special?

    ERIC Educational Resources Information Center

    Kapor, Mitchell

    2005-01-01

    Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…

  16. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

    Innovation through an open source model has proven to be successful for software development. This success has led many to speculate whether open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model, as well as the effect of patents.

  17. Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.

    PubMed

    Benson, Tim

    2016-07-04

    Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers and how it can work as a business model in health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.

  18. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
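
    The tool chain above builds on the OGC Web Map Service standard; as a minimal, hypothetical sketch (the server URL and layer name are invented, not taken from the Mission Support System), a WMS GetMap request can be assembled with the Python standard library:

    ```python
    from urllib.parse import urlencode

    def build_getmap_url(base_url, layer, bbox, width=800, height=600,
                         crs="EPSG:4326", fmt="image/png"):
        """Assemble an OGC WMS 1.3.0 GetMap request URL.

        bbox is (min_lat, min_lon, max_lat, max_lon); in WMS 1.3.0 with
        EPSG:4326 the axis order is latitude first.
        """
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": layer,
            "CRS": crs,
            "BBOX": ",".join(str(v) for v in bbox),
            "WIDTH": width,
            "HEIGHT": height,
            "FORMAT": fmt,
        }
        return base_url + "?" + urlencode(params)

    # Hypothetical server and forecast layer, for illustration only.
    url = build_getmap_url("https://example.org/wms",
                           "temperature_850hPa",
                           (40.0, -10.0, 60.0, 20.0))
    print(url)
    ```

    A flight-planning client would issue many such requests, varying the bounding box and layer as the forecaster pans and switches fields.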

  19. Open Source Software Development

    DTIC Science & Technology

    2011-01-01

    Software, 2002, 149(1), 3-17. 3. DiBona, C., Cooper, D., and Stone, M. (Eds.), Open Sources 2.0, 2005, O'Reilly Media, Sebastopol, CA. Also see C. DiBona, S. Ockman, and M. Stone (Eds.), Open Sources: Voices from the Open Source Revolution, 1999, O'Reilly Media, Sebastopol, CA. 4. Ducheneaut, N

  20. The microbial quality of drinking water in Manonyane community: Maseru District (Lesotho).

    PubMed

    Gwimbi, P

    2011-09-01

    Provision of good quality household drinking water is an important means of improving public health in rural communities, especially in Africa; it is the rationale behind protecting drinking water sources and promoting healthy practices at and around such sources. To examine the microbial content of drinking water from different types of drinking water sources in the Manonyane community of Lesotho. The community's hygienic practices around the water sources are also assessed to establish their contribution to water quality. Water samples from thirty-five water sources comprising 22 springs, 6 open wells, 6 boreholes and 1 open reservoir were assessed. Total coliform and Escherichia coli bacteria were analyzed in the water samples. Results of the tests were compared with the prescribed World Health Organization desirable limits. A household survey and field observations were conducted to assess the hygienic conditions and practices at and around the water sources. Total coliform bacteria were detected in 97% and Escherichia coli in 71% of the water samples. The concentration levels of total coliform and Escherichia coli were above the permissible limits of the World Health Organization drinking water quality guidelines in each case. Protected sources had a significantly lower proportion of samples with colony forming units (cfu) per 100 ml than unprotected sources (56% versus 95%, p < 0.05). Similarly, for Escherichia coli, protected sources had lower counts (7% versus 40%, p < 0.05) than unprotected sources. Hygiene conditions and practices that appeared to contribute to increased total coliform and Escherichia coli counts included non-protection of water sources from livestock faeces, laundry practices, and water sources being down slope of pit latrines in some cases. These findings suggest that source water protection and good hygiene practices can improve the quality of household drinking water where disinfection is not available.
The results also suggest important lines of inquiry and provide support and input for environmental and public health programmes, particularly those related to water and sanitation.
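
    The protected-versus-unprotected comparisons reported above (e.g. 7% versus 40% E. coli positivity, p < 0.05) are the kind of result a two-proportion z-test supports; a minimal sketch with purely illustrative counts (the paper reports only percentages, so the sample sizes below are invented):

    ```python
    import math

    def two_proportion_z(x1, n1, x2, n2):
        """Two-proportion z-statistic using the pooled-proportion standard error."""
        p1, p2 = x1 / n1, x2 / n2
        p = (x1 + x2) / (n1 + n2)          # pooled proportion under H0: p1 == p2
        se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    # Illustrative counts only: e.g. 2 of 28 protected vs 8 of 20 unprotected
    # samples positive for E. coli (roughly the 7% vs 40% reported).
    z = two_proportion_z(2, 28, 8, 20)
    print(round(z, 2))  # → -2.76; |z| > 1.96 corresponds to p < 0.05 (two-sided)
    ```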

  1. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  2. Open-source hardware for medical devices.

    PubMed

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  3. The case for open-source software in drug discovery.

    PubMed

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  4. 48 CFR 6.303-2 - Content.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... full and open competition, such as: (i) Explanation of why technical data packages, specifications... COMPETITION REQUIREMENTS Other Than Full and Open Competition 6.303-2 Content. (a) Each justification shall... than full and open competition.” (2) Nature and/or description of the action being approved. (3) A...

  5. Choosing Open Source ERP Systems: What Reasons Are There For Doing So?

    NASA Astrophysics Data System (ADS)

    Johansson, Björn; Sudzina, Frantisek

    Enterprise resource planning (ERP) systems attract considerable attention, and so does open source software. The question is then whether, and if so when, open source ERP systems will take off. The paper describes the status of open source ERP systems. Based on a literature review of ERP system selection criteria drawn from Web of Science articles, it discusses reported reasons for choosing open source or proprietary ERP systems. Last but not least, the article presents some conclusions that could act as input for future research. The paper aims at building up a foundation for the basic question: what are the reasons for an organization to adopt open source ERP systems?

  6. Developing open-source codes for electromagnetic geophysics using industry support

    NASA Astrophysics Data System (ADS)

    Key, K.

    2017-12-01

    Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.

  7. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    ERIC Educational Resources Information Center

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  8. Implementing Open Source Platform for Education Quality Enhancement in Primary Education: Indonesia Experience

    ERIC Educational Resources Information Center

    Kisworo, Marsudi Wahyu

    2016-01-01

    Information and Communication Technology (ICT)-supported learning using free and open source platform draws little attention as open source initiatives were focused in secondary or tertiary educations. This study investigates possibilities of ICT-supported learning using open source platform for primary educations. The data of this study is taken…

  9. An Analysis of Open Source Security Software Products Downloads

    ERIC Educational Resources Information Center

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  10. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer rooms in colleges and universities. Building on this analysis, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can deploy a university computer-room cloud efficiently and conveniently, with stable performance and good functional value.

  11. CellProfiler Tracer: exploring and validating high-throughput, time-lapse microscopy image data.

    PubMed

    Bray, Mark-Anthony; Carpenter, Anne E

    2015-11-04

    Time-lapse analysis of cellular images is an important and growing need in biology. Algorithms for cell tracking are widely available; what researchers have been missing is a single open-source software package to visualize standard tracking output (from software like CellProfiler) in a way that allows convenient assessment of track quality, especially for researchers tuning tracking parameters for high-content time-lapse experiments. This makes quality assessment and algorithm adjustment a substantial challenge, particularly when dealing with hundreds of time-lapse movies collected in a high-throughput manner. We present CellProfiler Tracer, a free and open-source tool that complements the object tracking functionality of the CellProfiler biological image analysis package. Tracer allows multi-parametric morphological data to be visualized on object tracks, providing visualizations that have already been validated within the scientific community for time-lapse experiments, and combining them with simple graph-based measures for highlighting possible tracking artifacts. CellProfiler Tracer is a useful, free tool for inspection and quality control of object tracking data, available from http://www.cellprofiler.org/tracer/.

  12. CACTI: Free, Open-Source Software for the Sequential Coding of Behavioral Interactions

    PubMed Central

    Glynn, Lisa H.; Hallgren, Kevin A.; Houck, Jon M.; Moyers, Theresa B.

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery. PMID:22815713

  13. Archetype-based conversion of EHR content models: pilot experience with a regional EHR system.

    PubMed

    Chen, Rong; Klein, Gunnar O; Sundvall, Erik; Karlsson, Daniel; Ahlfeldt, Hans

    2009-07-01

    Exchange of Electronic Health Record (EHR) data between systems from different suppliers is a major challenge. EHR communication based on archetype methodology has been developed by openEHR and CEN/ISO. The experience of using archetypes in deployed EHR systems is quite limited today. Currently deployed EHR systems with large user bases have their own proprietary way of representing clinical content using various models. This study was designed to investigate the feasibility of representing EHR content models from a regional EHR system as openEHR archetypes and inversely to convert archetypes to the proprietary format. The openEHR EHR Reference Model (RM) and Archetype Model (AM) specifications were used. The template model of the Cambio COSMIC, a regional EHR product from Sweden, was analyzed and compared to the openEHR RM and AM. This study was focused on the convertibility of the EHR semantic models. A semantic mapping between the openEHR RM/AM and the COSMIC template model was produced and used as the basis for developing prototype software that performs automated bi-directional conversion between openEHR archetypes and COSMIC templates. Automated bi-directional conversion between openEHR archetype format and COSMIC template format has been achieved. Several archetypes from the openEHR Clinical Knowledge Repository have been imported into COSMIC, preserving most of the structural and terminology related constraints. COSMIC templates from a large regional installation were successfully converted into the openEHR archetype format. The conversion from the COSMIC templates into archetype format preserves nearly all structural and semantic definitions of the original content models. A strategy of gradually adding archetype support to legacy EHR systems was formulated in order to allow sharing of clinical content models defined using different formats. 
The openEHR RM and AM are expressive enough to represent the existing clinical content models from the template based EHR system tested and legacy content models can automatically be converted to archetype format for sharing of knowledge. With some limitations, internationally available archetypes could be converted to the legacy EHR models. Archetype support can be added to legacy EHR systems in an incremental way allowing a migration path to interoperability based on standards.
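
    The bi-directional template-to-archetype conversion described above can be pictured as a field-by-field semantic mapping with an inverse. The sketch below is purely hypothetical: the field names and dict structures are invented for illustration and are not the actual COSMIC template or openEHR archetype formats.

    ```python
    # Hypothetical correspondence between a proprietary template node and an
    # archetype-style constraint node; invented names, for illustration only.
    TEMPLATE_TO_ARCHETYPE = {
        "caption": "name",
        "datatype": "rm_type",
        "required": "occurrences_min",
    }

    def template_node_to_archetype(node):
        """Map a template dict to an archetype-style dict via the table above."""
        return {TEMPLATE_TO_ARCHETYPE[k]: v for k, v in node.items()
                if k in TEMPLATE_TO_ARCHETYPE}

    def archetype_node_to_template(node):
        """Inverse mapping, showing the conversion is bi-directional."""
        inverse = {v: k for k, v in TEMPLATE_TO_ARCHETYPE.items()}
        return {inverse[k]: v for k, v in node.items() if k in inverse}

    t = {"caption": "Blood pressure", "datatype": "DV_QUANTITY", "required": 1}
    a = template_node_to_archetype(t)
    assert archetype_node_to_template(a) == t   # lossless round-trip
    print(a)
    ```

    The study's finding that "nearly all structural and semantic definitions" survive conversion corresponds to such a mapping being close to lossless; fields with no counterpart on the other side are what limit it.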

  14. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  15. The 2017 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year’s theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest. PMID:29118973

  16. The 2017 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year's theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest.

  17. The Efficient Utilization of Open Source Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baty, Samuel R.

    These are a set of slides on the efficient utilization of open source information. Open source information consists of a vast set of information from a variety of sources. Not only does the quantity of open source information pose a problem, the quality of such information can hinder efforts. To show this, two case studies are mentioned: Iran and North Korea, in order to see how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information has no guarantee of accuracy. Open source information can provide key insights either directly or indirectly: looking at supporting factors (flow of scientists, products and waste from mines, government budgets, etc.); direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows for a more complete picture to be formed. Overlapping sources allow for more precise bounds on times, weights, temperatures, yields or other issues of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual, but rather requires the utilization of a wide range of skill sets held by a team of people.
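
    The point that overlapping sources yield tighter bounds can be sketched as interval intersection; the example below is a toy illustration, not taken from the slides:

    ```python
    def intersect_bounds(intervals):
        """Combine (low, high) bounds from independent sources by intersection.

        Each source constrains the true value to its own interval; if all the
        sources are accurate, the value lies in the overlap, which is never
        wider than the tightest single source.
        """
        lo = max(low for low, _ in intervals)
        hi = min(high for _, high in intervals)
        if lo > hi:
            raise ValueError("sources are mutually inconsistent")
        return lo, hi

    # Toy example: three open sources each bound a facility's annual output.
    bounds = [(100, 400), (150, 500), (120, 300)]
    print(intersect_bounds(bounds))  # → (150, 300)
    ```

    A raised ValueError is itself informative: mutually inconsistent bounds signal that at least one source is wrong, which is exactly the independent-verification point made above.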

  18. OER as Online Edutainment Resources: A Critical Look at Open Content, Branded Content, and How Both Affect the OER Movement

    ERIC Educational Resources Information Center

    Moe, Rolin

    2015-01-01

    Despite a rise in awareness and production of open education resources (OER) over the past decade, mainstream media outlets continue to define open in economic terms of consumer cost and not in theoretical terms of remix or appropriation. This period in the "open access" debate has coincided with a proliferation of free-of-charge video…

  19. Collaborative development of predictive toxicology applications

    PubMed Central

    2010-01-01

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. 
Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436

  20. Collaborative development of predictive toxicology applications.

    PubMed

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. 
The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.

  1. The 2015 Bioinformatics Open Source Conference (BOSC 2015).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica

    2016-02-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.

  2. Open Source, Openness, and Higher Education

    ERIC Educational Resources Information Center

    Wiley, David

    2006-01-01

    In this article David Wiley provides an overview of how the general expansion of open source software has affected the world of education in particular. In doing so, Wiley not only addresses the development of open source software applications for teachers and administrators, he also discusses how the fundamental philosophy of the open source…

  3. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…

  4. Estimation of black carbon content for biomass burning aerosols from multi-channel Raman lidar data

    NASA Astrophysics Data System (ADS)

    Talianu, Camelia; Marmureanu, Luminita; Nicolae, Doina

    2015-04-01

    Biomass burning due to natural processes (forest fires) or anthropogenic activities (agriculture, thermal power stations, domestic heating) is an important source of aerosols with a high content of carbon components (black carbon and organic carbon). Multi-channel Raman lidars provide information on the spectral dependence of the backscatter and extinction coefficients, embedding information on the black carbon content. Aerosols with a high content of black carbon have large extinction coefficients and small backscatter coefficients (strong absorption), while aerosols with a high content of organic carbon have large backscatter coefficients (weak absorption). This paper presents a method based on radiative calculations to estimate the black carbon content of biomass burning aerosols from 3β+2α+1δ lidar signals. Data are collected at Magurele, Romania, at the crossroads of air masses coming from Ukraine, Russia and Greece, where burning events are frequent during both cold and hot seasons. Aerosols are transported in the free troposphere, generally in the 2-4 km altitude range, and reach the lidar location after 2-3 days. Optical data were collected between 2011 and 2012 by a multi-channel Raman lidar and follow the quality assurance program of EARLINET. Radiative calculations are made with libRadtran, an open source radiative model developed by ESA. Validation of the retrievals is made by comparison to a co-located C-ToF Aerosol Mass Spectrometer. Keywords: Lidar, aerosols, biomass burning, radiative model, black carbon Acknowledgment: This work has been supported by grants of the Romanian National Authority for Scientific Research, Programme for Research - Space Technology and Advanced Research - STAR, project no. 39/2012 - SIAFIM, and by Romanian Partnerships in priority areas PNII implemented with MEN-UEFISCDI support, project no. 309/2014 - MOBBE
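    The contrast stated in this abstract (strongly absorbing, black-carbon-rich aerosols combine large extinction with small backscatter) can be illustrated with a toy extinction-to-backscatter ("lidar") ratio calculation. A minimal sketch follows; the function names, input values, and the 70 sr threshold are illustrative assumptions, not part of the published method.

```python
# Toy illustration: black-carbon-rich (strongly absorbing) aerosol layers show
# a high extinction-to-backscatter ("lidar") ratio, while organic-carbon
# dominated smoke shows a lower one. Threshold and inputs are illustrative.

def lidar_ratio(extinction_per_km, backscatter_per_km_sr):
    """Extinction-to-backscatter ratio in steradians (sr)."""
    if backscatter_per_km_sr <= 0:
        raise ValueError("backscatter must be positive")
    return extinction_per_km / backscatter_per_km_sr

def absorption_class(ratio_sr, threshold_sr=70.0):
    """Crude two-way classification of a biomass-burning layer."""
    if ratio_sr > threshold_sr:
        return "strongly absorbing (BC-rich)"
    return "weakly absorbing (OC-rich)"

# Hypothetical layer at 532 nm: extinction 0.15 km^-1, backscatter 0.0018 km^-1 sr^-1
r = lidar_ratio(0.15, 0.0018)
print(round(r, 1), absorption_class(r))  # 83.3 strongly absorbing (BC-rich)
```

    In the real retrieval the ratio is wavelength-dependent and is fed into radiative-transfer calculations rather than a single threshold.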

  5. Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on

    PubMed Central

    2011-01-01

    Background The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to development of many useful resources freely available to the chemistry community. PMID:21999342

  6. Open Genetic Code: on open source in the life sciences.

    PubMed

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question of whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions on the usage of DNA in any of its formats.

  7. Open Educational Practices and Resources. OLCOS Roadmap, 2012

    ERIC Educational Resources Information Center

    Geser, Guntram, Ed.

    2007-01-01

    As a Transversal Action under the European eLearning Programme, the Open e-Learning Content Observatory Services (OLCOS) project carries out a set of activities that aim at fostering the creation, sharing and re-use of Open Educational Resources (OER) in Europe and beyond. OER are understood to comprise content for teaching and learning,…

  8. The Open Source Teaching Project (OSTP): Research Note.

    ERIC Educational Resources Information Center

    Hirst, Tony

    The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…

  9. Free for All: Open Source Software

    ERIC Educational Resources Information Center

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  10. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  11. Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness

    ERIC Educational Resources Information Center

    Committee for Economic Development, 2006

    2006-01-01

    Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…

  12. The 2015 Bioinformatics Open Source Conference (BOSC 2015)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J. A.; Lapp, Hilmar

    2016-01-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included “Data Science;” “Standards and Interoperability;” “Open Science and Reproducibility;” “Translational Bioinformatics;” “Visualization;” and “Bioinformatics Open Source Project Updates”. In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled “Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community,” that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653

  13. Arctic Deep Water Ferromanganese-Oxide Deposits Reflect the Unique Characteristics of the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Hein, James R.; Konstantinova, Natalia; Mikesell, Mariah; Mizell, Kira; Fitzsimmons, Jessica N.; Lam, Phoebe J.; Jensen, Laramie T.; Xiang, Yang; Gartman, Amy; Cherkashov, Georgy; Hutchinson, Deborah R.; Till, Claire P.

    2017-11-01

    Little is known about marine mineral deposits in the Arctic Ocean, an ocean dominated by continental shelf and basins semi-closed to deep-water circulation. Here, we present data for ferromanganese crusts and nodules collected from the Amerasia Arctic Ocean in 2008, 2009, and 2012 (HLY0805, HLY0905, and HLY1202). We determined mineral and chemical compositions of the crusts and nodules and the onset of their formation. Water column samples from the GEOTRACES program were analyzed for dissolved and particulate scandium concentrations, an element uniquely enriched in these deposits. The Arctic crusts and nodules are characterized by unique mineral and chemical compositions with atypically high growth rates, detrital contents, Fe/Mn ratios, and low Si/Al ratios, compared to deposits found elsewhere. High detritus reflects erosion of submarine outcrops and North America and Siberia cratons, transport by rivers and glaciers to the sea, and distribution by sea ice, brines, and currents. Uniquely high Fe/Mn ratios are attributed to expansive continental shelves, where diagenetic cycling releases Fe to bottom waters, and density flows transport shelf bottom water to the open Arctic Ocean. Low Mn contents reflect the lack of a mid-water oxygen minimum zone that would act as a reservoir for dissolved Mn. The potential host phases and sources for elements with uniquely high contents are discussed with an emphasis on scandium. Scandium sorption onto Fe oxyhydroxides and Sc-rich detritus account for atypically high scandium contents. The opening of Fram Strait in the Miocene and ventilation of the deep basins initiated Fe-Mn crust growth ~15 Myr ago.

  14. Arctic deep-water ferromanganese-oxide deposits reflect the unique characteristics of the Arctic Ocean

    USGS Publications Warehouse

    Hein, James; Konstantinova, Natalia; Mikesell, Mariah; Mizell, Kira; Fitzsimmons, Jessica N.; Lam, Phoebe; Jensen, Laramie T.; Xiang, Yang; Gartman, Amy; Cherkashov, Georgy; Hutchinson, Deborah; Till, Claire P.

    2017-01-01

    Little is known about marine mineral deposits in the Arctic Ocean, an ocean dominated by continental shelf and basins semi-closed to deep-water circulation. Here, we present data for ferromanganese crusts and nodules collected from the Amerasia Arctic Ocean in 2008, 2009, and 2012 (HLY0805, HLY0905, HLY1202). We determined mineral and chemical compositions of the crusts and nodules and the onset of their formation. Water column samples from the GEOTRACES program were analyzed for dissolved and particulate scandium concentrations, an element uniquely enriched in these deposits. The Arctic crusts and nodules are characterized by unique mineral and chemical compositions with atypically high growth rates, detrital contents, Fe/Mn ratios, and low Si/Al ratios, compared to deposits found elsewhere. High detritus reflects erosion of submarine outcrops and North America and Siberia cratons, transport by rivers and glaciers to the sea, and distribution by sea ice, brines, and currents. Uniquely high Fe/Mn ratios are attributed to expansive continental shelves, where diagenetic cycling releases Fe to bottom waters, and density flows transport shelf bottom water to the open Arctic Ocean. Low Mn contents reflect the lack of a mid-water oxygen minimum zone that would act as a reservoir for dissolved Mn. The potential host phases and sources for elements with uniquely high contents are discussed with an emphasis on scandium. Scandium sorption onto Fe oxyhydroxides and Sc-rich detritus account for atypically high scandium contents. The opening of Fram Strait in the Miocene and ventilation of the deep basins initiated Fe-Mn crust growth ~15 Myr ago.

  15. Design of Open Content Social Learning Based on the Activities of Learner and Similar Learners

    ERIC Educational Resources Information Center

    John, Benneaser; Jayakumar, J.; Thavavel, V.; Arumugam, Muthukumar; Poornaselvan, K. J.

    2017-01-01

    Teaching and learning are increasingly taking advantage of the rapid growth in Internet resources, open content, mobile technologies and social media platforms. However, due to the generally unstructured nature and overwhelming quantity of learning content, effective learning remains challenging. In an effort to close this gap, the authors…

  16. The 2016 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.

  17. Beyond Open Source: According to Jim Hirsch, Open Technology, Not Open Source, Is the Wave of the Future

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    This article presents an interview with Jim Hirsch, an associate superintendent for technology at Plano Independent School District in Plano, Texas. Hirsch serves as a liaison for the open technologies committee of the Consortium for School Networking. In this interview, he shares his opinion on the significance of open source in K-12.

  18. Nutrient Dynamics of Estuarine Invertebrates Are Shaped by Feeding Guild Rather than Seasonal River Flow

    PubMed Central

    Ortega-Cisneros, Kelly; Scharler, Ursula M.

    2015-01-01

    This study aimed to determine the variability of carbon and nitrogen elemental content, stoichiometry and diet proportions of invertebrates in two sub-tropical estuaries in South Africa experiencing seasonal changes in rainfall and river inflow. The elemental ratios and stable isotopes of abiotic sources, zooplankton and macrozoobenthos taxa were analyzed over a dry/wet seasonal cycle. Nutrient content (C, N) and stoichiometry of suspended particulate matter exhibited significant spatio-temporal variations in both estuaries, which were explained by the variability in river inflow. Sediment particulate matter (%C, %N and C:N) was also influenced by the variability in river flow, but to a lesser extent. The nutrient content and ratios of the analyzed invertebrates did not vary significantly among seasons, with the exception of the copepod Pseudodiaptomus spp. (C:N) and the tanaid Apseudes digitalis (%N, C:N). These changes did not track the seasonal variations of the suspended or sediment particulate matter. Our results suggest that invertebrates managed to maintain their stoichiometry independent of the seasonality in river flow. A significant variability in nitrogen content among estuarine invertebrates was recorded, with the highest %N recorded from predators and the lowest %N from detritivores. Given the otherwise general lack of seasonal differences in elemental content and stoichiometry, feeding guild was a major factor shaping the nutrient dynamics of the estuarine invertebrates. The nutrient-richer suspended particulate matter was the preferred food source over sediment particulate matter for most invertebrate consumers in many, but not all, seasons. The most distinct preference for suspended POM as a food source was apparent from the temporarily open/closed system after the estuary had breached, highlighting the importance of river flow as a driver of invertebrate nutrient dynamics under extreme event conditions. Moreover, our data showed that estuarine invertebrates concentrated C and N 10- to 100-fold from trophic level I (POM) to trophic level II (detritivores/deposit feeders), thus highlighting their importance not only as links to higher trophic level organisms in the food web, but also as providers of a stoichiometrically homeostatic food source for such consumers. As climate change scenarios for the east coast of South Africa predict increased rainfall, in the form of both more rainy days and more days with heavy rainfall, our results suggest that future changes in rainfall and river inflow will have measurable effects on the nutrient content and stoichiometry of food sources, and possibly also of estuarine consumers. PMID:26352433

  19. Nutrient Dynamics of Estuarine Invertebrates Are Shaped by Feeding Guild Rather than Seasonal River Flow.

    PubMed

    Ortega-Cisneros, Kelly; Scharler, Ursula M

    2015-01-01

    This study aimed to determine the variability of carbon and nitrogen elemental content, stoichiometry and diet proportions of invertebrates in two sub-tropical estuaries in South Africa experiencing seasonal changes in rainfall and river inflow. The elemental ratios and stable isotopes of abiotic sources, zooplankton and macrozoobenthos taxa were analyzed over a dry/wet seasonal cycle. Nutrient content (C, N) and stoichiometry of suspended particulate matter exhibited significant spatio-temporal variations in both estuaries, which were explained by the variability in river inflow. Sediment particulate matter (%C, %N and C:N) was also influenced by the variability in river flow, but to a lesser extent. The nutrient content and ratios of the analyzed invertebrates did not vary significantly among seasons, with the exception of the copepod Pseudodiaptomus spp. (C:N) and the tanaid Apseudes digitalis (%N, C:N). These changes did not track the seasonal variations of the suspended or sediment particulate matter. Our results suggest that invertebrates managed to maintain their stoichiometry independent of the seasonality in river flow. A significant variability in nitrogen content among estuarine invertebrates was recorded, with the highest %N recorded from predators and the lowest %N from detritivores. Given the otherwise general lack of seasonal differences in elemental content and stoichiometry, feeding guild was a major factor shaping the nutrient dynamics of the estuarine invertebrates. The nutrient-richer suspended particulate matter was the preferred food source over sediment particulate matter for most invertebrate consumers in many, but not all, seasons. The most distinct preference for suspended POM as a food source was apparent from the temporarily open/closed system after the estuary had breached, highlighting the importance of river flow as a driver of invertebrate nutrient dynamics under extreme event conditions. Moreover, our data showed that estuarine invertebrates concentrated C and N 10- to 100-fold from trophic level I (POM) to trophic level II (detritivores/deposit feeders), thus highlighting their importance not only as links to higher trophic level organisms in the food web, but also as providers of a stoichiometrically homeostatic food source for such consumers. As climate change scenarios for the east coast of South Africa predict increased rainfall, in the form of both more rainy days and more days with heavy rainfall, our results suggest that future changes in rainfall and river inflow will have measurable effects on the nutrient content and stoichiometry of food sources, and possibly also of estuarine consumers.

  20. EMISSIONS OF ORGANIC AIR TOXICS FROM OPEN ...

    EPA Pesticide Factsheets

    A detailed literature search was performed to collect and collate available data reporting emissions of toxic organic substances into the air from open burning sources. Availability of data varied according to the source and the class of air toxics of interest. Volatile organic compound (VOC) and polycyclic aromatic hydrocarbon (PAH) data were available for many of the sources. Data on semivolatile organic compounds (SVOCs) that are not PAHs were available for several sources. Carbonyl and polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofuran (PCDD/F) data were available for only a few sources. There were several sources for which no emissions data were available at all. Several observations were made including: 1) Biomass open burning sources typically emitted less VOCs than open burning sources with anthropogenic fuels on a mass emitted per mass burned basis, particularly those where polymers were concerned; 2) Biomass open burning sources typically emitted less SVOCs and PAHs than anthropogenic sources on a mass emitted per mass burned basis. Burning pools of crude oil and diesel fuel produced significant amounts of PAHs relative to other types of open burning. PAH emissions were highest when combustion of polymers was taking place; and 3) Based on very limited data, biomass open burning sources typically produced higher levels of carbonyls than anthropogenic sources on a mass emitted per mass burned basis, probably due to oxygenated structures r

  1. CuboCube: Student creation of a cancer genetics e-textbook using open-access software for social learning.

    PubMed

    Seid-Karbasi, Puya; Ye, Xin C; Zhang, Allen W; Gladish, Nicole; Cheng, Suzanne Y S; Rothe, Katharina; Pilsworth, Jessica A; Kang, Min A; Doolittle, Natalie; Jiang, Xiaoyan; Stirling, Peter C; Wasserman, Wyeth W

    2017-03-01

    Student creation of educational materials has the capacity both to enhance learning and to decrease costs. Three successive honors-style classes of undergraduate students in a cancer genetics class worked with a new software system, CuboCube, to create an e-textbook. CuboCube is an open-source learning materials creation system designed to facilitate e-textbook development, with an ultimate goal of improving the social learning experience for students. Equipped with crowdsourcing capabilities, CuboCube provides intuitive tools for nontechnical and technical authors alike to create content together in a structured manner. The process of e-textbook development revealed both strengths and challenges of the approach, which can inform future efforts. Both the CuboCube platform and the Cancer Genetics E-textbook are freely available to the community.

  2. CuboCube: Student creation of a cancer genetics e-textbook using open-access software for social learning

    PubMed Central

    Seid-Karbasi, Puya; Ye, Xin C.; Zhang, Allen W.; Gladish, Nicole; Cheng, Suzanne Y. S.; Rothe, Katharina; Pilsworth, Jessica A.; Kang, Min A.; Doolittle, Natalie; Jiang, Xiaoyan; Stirling, Peter C.; Wasserman, Wyeth W.

    2017-01-01

    Student creation of educational materials has the capacity both to enhance learning and to decrease costs. Three successive honors-style classes of undergraduate students in a cancer genetics class worked with a new software system, CuboCube, to create an e-textbook. CuboCube is an open-source learning materials creation system designed to facilitate e-textbook development, with an ultimate goal of improving the social learning experience for students. Equipped with crowdsourcing capabilities, CuboCube provides intuitive tools for nontechnical and technical authors alike to create content together in a structured manner. The process of e-textbook development revealed both strengths and challenges of the approach, which can inform future efforts. Both the CuboCube platform and the Cancer Genetics E-textbook are freely available to the community. PMID:28267757

  3. Automated analysis of high-content microscopy data with deep learning.

    PubMed

    Kraus, Oren Z; Grys, Ben T; Ba, Jimmy; Chong, Yolanda; Frey, Brendan J; Boone, Charles; Andrews, Brenda J

    2017-04-18

    Existing computational pipelines for quantitative analysis of high-content microscopy data rely on traditional machine learning approaches that fail to accurately classify more than a single dataset without substantial tuning and training, requiring extensive analysis. Here, we demonstrate that the application of deep learning to biological image data can overcome the pitfalls associated with conventional machine learning classifiers. Using a deep convolutional neural network (DeepLoc) to analyze yeast cell images, we show improved performance over traditional approaches in the automated classification of protein subcellular localization. We also demonstrate the ability of DeepLoc to classify highly divergent image sets, including images of pheromone-arrested cells with abnormal cellular morphology, as well as images generated in different genetic backgrounds and in different laboratories. We offer an open-source implementation that enables updating DeepLoc on new microscopy datasets. This study highlights deep learning as an important tool for the expedited analysis of high-content microscopy data. © 2017 The Authors. Published under the terms of the CC BY 4.0 license.
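    As a minimal illustration of the operation at the heart of a convolutional network like the DeepLoc classifier described above, the sketch below applies a single 3x3 kernel (ML convention, no kernel flip) followed by ReLU to a tiny image in plain Python. It is a generic building block, not DeepLoc's actual architecture or code.

```python
# One convolution + ReLU step, the basic feature detector a deep
# convolutional classifier stacks many times. Pure Python, illustrative only.

def conv2d_valid(image, kernel):
    """'Valid' 2D cross-correlation (no padding) of a 2D list by a 3x3 kernel."""
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - 2):
        row = []
        for j in range(w - 2):
            acc = 0.0
            for di in range(3):
                for dj in range(3):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

def relu(feature_map):
    """Elementwise rectified linear activation."""
    return [[max(0.0, v) for v in row] for row in feature_map]

# A vertical-edge detector applied to a tiny image with a bright right half.
image = [[0, 0, 1, 1]] * 4
sobel_x = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
print(relu(conv2d_valid(image, sobel_x)))  # [[4.0, 4.0], [4.0, 4.0]]
```

    A trained network learns many such kernels per layer, then maps the resulting feature maps to class scores (e.g., subcellular compartments).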

  4. Advanced Cell Classifier: User-Friendly Machine-Learning-Based Software for Discovering Phenotypes in High-Content Imaging Data.

    PubMed

    Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter

    2017-06-28

    High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.
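    The annotation bottleneck ACC targets (efficiently finding informative examples to label) is commonly addressed by uncertainty sampling. The sketch below illustrates that generic idea in plain Python; it is not taken from ACC, and the example predictions are hypothetical.

```python
import math

# Generic uncertainty sampling: given per-cell class-probability vectors from
# any classifier, surface the cells the model is least sure about so a human
# annotates those first. Illustrative only.

def entropy(probs):
    """Shannon entropy (natural log) of one class-probability vector."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def most_uncertain(prob_vectors, k):
    """Indices of the k samples with the highest predictive entropy."""
    ranked = sorted(range(len(prob_vectors)),
                    key=lambda i: entropy(prob_vectors[i]),
                    reverse=True)
    return ranked[:k]

# Hypothetical per-cell predictions over three phenotype classes.
preds = [
    [0.98, 0.01, 0.01],  # confident: low priority for annotation
    [0.34, 0.33, 0.33],  # near-uniform: annotate first
    [0.60, 0.30, 0.10],
]
print(most_uncertain(preds, 1))  # [1]
```

    Iterating this loop (annotate, retrain, re-rank) is what makes rare phenotypes surface faster than random labeling.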

  5. Applying Content Management to Automated Provenance Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuchardt, Karen L.; Gibson, Tara D.; Stephan, Eric G.

    2008-04-10

    Workflows and data pipelines are becoming increasingly valuable in both computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time than their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. Our goal in addressing the provenance challenge was to develop an end-to-end system that demonstrates real-time capture, persistent content management, and ad-hoc searches of both provenance and metadata using open source software and standard protocols. We describe our prototype, which extends the Kepler workflow tools for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities, and through the use of standards, is able to provide access to the provenance record to a variety of commonly available client tools.
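    The capture pattern described above (recording what each workflow step consumed and produced as it runs) can be sketched generically with a Python decorator. All names below are illustrative and are not taken from Kepler or SAM; a real system would persist the record to a content management service rather than a list.

```python
import functools
import time

PROVENANCE = []  # in-memory provenance record; illustrative stand-in for a service

def capture_provenance(step):
    """Decorator: record inputs, output, and start time of a workflow step."""
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = step(*args, **kwargs)
        PROVENANCE.append({
            "step": step.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "started": start,
        })
        return result
    return wrapper

@capture_provenance
def normalize(values):
    """A toy pipeline step: scale values so they sum to 1."""
    total = sum(values)
    return [v / total for v in values]

normalize([2, 2, 4])
print([(p["step"], p["output"]) for p in PROVENANCE])  # [('normalize', [0.25, 0.25, 0.5])]
```

    Because capture happens inside the execution environment, every run is recorded without the step author doing anything beyond applying the decorator, which is the same transparency the prototype aims for at workflow-engine scale.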

  6. Behavior of suspended particles in the Changjiang Estuary: Size distribution and trace metal contamination.

    PubMed

    Yao, Qingzhen; Wang, Xiaojing; Jian, Huimin; Chen, Hongtao; Yu, Zhigang

    2016-02-15

    Suspended particulate matter (SPM) samples were collected along a salinity gradient in the Changjiang Estuary in June 2011. A custom-built water elutriation apparatus was used to separate the suspended sediments into five size fractions. The results indicated that Cr and Pb originated from natural weathering processes, whereas Cu, Zn, and Cd originated from other sources. The contents of most trace metals increased with decreasing particle size. The contents of Fe/Mn and organic matter were confirmed to play an important role in increasing heavy metal contents. The Cu, Pb, Zn, and Cd contents varied significantly with increasing salinity in the medium-to-low salinity region, indicating the release of Cu, Pb, Zn, and Cd from particles. Thus, the transfer of polluted fine particles into the open sea is probably accompanied by release of pollutants into the dissolved compartment, thereby amplifying the potential harmful effects on marine organisms. Copyright © 2015 Elsevier Ltd. All rights reserved.
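    For context on how crustal (weathering) sources such as the Cr and Pb here are distinguished from other inputs, geochemists often compute an Al-normalized enrichment factor. The sketch below is a generic illustration of that standard calculation with assumed reference and sample concentrations; it is not the authors' data or method.

```python
# Enrichment factor EF = (M/Al)_sample / (M/Al)_crust. EF near 1 points to a
# crustal (weathering) source; EF >> 1 points to additional, e.g.
# anthropogenic, inputs. All concentrations below are assumed, in mg/kg.

CRUST = {"Al": 80000.0, "Pb": 17.0, "Cd": 0.1}  # illustrative crustal averages

def enrichment_factor(metal, sample):
    """Al-normalized enrichment factor of `metal` in `sample` (mg/kg dicts)."""
    sample_ratio = sample[metal] / sample["Al"]
    crust_ratio = CRUST[metal] / CRUST["Al"]
    return sample_ratio / crust_ratio

sediment = {"Al": 60000.0, "Pb": 25.0, "Cd": 0.9}
print(round(enrichment_factor("Pb", sediment), 1))  # 2.0: near-crustal
print(round(enrichment_factor("Cd", sediment), 1))  # 12.0: clearly enriched
```

    Normalizing to a conservative element like Al removes grain-size and dilution effects, which matters here because metal contents rise as particle size falls.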

  7. Open-Ended (Extended/Constructed) Response Questions as Predictors of Success on Subsequent State Mathematics Examination: The Influence of Mathematical Awareness and Conceptual Knowledge

    ERIC Educational Resources Information Center

    Gullie, Kathy A.

    2011-01-01

    This study investigated the predictive ability of students' responses to open-ended, constructed/extended questions in third and fourth grade mathematics content subcategories on subsequent fifth grade mathematics achievement proficiency levels. Open-ended, extended/constructed response questions reflected content as outlined by the National…

  8. Open-Source 3D-Printable Optics Equipment

    PubMed Central

    Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated as a control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling easily adapted, customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods. PMID:23544104

  9. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated as a control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling easily adapted, customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods.
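The parametric approach described in this record can be illustrated without any CAD package: one driving dimension regenerates every dependent dimension. The function, default wall thickness, and clearance below are hypothetical and are not taken from the published design library.

```python
def lens_mount_dimensions(lens_diameter_mm, wall_mm=3.0, clearance_mm=0.2):
    """Derive printable mount dimensions from a single lens diameter.

    Illustrative of parametric design: change the one driving parameter
    and every dependent dimension is recomputed automatically.
    """
    bore = lens_diameter_mm + clearance_mm   # bore the optic drops into
    outer = bore + 2 * wall_mm               # outer diameter of the holder
    return {"bore_mm": bore, "outer_mm": outer}

dims = lens_mount_dimensions(25.4)  # e.g. a 1-inch optic
```

In a parametric CAD package the same idea applies to the full 3-D geometry, so the library can be customized for optics of any size before printing.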

  10. Archetype-based conversion of EHR content models: pilot experience with a regional EHR system

    PubMed Central

    2009-01-01

    Background Exchange of Electronic Health Record (EHR) data between systems from different suppliers is a major challenge. EHR communication based on archetype methodology has been developed by openEHR and CEN/ISO. The experience of using archetypes in deployed EHR systems is quite limited today. Currently deployed EHR systems with large user bases have their own proprietary way of representing clinical content using various models. This study was designed to investigate the feasibility of representing EHR content models from a regional EHR system as openEHR archetypes and inversely to convert archetypes to the proprietary format. Methods The openEHR EHR Reference Model (RM) and Archetype Model (AM) specifications were used. The template model of the Cambio COSMIC, a regional EHR product from Sweden, was analyzed and compared to the openEHR RM and AM. This study was focused on the convertibility of the EHR semantic models. A semantic mapping between the openEHR RM/AM and the COSMIC template model was produced and used as the basis for developing prototype software that performs automated bi-directional conversion between openEHR archetypes and COSMIC templates. Results Automated bi-directional conversion between openEHR archetype format and COSMIC template format has been achieved. Several archetypes from the openEHR Clinical Knowledge Repository have been imported into COSMIC, preserving most of the structural and terminology related constraints. COSMIC templates from a large regional installation were successfully converted into the openEHR archetype format. The conversion from the COSMIC templates into archetype format preserves nearly all structural and semantic definitions of the original content models. A strategy of gradually adding archetype support to legacy EHR systems was formulated in order to allow sharing of clinical content models defined using different formats. 
Conclusion The openEHR RM and AM are expressive enough to represent the existing clinical content models from the template based EHR system tested and legacy content models can automatically be converted to archetype format for sharing of knowledge. With some limitations, internationally available archetypes could be converted to the legacy EHR models. Archetype support can be added to legacy EHR systems in an incremental way allowing a migration path to interoperability based on standards. PMID:19570196
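The bi-directional conversion this record reports can be pictured as a mapping table applied in both directions, with a round-trip that preserves the original content model. The field names below are invented for illustration and are far simpler than the real openEHR RM/AM and COSMIC template structures.

```python
# Hypothetical field-name mapping between a proprietary template model
# and archetype node identifiers (illustrative only).
TEMPLATE_TO_ARCHETYPE = {
    "keyword": "at0001.name",
    "value_type": "at0001.rm_type",
    "unit": "at0001.units",
}
ARCHETYPE_TO_TEMPLATE = {v: k for k, v in TEMPLATE_TO_ARCHETYPE.items()}

def template_to_archetype(template):
    # Forward conversion: proprietary template fields -> archetype nodes.
    return {TEMPLATE_TO_ARCHETYPE[k]: v for k, v in template.items()}

def archetype_to_template(archetype):
    # Inverse conversion: archetype nodes -> proprietary template fields.
    return {ARCHETYPE_TO_TEMPLATE[k]: v for k, v in archetype.items()}

tpl = {"keyword": "body weight", "value_type": "DV_QUANTITY", "unit": "kg"}
round_tripped = archetype_to_template(template_to_archetype(tpl))
```

The study's key finding is that such a mapping, extended to full structural and terminology constraints, preserves nearly all semantic definitions in both directions.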

  11. Aerostat-Lofted Instrument Platform and Sampling Method for Determination of Emissions from Open Area Sources

    EPA Science Inventory

    Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...

  12. White Matter Fiber-based Analysis of T1w/T2w Ratio Map.

    PubMed

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.
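The two processing steps this record describes, voxel-wise ratio computation and along-tract sampling, can be sketched as follows. This is plain Python with hypothetical function names; the published tool operates on calibrated image volumes and a prior fiber atlas, not flat lists.

```python
def ratio_map(t1w, t2w, eps=1e-6):
    # Voxel-wise T1w/T2w ratio, guarding against division by zero.
    return [t1 / (t2 + eps) for t1, t2 in zip(t1w, t2w)]

def tract_profile_point(values, tract_voxel_indices):
    # Average the ratio over the voxels assigned to one point along a
    # fiber tract; repeating this along the tract yields the profile
    # used in the along-tract statistical analysis.
    samples = [values[i] for i in tract_voxel_indices]
    return sum(samples) / len(samples)

t1w = [120.0, 100.0, 80.0, 60.0]
t2w = [60.0, 50.0, 80.0, 120.0]
ratios = ratio_map(t1w, t2w)
profile_value = tract_profile_point(ratios, [0, 1])
```

Sampling per tract point, rather than per atlas region, is what lets the analysis separate fiber bundles that a regional analysis would group together.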

  13. White matter fiber-based analysis of T1w/T2w ratio map

    NASA Astrophysics Data System (ADS)

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D.; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    Purpose: To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. Background: The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. Methods: We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. Results: We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  14. Identifying elements of the plumbing system beneath Kilauea Volcano, Hawaii, from the source locations of very-long-period signals

    USGS Publications Warehouse

    Almendros, J.; Chouet, B.; Dawson, P.; Bond, T.

    2002-01-01

    We analyzed 16 seismic events recorded by the Hawaiian broad-band seismic network at Kilauea Volcano during the period September 9-26, 1999. Two distinct types of event are identified based on their spectral content, very-long-period (VLP) waveform, amplitude decay pattern and particle motion. We locate the VLP signals with a method based on analyses of semblance and particle motion. Different source regions are identified for the two event types. One source region is located at depths of ~1 km beneath the northeast edge of the Halemaumau pit crater. A second region is located at depths of ~8 km below the northwest quadrant of Kilauea caldera. Our study represents the first time that such deep sources have been identified in VLP data at Kilauea. This discovery opens the possibility of obtaining a detailed image of the location and geometry of the magma plumbing system beneath this volcano based on source locations and moment tensor inversions of VLP signals recorded by a permanent, large-aperture broad-band network.

  15. The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software

    PubMed Central

    Ackerman, Michael J.; Yoo, Terry S.

    2003-01-01

    From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278

  16. The 2016 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science. PMID:27781083

  17. OpenMx: An Open Source Extended Structural Equation Modeling Framework

    ERIC Educational Resources Information Center

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-01-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…

  18. Capitalizing on global demands for open data access and interoperability - the USGIN story

    NASA Astrophysics Data System (ADS)

    Richard, Stephen; Allison, Lee

    2016-04-01

    U.S. Geoscience Information Network (USGIN - http://usgin.org) data integration framework packages data so that it can be accessible through a broad array of open-source software and applications, including GeoServer, QGIS, GrassGIS, uDig, and gvSIG. USGIN data-sharing networks are designed to interact with other data exchange systems and have the ability to connect information on a granular level without jeopardizing data ownership. The system is compliant with international standards and protocols, scalable, extensible, and can be deployed throughout the world for a myriad of applications. Using GeoSciML as its data transfer standard and a collaborative approach to Content Model development and management, much of the architecture is publicly available through GitHub. Initially developed by the USGS and Association of American State Geologists as a distributed, self-maintained platform for sharing geoscience information, USGIN meets all the requirements of the White House Open Data Access Initiative that applies to (almost) all federally-funded research and all federally-maintained data, opening up huge opportunities for further deployment. In December 2015, the USGIN Content Model schema was recommended for adoption by the White House-led US Group on Earth Observations (USGEO) "Draft Common Framework for Earth-Observation Data" for all US earth observation (i.e., satellite) data. The largest USGIN node is the U.S. National Geothermal Data System (NGDS - www.geothermaldata.org). NGDS provides free open access to ~ 10 million data records, maps, and reports, sharing relevant geoscience and land use data to propel geothermal development and production in the U.S. NGDS currently serves information from hundreds of U.S. Department of Energy-sponsored projects and geologic data feeds from 60+ data providers in all 50 states, using free and open source software, in a federated system where data owners maintain control of their data. 
This interactive online system is opening new exploration opportunities and shortening project development by making data easily discoverable, accessible, and interoperable at no cost to users. USGIN Foundation, Inc. was established in 2014 as a not-for-profit company to deploy the USGIN data integration framework for other natural resource (energy, water, and minerals), natural hazards, and geoscience investigation applications, nationally and worldwide. The USGIN vision is that as each data node adds to its data repositories, the system-wide USGIN functions become increasingly valuable to it. The long-term goal is that the data network reach a 'tipping point' at which it becomes a data equivalent of the World Wide Web, where everyone maintains the function because it is expected by its clientele and fills critical needs.

  19. openBIS: a flexible framework for managing and analyzing complex data in biology research

    PubMed Central

    2011-01-01

    Background Modern data generation techniques used in distributed systems biology research projects often create datasets of enormous size and diversity. We argue that in order to overcome the challenge of managing those large quantitative datasets and maximise the biological information extracted from them, a sound information system is required. Ease of integration with data analysis pipelines and other computational tools is a key requirement for it. Results We have developed openBIS, an open source software framework for constructing user-friendly, scalable and powerful information systems for data and metadata acquired in biological experiments. openBIS enables users to collect, integrate, share, publish data and to connect to data processing pipelines. This framework can be extended and has been customized for different data types acquired by a range of technologies. Conclusions openBIS is currently being used by several SystemsX.ch and EU projects applying mass spectrometric measurements of metabolites and proteins, High Content Screening, or Next Generation Sequencing technologies. The attributes that make it interesting to a large research community involved in systems biology projects include versatility, simplicity in deployment, scalability to very large data, flexibility to handle any biological data type and extensibility to the needs of any research domain. PMID:22151573

  20. Portability and Usability of Open Educational Resources on Mobile Devices: A Study in the Context of Brazilian Educational Portals and Android-Based Devices

    ERIC Educational Resources Information Center

    da Silva, André Constantino; Freire, Fernanda Maria Pereira; Mourão, Vitor Hugo Miranda; da Cruz, Márcio Diógenes de Oliveira; da Rocha, Heloísa Vieira

    2014-01-01

    Open Educational Resources (OER) are freely accessible, openly licensed hypertext, audio, video, simulations, games and animations that are useful for teaching and learning purposes. In order to facilitate the location of such resources, educational content portals are being created, gathering contents produced by different teams with…

  1. The Case for Open Source: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…

  2. Temperature, productivity and sediment characteristics as drivers of seasonal and spatial variations of dissolved methane in the near-shore coastal areas (Belgian coastal zone, North Sea)

    NASA Astrophysics Data System (ADS)

    Borges, Alberto V.; Speeckaert, Gaëlle; Champenois, Willy; Scranton, Mary I.; Gypens, Nathalie

    2017-04-01

    The open ocean is a modest source of CH4 to the atmosphere compared to other natural and anthropogenic CH4 emissions. Coastal regions are more intense sources of CH4 to the atmosphere than open oceanic waters, in particular estuarine zones. The CH4 emission to the atmosphere from coastal areas is sustained by riverine inputs and methanogenesis in the sediments due to high organic matter (OM) deposition. Additionally, natural gas seeps are sources of CH4 to bottom waters, leading to high dissolved CH4 concentrations in bottom waters (from tenths of nmol L-1 up to several µmol L-1). We report a data set of dissolved CH4 concentrations obtained at nine fixed stations in the Belgian coastal zone (BCZ, Southern North Sea) during one yearly cycle, with a bi-monthly frequency in spring and a monthly frequency during the rest of the year. This is a coastal area with multiple possible sources of CH4, such as rivers and gassy sediments, and where intense phytoplankton blooms are dominated by the high dimethylsulfoniopropionate (DMSP) producing micro-algae Phaeocystis globosa, leading to high DMSP and dimethylsulfide (DMS) concentrations. Furthermore, the BCZ is a site of important OM sedimentation and accumulation, unlike the rest of the North Sea. Spatial variations of dissolved CH4 concentrations were very marked, with a minimum yearly average of 9 nmol L-1 at one of the most off-shore stations and a maximum yearly average of 139 nmol L-1 at one of the most near-shore stations. The spatial variations of dissolved CH4 concentrations were related to the OM content of the sediments, although the highest concentrations seemed also to be related to inputs of CH4 from gassy sediments associated with submerged peat. 
In the near-shore stations with fine sand or muddy sediments with a high OM content, the seasonal cycle of dissolved CH4 concentration closely followed the seasonal cycle of water temperature, suggesting the control of methanogenesis by temperature in these OM-replete sediments. In the off-shore stations with permeable sediments with a low OM content, the seasonal cycle of dissolved CH4 concentration showed a yearly peak following the chlorophyll-a spring peak. This suggests that in these OM-poor sediments, methanogenesis depended on the delivery to the sediments of freshly produced OM. In both types of sediments, the seasonal cycle of dissolved CH4 concentrations was unrelated to the seasonal cycles of DMS and DMSP, despite the fact that these quantities were very high during the spring Phaeocystis globosa bloom. This suggests that in this shallow coastal environment CH4 production is overwhelmingly related to benthic processes and unrelated to DMS(P) transformations in the water column, as recently suggested in several open ocean regions. The annual average CH4 emission was 41 mmol m-2 yr-1 at the most near-shore stations (~4 km from the coast) and 10 mmol m-2 yr-1 at the most off-shore stations (~23 km from the coast), 410 and 100 times higher, respectively, than the average value in the open ocean (0.1 mmol m-2 yr-1). The strong control of CH4 concentrations by sediment OM content and by temperature suggests that marine coastal CH4 emissions, in particular from shallow coastal areas, should respond in the future to eutrophication and climate warming. This is confirmed by the comparison of CH4 concentrations at five stations obtained in March of 1990 and 2016, showing a decreasing trend consistent with alleviation of eutrophication in the area.
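The enhancement factors quoted in this record follow directly from the reported fluxes; a quick arithmetic check:

```python
# Reported annual average CH4 emissions (mmol m-2 yr-1), from the abstract.
near_shore = 41.0
off_shore = 10.0
open_ocean = 0.1

near_shore_factor = near_shore / open_ocean  # factor vs. open ocean
off_shore_factor = off_shore / open_ocean    # factor vs. open ocean
```

This reproduces the stated 410-fold (near-shore) and 100-fold (off-shore) enhancements over the open-ocean average.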

  3. Involving Practicing Scientists in K-12 Science Teacher Professional Development

    NASA Astrophysics Data System (ADS)

    Bertram, K. B.

    2011-12-01

    The Science Teacher Education Program (STEP) offered a unique framework for creating professional development courses focused on Arctic research from 2006-2009. Under the STEP framework, science, technology, engineering, and math (STEM) training was delivered by teams of practicing Arctic researchers in partnership with master teachers with 20+ years of experience teaching STEM content in K-12 classrooms. Courses based on the framework were offered to educators across Alaska. STEP offered in-person summer-intensive institutes and follow-on audio-conferenced field-test courses during the academic year, supplemented by online scientist mentorship for teachers. During STEP courses, teams of scientists offered in-depth STEM content instruction at the graduate level for teachers of all grade levels. STEP graduate-level training culminated in the translation of information and data learned from Arctic scientists into standard-aligned lessons designed for immediate use in K-12 classrooms. This presentation will focus on research that explored the question: To what degree was scientist involvement beneficial to teacher training and to what degree was STEP scientist involvement beneficial to scientist instructors? Data sources reveal consistently high levels of ongoing (4-year) scientist and teacher participation; high STEM content learning outcomes for teachers; high STEM content learning outcomes for students; high ratings of STEP courses by scientists and teachers; and a discussion of the reasons scientists indicate they benefited from STEP involvement. Analyses of open-ended comments by teachers and scientists support and clarify these findings. A grounded theory approach was used to analyze teacher and scientist qualitative feedback. Comments were coded and patterns analyzed in three databases. 
The vast majority of teacher open-ended comments indicate that STEP involvement improved K-12 STEM classroom instruction, and the vast majority of scientist open-ended comments focus on the benefits scientists received from networking with K-12 teachers. The classroom lessons resulting from STEP have been so popular among teachers, the Alaska Department of Education and Early Development recently contracted with the PI to create a website that will make the STEP database open to teachers across Alaska. When the Alaska Department of Education and Early Development launched the new website in August 2011, the name of the STEP program was changed to the Alaska K-12 Science Curricular Initiative (AKSCI). The STEP courses serving as the foundation to the new AKSCI site are located under the "History" tab of the new website.

  4. Accessorizing Building Science – A Web Platform to Support Multiple Market Transformation Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madison, Michael C.; Antonopoulos, Chrissi A.; Dowson, Scott T.

    As demand for improved energy efficiency in homes increases, builders need information on the latest findings in building science, rapidly ramping-up energy codes, and technical requirements for labeling programs. The Building America Solution Center is a Department of Energy (DOE) website containing hundreds of expert guides designed to help residential builders install efficiency measures in new and existing homes. Builders can package measures with other media for customized content. Website content provides technical support to market transformation programs such as ENERGY STAR and has been cloned and adapted to provide content for the Better Buildings Residential Program. The Solution Center uses the Drupal open source content management platform to combine a variety of media in an interactive manner to make information easily accessible. Developers designed a unique taxonomy to organize and manage content. That taxonomy was translated into web-based modules that allow users to rapidly traverse structured content with related topics and media. We will present information on the current design of the Solution Center and the underlying technology used to manage the content. The paper will explore development of features, such as “Field Kits” that allow users to bundle and save content for quick access, along with the ability to export PDF versions of content. Finally, we will discuss development of an Android-based mobile application and a visualization tool for interacting with Building Science Publications that allows the user to dynamically search the entire Building America Library.

  5. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or as source code from the SolTrace open source project website. The code uses Monte-Carlo ray-tracing methodology. With the release of the SolTrace open source project, the software has adopted…

  6. User Expectations for Media Sharing Practices in Open Display Networks

    PubMed Central

    Jose, Rui; Cardoso, Jorge C. S.; Hong, Jason

    2015-01-01

    Open Display Networks have the potential to allow many content creators to publish their media to an open-ended set of screen displays. However, this raises the issue of how to match that content to the right displays. In this study, we aim to understand how the perceived utility of particular media sharing scenarios is affected by three independent variables, more specifically: (a) the locativeness of the content being shared; (b) how personal that content is; and (c) the scope in which it is being shared. To assess these effects, we composed a set of 24 media sharing scenarios embedded with different treatments of our three independent variables. We then asked 100 participants to express their perception of the relevance of those scenarios. The results suggest a clear preference for scenarios where content is both local and directly related to the person publishing it. This is in stark contrast to the types of content commonly found on public displays, and confirms that open display networks may represent a new medium for self-expression. This novel understanding may inform the design of new publication paradigms that will enable people to share media across open display networks. PMID:26153770

  7. When Free Isn't Free: The Realities of Running Open Source in School

    ERIC Educational Resources Information Center

    Derringer, Pam

    2009-01-01

    Despite the last few years' growth in awareness of open-source software in schools and the potential savings it represents, its widespread adoption is still hampered. Randy Orwin, technology director of the Bainbridge Island School District in Washington State and a strong open-source advocate, cautions that installing an open-source…

  8. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
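A document-oriented representation of a classification hierarchy, as this record describes, can be sketched with standard XML tooling. The element and attribute names below are illustrative of hierarchical classification markup, not the authors' schema or the CEN/TC 251 standard.

```python
import xml.etree.ElementTree as ET

# Build a tiny hierarchical fragment: chapter -> block -> category,
# with a language-tagged rubric (illustrative markup only).
chapter = ET.Element("Class", code="X", kind="chapter")
block = ET.SubElement(chapter, "Class", code="J00-J06", kind="block")
category = ET.SubElement(block, "Class", code="J00", kind="category")
rubric = ET.SubElement(category, "Rubric", lang="en")
rubric.text = "Acute nasopharyngitis [common cold]"

# Serialize; the nested structure carries both the hierarchy and the
# semantics (code, kind, language) needed by coding software.
xml_text = ET.tostring(chapter, encoding="unicode")
```

Because the hierarchy and semantics are explicit in the markup, the same document can drive coding software, cross-language versions, and topic-map links to related sources.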

  9. Transparent ICD and DRG Coding Using Information Technology: Linking and Associating Information Sources with the eXtensible Markup Language

    PubMed Central

    Hoelzer, Simon; Schweiger, Ralf K.; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or “semantically associated” parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach. PMID:12807813

  10. Neoproterozoic rift basins and their control on the development of hydrocarbon source rocks in the Tarim Basin, NW China

    NASA Astrophysics Data System (ADS)

    Zhu, Guang-You; Ren, Rong; Chen, Fei-Ran; Li, Ting-Ting; Chen, Yong-Quan

    2017-12-01

    The Proterozoic is demonstrated to be an important period for global petroleum systems. Few exploration breakthroughs, however, have been obtained on the system in the Tarim Basin, NW China. Outcrop, drilling, and seismic data are integrated in this paper to focus on the Neoproterozoic rift basins and related hydrocarbon source rocks in the Tarim Basin. The basin consists of Cryogenian to Ediacaran rifts showing N-S differentiation in their distribution. Compared to the Cryogenian basins, those of the Ediacaran are characterized by thin but widely distributed deposits. Thus, the rifts have a typical dual structure, namely the Cryogenian rifting and Ediacaran depression phases, which reveal distinct structural and sedimentary characteristics. The Cryogenian rifting basins are dominated by a series of grabens or half grabens, which have a wedge-shaped, rapidly filled structure. The basins evolved into Ediacaran depressions as the rifting and magmatic activity diminished and extensive overlapping sedimentation occurred. The distributions of the source rocks are controlled by the Neoproterozoic rifts as follows. The present outcrops lie mostly at the margins of the Cryogenian rifting basins, where rapid deposition dominates and the argillaceous rocks have low total organic carbon (TOC) contents; the source rocks with high TOC contents should instead develop in the centers of the basins. The Ediacaran source rocks formed in the deep-water environment of the stable depressions that evolved from the earlier rifting basins, and are thus more widespread in the Tarim Basin. Confirmation of the Cryogenian to Ediacaran source rocks would open up a new field for deep hydrocarbon exploration in the Tarim Basin.

  11. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
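
    A toy sketch of the kind of emulation layer such a compiler relies on (not OMPC's actual implementation): MATLAB® arrays are 1-based and slices are end-inclusive, so translated code can target a thin wrapper that preserves those semantics on top of a Python list.

```python
class MArray:
    """Minimal illustration of MATLAB-style array semantics in Python."""

    def __init__(self, data):
        self._data = list(data)

    def __call__(self, index):
        # MATLAB-style a(3) indexing: shift 1-based -> 0-based.
        return self._data[index - 1]

    def slice(self, start, stop):
        # MATLAB a(start:stop) is end-inclusive.
        return MArray(self._data[start - 1:stop])

    def tolist(self):
        return list(self._data)

if __name__ == "__main__":
    a = MArray([10, 20, 30, 40])
    print(a(1))                      # MATLAB a(1): first element
    print(a.slice(2, 3).tolist())    # MATLAB a(2:3)
```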

  12. Open Source Vision

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    Increasingly, colleges and universities are turning to open source as a way to meet their technology infrastructure and application needs. Open source has changed life for visionary CIOs and their campus communities nationwide. The author discusses what these technologists see as the benefits--and the considerations.

  13. Panorama: A Targeted Proteomics Knowledge Base

    PubMed Central

    2015-01-01

    Panorama is a web application for storing, sharing, analyzing, and reusing targeted assays created and refined with Skyline, an increasingly popular Windows client software tool for targeted proteomics experiments. Panorama allows laboratories to store and organize curated results contained in Skyline documents with fine-grained permissions, which facilitates distributed collaboration and secure sharing of published and unpublished data via a web-browser interface. It is fully integrated with the Skyline workflow and supports publishing a document directly to a Panorama server from the Skyline user interface. Panorama captures the complete Skyline document information content in a relational database schema. Curated results published to Panorama can be aggregated and exported as chromatogram libraries. These libraries can be used in Skyline to pick optimal targets in new experiments and to validate peak identification of target peptides. Panorama is open-source and freely available. It is distributed as part of LabKey Server, an open source biomedical research data management system. Laboratories and organizations can set up Panorama locally by downloading and installing the software on their own servers. They can also request freely hosted projects on https://panoramaweb.org, a Panorama server maintained by the Department of Genome Sciences at the University of Washington. PMID:25102069

  14. Web Monitoring of EOS Front-End Ground Operations, Science Downlinks and Level 0 Processing

    NASA Technical Reports Server (NTRS)

    Cordier, Guy R.; Wilkinson, Chris; McLemore, Bruce

    2008-01-01

    This paper addresses the efforts undertaken and the technology deployed to aggregate and distribute the metadata characterizing the real-time operations associated with NASA Earth Observing Systems (EOS) high-rate front-end systems and the science data collected at multiple ground stations and forwarded to the Goddard Space Flight Center for level 0 processing. Station operators, mission project management personnel, spacecraft flight operations personnel and data end-users for various EOS missions can retrieve the information at any time from any location having access to the internet. The users are distributed and the EOS systems are distributed but the centralized metadata accessed via an external web server provide an effective global and detailed view of the enterprise-wide events as they are happening. The data-driven architecture and the implementation of applied middleware technology, open source database, open source monitoring tools, and external web server converge nicely to fulfill the various needs of the enterprise. The timeliness and content of the information provided are key to making timely and correct decisions which reduce project risk and enhance overall customer satisfaction. The authors discuss security measures employed to limit access of data to authorized users only.

  15. Investigation of asphalt content design for open-graded bituminous mixes.

    DOT National Transportation Integrated Search

    1974-01-01

    Several design procedures associated with determining the proper asphalt content for open-graded bituminous mixes were investigated. Also considered was the proper amount of tack coat that should be placed on the old surface prior to paving operation...

  16. 76 FR 34634 - Federal Acquisition Regulation; Prioritizing Sources of Supplies and Services for Use by the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-14

    ... contracts before commercial sources in the open market. The proposed rule amends FAR 8.002 as follows: The... requirements for supplies and services from commercial sources in the open market. The proposed FAR 8.004 would... subpart 8.6). (b) Commercial sources (including educational and non-profit institutions) in the open...

  17. Primordial binary populations in low-density star clusters as seen by Chandra: globular clusters versus old open clusters

    NASA Astrophysics Data System (ADS)

    van den Berg, Maureen C.

    2015-08-01

    The binaries in the core of a star cluster are the energy source that prevents the cluster from experiencing core collapse. To model the dynamical evolution of a cluster, it is important to have constraints on the primordial binary content. X-ray observations of old star clusters are very efficient in detecting the close interacting binaries among the cluster members. The X-ray sources in star clusters are a mix of binaries that were dynamically formed and primordial binaries. In massive, dense star clusters, dynamical encounters play an important role in shaping the properties and numbers of the binaries. In contrast, in the low-density clusters the impact of dynamical encounters is presumed to be very small, and the close binaries detected in X-rays represent a primordial population. The lowest density globular clusters have current masses and central densities similar to those of the oldest open clusters in our Milky Way. I will discuss the results of studies with the Chandra X-ray Observatory that have nevertheless revealed a clear dichotomy: far fewer (if any at all) X-ray sources are detected in the central regions of the low-density globular clusters compared to the number of secure cluster members that have been detected in old open clusters (above a limiting X-ray luminosity of typically 4e30 erg/s). The low stellar encounter rates imply that dynamical destruction of binaries can be ignored at present, therefore an explanation must be sought elsewhere. I will discuss several factors that can shed light on the implied differences between the primordial close binary populations in the two types of star clusters.

  18. An integrated SNP mining and utilization (ISMU) pipeline for next generation sequencing data.

    PubMed

    Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A V S K; Varshney, Rajeev K

    2014-01-01

    Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources, and expertise, which is a daunting combination for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value, with flanking sequences, for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. The pipeline is very useful for the plant genetics and breeding community with no computational expertise to discover SNPs and utilize them in genomics, genetics and breeding studies. It has been parallelized to process huge next generation sequencing datasets, has been developed in the Java language, and is available at http://hpc.icrisat.cgiar.org/ISMU as standalone free software.
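
    The polymorphism information content (PIC) value mentioned above has a standard simplified form, PIC = 1 - Σ p_i², i.e. the expected-heterozygosity approximation (the full Botstein et al. formula subtracts an additional pairwise term). A minimal sketch with hypothetical allele frequencies:

```python
def pic(freqs):
    # Simplified PIC for a marker: 1 minus the sum of squared allele
    # frequencies (expected-heterozygosity form).
    assert abs(sum(freqs) - 1.0) < 1e-9, "allele frequencies must sum to 1"
    return 1.0 - sum(p * p for p in freqs)

if __name__ == "__main__":
    # A biallelic SNP with allele frequencies 0.5/0.5 gives the maximum
    # simplified PIC of 0.5; a rarer minor allele gives a lower value.
    print(round(pic([0.5, 0.5]), 3))
    print(round(pic([0.9, 0.1]), 3))
```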

  19. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    PubMed

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  20. Layout-aware text extraction from full-text PDF of scientific articles.

    PubMed

    Ramakrishnan, Cartic; Patnia, Abhishek; Hovy, Eduard; Burns, Gully Apc

    2012-05-28

    The Portable Document Format (PDF) is the most commonly used file format for online scientific publications. The absence of effective means to extract text from these PDF files in a layout-aware manner presents a significant challenge for developers of biomedical text mining or biocuration informatics systems that use published literature as an information source. In this paper we introduce the 'Layout-Aware PDF Text Extraction' (LA-PDFText) system to facilitate accurate extraction of text from PDF files of research articles for use in text mining applications. Our paper describes the construction and performance of an open source system that extracts text blocks from PDF-formatted full-text research articles and classifies them into logical units based on rules that characterize specific sections. The LA-PDFText system focuses only on the textual content of the research articles and is meant as a baseline for further experiments into more advanced extraction methods that handle multi-modal content, such as images and graphs. The system works in a three-stage process: (1) Detecting contiguous text blocks using spatial layout processing to locate and identify blocks of contiguous text, (2) Classifying text blocks into rhetorical categories using a rule-based method and (3) Stitching classified text blocks together in the correct order resulting in the extraction of text from section-wise grouped blocks. We show that our system can identify text blocks and classify them into rhetorical categories with Precision = 0.96, Recall = 0.89 and F1 = 0.91. We also present an evaluation of the accuracy of the block detection algorithm used in step 2. Additionally, we have compared the accuracy of the text extracted by LA-PDFText to the text from the Open Access subset of PubMed Central. We then compared this accuracy with that of the text extracted by the PDF2Text system, commonly used to extract text from PDF. Finally, we discuss preliminary error analysis for our system and identify further areas of improvement. LA-PDFText is an open-source tool for accurately extracting text from full-text scientific articles. The release of the system is available at http://code.google.com/p/lapdftext/.
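
    Stage 2's rule-based classification can be sketched, very loosely, as keyword rules mapping text blocks to rhetorical categories; the rules and categories below are illustrative only, not LA-PDFText's actual rule set.

```python
# Toy rule table: each rhetorical category is triggered by any of its
# keywords appearing in the block (checked in order, first match wins).
RULES = [
    ("methods",    ("method", "procedure", "protocol")),
    ("results",    ("result", "we found", "table")),
    ("references", ("doi", "et al", "vol.")),
]

def classify_block(text):
    lowered = text.lower()
    for category, keywords in RULES:
        if any(k in lowered for k in keywords):
            return category
    return "body"   # fall-through category for unmatched blocks

if __name__ == "__main__":
    print(classify_block("The procedure was repeated three times."))
    print(classify_block("Open access has reshaped publishing."))
```

    Real systems refine this with positional and typographic features (font size, page region, block order), which is what makes the extraction layout-aware rather than purely lexical.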

  1. Layout-aware text extraction from full-text PDF of scientific articles

    PubMed Central

    2012-01-01

    Background The Portable Document Format (PDF) is the most commonly used file format for online scientific publications. The absence of effective means to extract text from these PDF files in a layout-aware manner presents a significant challenge for developers of biomedical text mining or biocuration informatics systems that use published literature as an information source. In this paper we introduce the ‘Layout-Aware PDF Text Extraction’ (LA-PDFText) system to facilitate accurate extraction of text from PDF files of research articles for use in text mining applications. Results Our paper describes the construction and performance of an open source system that extracts text blocks from PDF-formatted full-text research articles and classifies them into logical units based on rules that characterize specific sections. The LA-PDFText system focuses only on the textual content of the research articles and is meant as a baseline for further experiments into more advanced extraction methods that handle multi-modal content, such as images and graphs. The system works in a three-stage process: (1) Detecting contiguous text blocks using spatial layout processing to locate and identify blocks of contiguous text, (2) Classifying text blocks into rhetorical categories using a rule-based method and (3) Stitching classified text blocks together in the correct order resulting in the extraction of text from section-wise grouped blocks. We show that our system can identify text blocks and classify them into rhetorical categories with Precision = 0.96, Recall = 0.89 and F1 = 0.91. We also present an evaluation of the accuracy of the block detection algorithm used in step 2. Additionally, we have compared the accuracy of the text extracted by LA-PDFText to the text from the Open Access subset of PubMed Central. We then compared this accuracy with that of the text extracted by the PDF2Text system, commonly used to extract text from PDF. Finally, we discuss preliminary error analysis for our system and identify further areas of improvement. Conclusions LA-PDFText is an open-source tool for accurately extracting text from full-text scientific articles. The release of the system is available at http://code.google.com/p/lapdftext/. PMID:22640904

  2. Using R to implement spatial analysis in open source environment

    NASA Astrophysics Data System (ADS)

    Shao, Yixi; Chen, Dong; Zhao, Bo

    2007-06-01

    R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics which provides a wide variety of statistical and graphical techniques and is highly extensible. In the Open Source environment it plays an important role in spatial analysis. Thus, implementing spatial analysis in the Open Source environment, which we call Open Source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the Open Source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the Open Source GIS environment, to evaluate the spatial correlation of land price and to estimate it by Kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other Open Source software cooperate in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an Open Source GIS environment. In closing, we point out that the spatial analysis packages in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the extensive packages, or to design normative packages, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.
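
    Kriging itself requires fitting a variogram model; as a simpler deterministic stand-in for the spatial interpolation of land price described above, the sketch below implements inverse-distance weighting (IDW) on hypothetical (x, y, price) samples.

```python
import math

def idw(samples, x, y, power=2.0):
    # Inverse-distance weighted estimate at (x, y): each sample
    # contributes its value weighted by 1 / distance^power.
    num = den = 0.0
    for sx, sy, value in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return value          # exact hit on a sample point
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

if __name__ == "__main__":
    # Two hypothetical land-price samples; the midpoint between them
    # gets equal weights, hence the average price.
    pts = [(0.0, 0.0, 100.0), (1.0, 0.0, 200.0)]
    print(idw(pts, 0.5, 0.0))
```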

  3. User Generated Spatial Content Sources for Land Use/Land Cover Validation Purposes: Suitability Analysis and Integration Model

    NASA Astrophysics Data System (ADS)

    Estima, Jacinto Paulo Simoes

    Traditional geographic information has been produced by mapping agencies and corporations, using highly skilled people as well as expensive precision equipment and procedures, in a very costly approach. The production of land use and land cover databases is just one example of such a traditional approach. On the other side, the amount of geographic information created and shared by citizens through the Web has been increasing exponentially during the last decade, resulting from the emergence and popularization of technologies such as Web 2.0, cloud computing, GPS, and smart phones, among others. Such a comprehensive amount of free geographic data may hold valuable information, opening great possibilities to improve significantly the production of land use and land cover databases. In this thesis we explored the feasibility of using geographic data from different user generated spatial content initiatives in the process of land use and land cover database production. Data from Panoramio, Flickr and OpenStreetMap were explored in terms of their spatial and temporal distribution, and their distribution over the different land use and land cover classes. We then proposed a conceptual model to integrate data from suitable user generated spatial content initiatives, based on dissimilarities identified among a comprehensive list of initiatives. Finally, we developed a prototype implementing the proposed integration model, which was then validated by using it to solve four identified use cases. We concluded that data from user generated spatial content initiatives has great value but should be integrated to increase its potential. The possibility of integrating data from such initiatives in an integration model was demonstrated, and the relevance of the integration model was shown for different use cases using the prototype.

  4. [The use of open source software in graphic anatomic reconstructions and in biomechanic simulations].

    PubMed

    Ciobanu, O

    2009-01-01

    The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open source software was used to prepare digitized 2D images of tissue sections and to create 3D reconstructions from the segmented structures. Finally, the 3D images were used in open source software to perform biomechanical simulations. This study demonstrates the applicability and feasibility of currently available open source software for 3D reconstruction and biomechanical simulation. The use of open source software may improve the efficiency of investments in imaging technologies and in CAD/CAM technologies for implant and prosthesis fabrication, which otherwise require expensive specialized software.

  5. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
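
    A WMS GetMap request of the kind described above is just an HTTP URL carrying standard OGC parameters; the sketch below assembles one in Python (the server address and layer name are placeholders, not endpoints from the article).

```python
from urllib.parse import urlencode

def getmap_url(base, layers, bbox, width=600, height=400):
    # Standard WMS 1.1.1 GetMap parameters: service identification,
    # requested layers, spatial reference, bounding box, image size/format.
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layers,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

if __name__ == "__main__":
    url = getmap_url("http://example.org/wms", "health_districts",
                     (-10.0, 35.0, 5.0, 45.0))
    print(url)
```

    Because every WMS-compliant server answers the same request shape, a Web map can stack layers fetched from multiple remote servers, which is exactly the multi-source capability the article highlights.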

  6. Rapid development of medical imaging tools with open-source libraries.

    PubMed

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  7. Open-Source RTOS Space Qualification: An RTEMS Case Study

    NASA Technical Reports Server (NTRS)

    Zemerick, Scott

    2017-01-01

    NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors, notably: (1) the diverse nature of RTOSs utilized across NASA; (2) the lack of a single NASA space-qualification criterion, of verification and validation (V&V) analysis, or of test beds; and (3) different RTOS heritages, specifically open-source RTOSs versus closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort for the open source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case study, can serve as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for NASA agency-wide usage. NASA's involvement in the space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.

  8. Learning from hackers: open-source clinical trials.

    PubMed

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-02

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  9. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    PubMed

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR software achieve better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
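
    The TOPSIS step of such a ranking can be sketched compactly: vector-normalize each criterion, apply weights, and score each alternative by its relative closeness to the ideal solution. The sketch below handles benefit criteria with equal weights by default; the two alternatives and their scores are hypothetical, not the study's EMR evaluation data.

```python
import math

def topsis(matrix, weights=None):
    # matrix: one row per alternative, one column per (benefit) criterion.
    n_crit = len(matrix[0])
    weights = weights or [1.0 / n_crit] * n_crit
    # 1. Vector-normalize each criterion column, then apply weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    # 2. Ideal and negative-ideal solutions (max/min per criterion).
    best = [max(col) for col in zip(*v)]
    worst = [min(col) for col in zip(*v)]
    # 3. Relative closeness to the ideal: d_worst / (d_best + d_worst).
    return [math.dist(row, worst) / (math.dist(row, best) + math.dist(row, worst))
            for row in v]

if __name__ == "__main__":
    # Two hypothetical EMR packages scored on three benefit criteria;
    # the first dominates on every criterion, so it ranks first.
    print(topsis([[8, 7, 9], [5, 6, 4]]))
```

    In the integrated approach, the AHP stage would supply the `weights` vector from pairwise criterion comparisons rather than assuming equal weights.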

  10. Influence of three different concentration techniques on evaporation rate, color and phenolics content of blueberry juice.

    PubMed

    Elik, Aysel; Yanık, Derya Koçak; Maskan, Medeni; Göğüş, Fahrettin

    2016-05-01

    The present study was undertaken to assess the effects of three different concentration processes (open-pan, rotary vacuum evaporator and microwave heating) on the evaporation rate, color and phenolics content of blueberry juice. A kinetic modeling study of the changes in soluble solids content (°Brix), color parameters and phenolics content during evaporation was also performed. The final juice concentration of 65 °Brix was achieved in 12, 15, 45 and 77 min for microwave heating at 250 W and 200 W, rotary vacuum and open-pan evaporation processes, respectively. Color changes associated with heat treatment were monitored using a Hunter colorimeter (L*, a* and b*). All Hunter color parameters decreased with time, and all of the studied concentration techniques caused color degradation. The severity of color loss was higher with the open-pan technique than with the others. Evaporation also affected the total phenolics content of blueberry juice. Total phenolics loss during concentration was highest with the open-pan technique (36.54 %) and lowest with microwave heating at 200 W (34.20 %). The use of the microwave technique could therefore be advantageous in the food industry, producing blueberry juice concentrate with better quality in a shorter operation time. A first-order kinetics model was applied to model the changes in soluble solids content; a zero-order kinetics model was used to model the changes in color parameters and phenolics content.
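
    The first-order kinetics model for soluble solids takes the form C(t) = C0 · exp(k·t), with k > 0 here since concentration rises during evaporation. A minimal sketch: the 65 °Brix endpoint and 12 min time mirror the abstract's microwave figure, but the initial 15 °Brix is a hypothetical value, not the study's measurement.

```python
import math

def rate_constant(c0, c1, t1):
    # Solve C1 = C0 * exp(k * t1) for the first-order rate constant k.
    return math.log(c1 / c0) / t1

def concentration(c0, k, t):
    # First-order model: concentration at time t.
    return c0 * math.exp(k * t)

if __name__ == "__main__":
    # Hypothetical 15 Brix feed concentrated to 65 Brix in 12 min.
    k = rate_constant(15.0, 65.0, 12.0)
    print(round(concentration(15.0, k, 6.0), 2))   # predicted mid-run Brix
```

    A zero-order model, by contrast, would be the straight line C(t) = C0 - k·t, which is why it suits the roughly linear decay of the color and phenolics measurements.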

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Because directly monitoring buildings is expensive, scientists also rely on a wide and disparate array of ancillary and open source information, including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, a loose collection of formal and informal techniques for combining data into viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge-learning framework that retains, in the final estimation, uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft² for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
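
    The kind of uncertainty-retaining Bayesian fusion this record describes can be illustrated with a conjugate normal update, combining an expert prior with a survey estimate. This is a hedged sketch of the general technique, not PDT's actual model; all numbers are hypothetical.

```python
# Precision-weighted fusion of an expert prior with survey data
# (conjugate normal update); a sketch of Bayesian harmonization,
# not the PDT implementation.
def normal_update(prior_mean, prior_var, obs_mean, obs_var):
    # Posterior precision is the sum of the precisions; the posterior
    # mean is the precision-weighted average of prior and observation.
    w_prior = 1.0 / prior_var
    w_obs = 1.0 / obs_var
    post_var = 1.0 / (w_prior + w_obs)
    post_mean = post_var * (w_prior * prior_mean + w_obs * obs_mean)
    return post_mean, post_var

# Hypothetical: expert judgment puts office occupancy near 2.0
# people/1000 ft^2 with high uncertainty; a survey estimates 2.6
# with lower uncertainty.
post_mean, post_var = normal_update(2.0, 1.0, 2.6, 0.25)
```

    The posterior both shifts toward the better-measured source and reports a variance smaller than either input, which is the sense in which uncertainty is "retained in the final estimation" rather than discarded.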

  12. Open Governance in Higher Education: Extending the Past to the Future

    ERIC Educational Resources Information Center

    Masson, Patrick

    2011-01-01

    Open educational resources, open content, open access, open research, open courseware--all of these open initiatives share, and benefit from, a vision of access and a collaborative framework that often result in improved outcomes. Many of these open initiatives have gained adoption within higher education and are now serving in mission-critical…

  13. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  14. Describing environmental public health data: implementing a descriptive metadata standard on the environmental public health tracking network.

    PubMed

    Patridge, Jeff; Namulanda, Gonza

    2008-01-01

    The Environmental Public Health Tracking (EPHT) Network provides an opportunity to bring together diverse environmental and health effects data by integrating local, state, and national databases of environmental hazards, environmental exposures, and health effects. To help users locate data on the EPHT Network, the network will utilize descriptive metadata that provide critical information as to the purpose, location, content, and source of these data. Since 2003, the Centers for Disease Control and Prevention's EPHT Metadata Subgroup has been working to initiate the creation and use of descriptive metadata. Efforts undertaken by the group include the adoption of a metadata standard, creation of an EPHT-specific metadata profile, development of an open-source metadata creation tool, and promotion of the creation of descriptive metadata by changing the perception of metadata in the public health culture.

  15. Lexington Children's Museum final report on EnergyQuest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    EnergyQuest is a museum-wide exhibit that familiarizes children and their families with energy sources, uses, and issues and with the impact of those issues on their lives. It was developed and built by Lexington Children's Museum with support from the US Department of Energy, Kentucky Utilities, and the Kentucky Coal Marketing and Export Council. EnergyQuest featured six hands-on exhibit stations in each of six museum galleries. Collectively, the exhibits examine the sources, uses and conservation of energy. Each EnergyQuest exhibit reflects the content of its gallery setting. During the first year after opening EnergyQuest, a series of 48 public educational programs on energy were conducted at the Museum as part of the Museum's ongoing schedule of demonstrations, performances, workshops and classes. In addition, teacher training was conducted.

  16. Chemistry Based on Renewable Raw Materials: Perspectives for a Sugar Cane-Based Biorefinery

    PubMed Central

    Villela Filho, Murillo; Araujo, Carlos; Bonfá, Alfredo; Porto, Weber

    2011-01-01

    Carbohydrates are nowadays a very competitive feedstock for the chemical industry because their availability is compatible with world-scale chemical production and their price, based on the carbon content, is comparable to that of petrochemicals. At the same time, demand is rising for biobased products. Brazilian sugar cane is a competitive feedstock source that is opening the door to a wide range of bio-based products. This essay begins with the importance of the feedstock for the chemical industry and discusses developments in sugar cane processing that lead to low cost feedstocks. Thus, sugar cane enables a new chemical industry, as it delivers a competitive raw material and a source of energy. As a result, sugar mills are being transformed into sustainable biorefineries that fully exploit the potential of sugar cane. PMID:21637329

  17. Chemistry based on renewable raw materials: perspectives for a sugar cane-based biorefinery.

    PubMed

    Villela Filho, Murillo; Araujo, Carlos; Bonfá, Alfredo; Porto, Weber

    2011-01-01

    Carbohydrates are nowadays a very competitive feedstock for the chemical industry because their availability is compatible with world-scale chemical production and their price, based on the carbon content, is comparable to that of petrochemicals. At the same time, demand is rising for biobased products. Brazilian sugar cane is a competitive feedstock source that is opening the door to a wide range of bio-based products. This essay begins with the importance of the feedstock for the chemical industry and discusses developments in sugar cane processing that lead to low cost feedstocks. Thus, sugar cane enables a new chemical industry, as it delivers a competitive raw material and a source of energy. As a result, sugar mills are being transformed into sustainable biorefineries that fully exploit the potential of sugar cane.

  18. The validity of open-source data when assessing jail suicides.

    PubMed

    Thomas, Amanda L; Scott, Jacqueline; Mellow, Jeff

    2018-05-09

    The Bureau of Justice Statistics' Deaths in Custody Reporting Program (DCRP) is the primary source for jail suicide research, though the data are restricted from general dissemination. This study is the first to examine whether jail suicide data obtained from publicly available sources can help inform our understanding of this serious public health problem. Of the 304 suicides reported through the DCRP in 2009, roughly 56 percent (N = 170) were identified through the open-source search protocol. Each of the sources was assessed based on how much information was collected on the incident and the types of variables available. A descriptive analysis was then conducted on the variables present in both data sources. The four variables present in each data source were: (1) demographic characteristics of the victim, (2) the location of occurrence within the facility, (3) the location of occurrence by state, and (4) the size of the facility. Findings demonstrate that the prevalence and correlates of jail suicides are extremely similar in both open-source and official data. Moreover, for almost every variable measured, open-source data captured as much information as official data did, if not more. Further, variables not found in official data were identified in the open-source database, allowing researchers a more nuanced understanding of the situational characteristics of the event. This research supports including open-source data in jail suicide research, as it illustrates how open-source data can provide additional information not found in official data. Such information is vital for suicide prevention, which may depend directly on the ability to manipulate environmental factors.

  19. Genes2WordCloud: a quick way to identify biological themes from gene lists and free text.

    PubMed

    Baroukh, Caroline; Jenkins, Sherry L; Dannenfelser, Ruth; Ma'ayan, Avi

    2011-10-13

    Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications.
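
    The frequency-based term weighting this record describes (extract the most relevant terms, weight by word frequency) can be sketched in a few lines. The stopword list and sample text below are illustrative, not drawn from Genes2WordCloud itself, which is implemented in Java/Processing/PHP rather than Python.

```python
# Sketch of frequency-based term weighting for a word-cloud:
# tokenize, drop stopwords, count, and keep the top terms.
import re
from collections import Counter

STOPWORDS = {"the", "a", "of", "and", "in", "to", "for", "is", "are"}

def term_weights(text, top_n=5):
    # Lowercase alphabetic tokens only; weights are raw frequencies.
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

weights = term_weights(
    "Gene expression of the kinase gene and kinase pathway in gene lists")
```

    A renderer would then map each weight to a font size, which is the step WordCram performs for Genes2WordCloud.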

  20. Genes2WordCloud: a quick way to identify biological themes from gene lists and free text

    PubMed Central

    2011-01-01

    Background Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Results Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Methods Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Conclusions Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications. PMID:21995939

  1. DynAOI: a tool for matching eye-movement data with dynamic areas of interest in animations and movies.

    PubMed

    Papenmeier, Frank; Huff, Markus

    2010-02-01

    Analyzing gaze behavior with dynamic stimulus material is of growing importance in experimental psychology; however, there is still a lack of efficient analysis tools that are able to handle dynamically changing areas of interest. In this article, we present DynAOI, an open-source tool that allows for the definition of dynamic areas of interest. It works automatically with animations that are based on virtual three-dimensional models. When one is working with videos of real-world scenes, a three-dimensional model of the relevant content needs to be created first. The recorded eye-movement data are matched with the static and dynamic objects in the model underlying the video content, thus creating static and dynamic areas of interest. A validation study asking participants to track particular objects demonstrated that DynAOI is an efficient tool for handling dynamic areas of interest.
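
    The core matching step this record describes (assigning each eye-movement sample to whichever dynamic area of interest contains it at that moment) can be sketched as a per-sample proximity test. The trajectory format, radius, and object names below are hypothetical, not DynAOI's actual data model.

```python
# Sketch of gaze-to-dynamic-AOI matching: a sample hits an AOI if it
# falls within a radius of the object's position at that timestamp.
def match_gaze(gaze_samples, trajectories, radius=50.0):
    # gaze_samples: list of (t, x, y) tuples
    # trajectories: {object_name: {t: (x, y)}} positions over time
    hits = []
    for t, gx, gy in gaze_samples:
        label = None
        for name, traj in trajectories.items():
            if t in traj:
                ox, oy = traj[t]
                # Squared distance avoids a sqrt per sample.
                if (gx - ox) ** 2 + (gy - oy) ** 2 <= radius ** 2:
                    label = name
                    break
        hits.append((t, label))
    return hits

# Hypothetical moving object and two gaze samples
trajectories = {"ball": {0: (100, 100), 1: (150, 120)}}
hits = match_gaze([(0, 105, 98), (1, 300, 300)], trajectories)
```

    In DynAOI the object positions come from a three-dimensional model rather than hand-entered coordinates, but the matching logic reduces to this kind of time-indexed containment test.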

  2. E-learning: controlling costs and increasing value.

    PubMed

    Walsh, Kieran

    2015-04-01

    E-learning now accounts for a substantial proportion of medical education provision. This progress has required significant investment, and that investment has in turn come under increasing scrutiny so that the costs of e-learning may be controlled and its returns maximised. This short paper reviews some of the methods most likely to be effective at saving costs without compromising quality. These include accessing free or low-cost resources from elsewhere; creating short learning resources that will work on multiple devices; using open source platforms to host content; using in-house faculty to create content; sharing resources between institutions; and promoting resources to ensure high usage. Whatever methods are used to control costs or increase value, it is most important to evaluate their impact.

  3. Consistent criticality and radiation studies of Swiss spent nuclear fuel: The CS2M approach.

    PubMed

    Rochman, D; Vasiliev, A; Ferroukhi, H; Pecchia, M

    2018-06-15

    In this paper, a new method is proposed to calculate canister loading curves and radiation sources systematically and simultaneously, based on inventory information from an in-core fuel management system. As a demonstration, the isotopic contents of the assemblies come from a Swiss PWR, considering more than 6000 cases from 34 reactor cycles. The CS2M approach consists of combining four codes: CASMO and SIMULATE to extract the assembly characteristics (based on validated models), the SNF code for source emission, and MCNP for criticality calculations for specific canister loadings. The considered cases cover enrichments from 1.9 to 5.0% for the UO2 assemblies and 4.8% for the MOX, with assembly burnup values from 7 to 74 MWd/kgU. Because such a study is based on individual fuel assembly histories, it opens the possibility of optimizing canister loadings from the point of view of criticality, decay heat and emission sources. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  5. Open Source 2010: Reflections on 2007

    ERIC Educational Resources Information Center

    Wheeler, Brad

    2007-01-01

    Colleges and universities and commercial firms have demonstrated great progress in realizing the vision proffered for "Open Source 2007," and 2010 will mark even greater progress. Although much work remains in refining open source for higher education applications, the signals are now clear: the collaborative development of software can provide…

  6. Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments

    ERIC Educational Resources Information Center

    Wang, Shuo; Wang, Jing; Gao, Yanjing

    2017-01-01

    An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…

  7. Integrating an Automatic Judge into an Open Source LMS

    ERIC Educational Resources Information Center

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. although it was originally designed for programming competitions, Mooshak has also…

  8. 76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...

  9. The open-source movement: an introduction for forestry professionals

    Treesearch

    Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove

    2005-01-01

    In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....

  10. Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.

    ERIC Educational Resources Information Center

    Newby, Gregory B.; Greenberg, Jane; Jones, Paul

    2003-01-01

    Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)
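
    Lotka's Law, which this record applies to open-source authorship, has a compact classic form: the number of authors making n contributions is proportional to 1/n². A sketch with an illustrative author count (the exponent 2 is the classic value; bibliometric studies typically fit it to the data):

```python
# Lotka's Law in its inverse-power form: authors with exactly n
# contributions, relative to those with exactly one.
def lotka_expected(single_contribution_authors, n, exponent=2.0):
    return single_contribution_authors / n ** exponent

# Illustrative: if 100 developers made exactly one contribution...
expected_two = lotka_expected(100, 2)    # ...about 25 made two
expected_ten = lotka_expected(100, 10)   # ...about 1 made ten
```

    The heavy skew this predicts (a few prolific contributors, a long tail of one-off authors) is the "bibliometric pattern" the study reports finding in open-source development.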

  11. Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat.

    PubMed

    Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart

    2015-04-21

    Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has gained attention in recent years, after proving more effective than conventional DBS in clinical control of pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation.
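
    The closed-loop logic this record describes (stimulate only when the input biomarker crosses a threshold) can be sketched as a simple per-sample controller. The threshold and power values are hypothetical, and a real Arduino implementation would additionally band-filter the LFP and compute theta power online.

```python
# Hedged sketch of threshold-based closed-loop control: emit a
# stimulation on/off decision for each biomarker sample.
def closed_loop(theta_power_trace, threshold):
    return [power > threshold for power in theta_power_trace]

trace = [0.2, 0.9, 1.4, 1.1, 0.3]      # hypothetical theta power samples
stim = closed_loop(trace, threshold=1.0)
duty_cycle = sum(stim) / len(stim)     # fraction of samples stimulating
```

    The duty cycle is the quantity behind the paper's headline result: the closed-loop controller reached similar behavioral effects while stimulating only about 56% as much as continuous open-loop DBS.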

  12. Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat

    PubMed Central

    Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart

    2015-01-01

    Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has gained attention in recent years, after proving more effective than conventional DBS in clinical control of pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation. PMID:25897892

  13. Open Source and ROI: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…

  14. Free-access open-source e-learning in comprehensive neurosurgery skills training.

    PubMed

    Jotwani, Payal; Srivastav, Vinkle; Tripathi, Manjul; Deo, Rama Chandra; Baby, Britty; Damodaran, Natesan; Singh, Ramandeep; Suri, Ashish; Bettag, Martin; Roy, Tara Sankar; Busert, Christoph; Mehlitz, Marcus; Lalwani, Sanjeev; Garg, Kanwaljeet; Paul, Kolin; Prasad, Sanjiva; Banerjee, Subhashis; Kalra, Prem; Kumar, Subodh; Sharma, Bhavani Shankar; Mahapatra, Ashok Kumar

    2014-01-01

    Since the end of the last century, technology has taken a front seat in the dispersion of medical education. Advances in neurosurgical technology and traditional training methods are now being challenged by legal and ethical concerns about patient safety, resident work-hour restrictions and the cost of operating-room time. To supplement the existing pattern of neurosurgery education, various e-learning platforms have been introduced as structured, interactive learning systems. This study focuses on the concept, formulation, development and impact of web-based learning platforms dedicated to the neurosurgery discipline to disseminate education, supplement surgical knowledge and improve the skills of neurosurgeons. The 'Neurosurgery Education and Training School (NETS)' e-learning platform integrates web-based technologies such as a 'Content Management System' for organizing the education material and a 'Learning Management System' for updating neurosurgeons. The NETS discussion forum networks neurosurgeons, neuroscientists and neuro-technologists across the globe, facilitating collaborative translational research. Multi-authored neurosurgical e-learning material supplements the deficiencies of regular time-bound education. The interactive, open-source, global, free-access e-learning platform of NETS has (1) around 425 visitors/month from 73 countries, with a ratio of new to returning visitors of 42.3:57.7; (2) 64,380 views from 190 subscribers for surgical videos, 3-D animation and graphics-based training modules; and (3) an average of 402 views per post. E-learning platforms provide updated educational content that makes them "quick, surf, find and extract" resources. E-learning tools such as web-based education, a social interactive platform and a question-answer forum will save neurosurgeons seeking knowledge unnecessary expenditure of time and travel. The need for free-access platforms is most pronounced for neurosurgeons and patients in developing nations.

  15. Reactome graph database: Efficient access to complex pathway data

    PubMed Central

    Korninger, Florian; Viteri, Guilherme; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D’Eustachio, Peter

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types. PMID:29377902
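
    The efficiency argument in this record (traversing highly interconnected data is pointer-chasing in a graph store, versus repeated self-joins in a relational one) can be illustrated with a plain adjacency-list traversal. The node names and edge semantics below are hypothetical, not Reactome's actual schema or its Neo4j/Cypher interface.

```python
# Illustrative graph traversal: collect every event reachable from a
# pathway via containment edges, in a single pass over adjacency lists.
def traverse(graph, start):
    # Iterative depth-first search; each edge is followed once, which is
    # the access pattern a graph database optimizes for.
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return seen

# Hypothetical pathway hierarchy
graph = {
    "Signaling": ["Cascade-A", "Cascade-B"],
    "Cascade-A": ["Reaction-1"],
}
events = traverse(graph, "Signaling")
```

    A relational store would answer the same question with a recursive or repeated self-join over an edge table, which is where the query-time gap the abstract reports (a 93% reduction in average query time after the move to Neo4j) comes from.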

  16. Cafe Variome: general-purpose software for making genotype-phenotype data discoverable in restricted or open access contexts.

    PubMed

    Lancaster, Owen; Beck, Tim; Atlan, David; Swertz, Morris; Thangavelu, Dhiwagaran; Veal, Colin; Dalgleish, Raymond; Brookes, Anthony J

    2015-10-01

    Biomedical data sharing is desirable, but problematic. Data "discovery" approaches, which establish the existence rather than the substance of data, precisely connect data owners with data seekers, and thereby promote data sharing. Cafe Variome (http://www.cafevariome.org) was therefore designed to provide a general-purpose, Web-based data discovery tool that can be quickly installed by any genotype-phenotype data owner, or network of data owners, to make safe or sensitive content appropriately discoverable. Data fields or content of any type can be accommodated, from simple ID and label fields through to extensive genotype and phenotype details based on ontologies. The system provides a "shop window" in front of data, its main interfaces being a simple search box and a powerful "query-builder" that enables very elaborate queries to be formulated. After a successful search, counts of records are reported grouped by "openAccess" (data may be directly accessed), "linkedAccess" (a source link is provided), and "restrictedAccess" (facilitated data requests and subsequent provision of approved records). An administrator interface provides a wide range of options for system configuration, enabling highly customized single-site or federated networks to be established. Current uses include rare disease data discovery, patient matchmaking, and a Beacon Web service. © 2015 WILEY PERIODICALS, INC.
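
    The post-search reporting step this record describes (counting matched records grouped by access category) is easy to sketch. The record structure below is a hypothetical illustration, not Cafe Variome's actual data model; the three category labels are taken from the abstract.

```python
# Sketch of grouping search results by access level, using the three
# categories named in the abstract.
from collections import Counter

def group_by_access(records):
    # Report only counts per category, never the record substance:
    # this is the "discovery rather than disclosure" pattern.
    return Counter(r["access"] for r in records)

records = [
    {"id": 1, "access": "openAccess"},
    {"id": 2, "access": "linkedAccess"},
    {"id": 3, "access": "restrictedAccess"},
    {"id": 4, "access": "openAccess"},
]
counts = group_by_access(records)
```

    Reporting counts per category, rather than the records themselves, is what lets sensitive content be discoverable without being disclosed.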

  17. Reactome graph database: Efficient access to complex pathway data.

    PubMed

    Fabregat, Antonio; Korninger, Florian; Viteri, Guilherme; Sidiropoulos, Konstantinos; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D'Eustachio, Peter; Hermjakob, Henning

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.
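    The Cypher access path described above can be sketched as a small query builder. The `Pathway` and `Event` labels and the `hasEvent` relationship reflect the Reactome graph model as described, but the exact schema names should be treated as illustrative rather than guaranteed.

```python
def pathway_events_query(st_id):
    """Build a Cypher query walking the event hierarchy below a pathway.

    The `*` quantifier traverses hasEvent relationships transitively: the
    kind of highly interconnected traversal that is slow over a relational
    store but efficient in Neo4j.
    """
    return (
        f'MATCH (p:Pathway {{stId: "{st_id}"}})-[:hasEvent*]->(e:Event) '
        "RETURN e.stId, e.displayName"
    )

# Illustrative Reactome stable identifier
query = pathway_events_query("R-HSA-1640170")
```

With the official neo4j Python driver such a string would be passed to `session.run(query)`; the ContentService exposes equivalent queries over REST for clients that do not talk to the database directly.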

  18. DOD Open Government

    Science.gov Websites

    DOD Open Government, U.S. Department of Defense. Open Government @ DoD: Transparency, Congressional Inquiries, Cooperation, Regulatory Program, Initiatives, FRD Declassification, DARPA Open Catalog, Contact Us (2016)

  19. Open source drug discovery--a new paradigm of collaborative research in tuberculosis drug development.

    PubMed

    Bhardwaj, Anshu; Scaria, Vinod; Raghava, Gajendra Pal Singh; Lynn, Andrew Michael; Chandra, Nagasuma; Banerjee, Sulagna; Raghunandanan, Muthukurussi V; Pandey, Vikas; Taneja, Bhupesh; Yadav, Jyoti; Dash, Debasis; Bhattacharya, Jaijit; Misra, Amit; Kumar, Anil; Ramachandran, Srinivasan; Thomas, Zakir; Brahmachari, Samir K

    2011-09-01

    It is being realized that the traditional closed-door, market-driven approaches to drug discovery may not be the best-suited model for diseases of the developing world such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for these patients, it is necessary to formulate an alternate paradigm for the drug discovery process. The current model, constrained by limits on collaboration and on confidential sharing of resources, hampers the opportunities for bringing in expertise from diverse fields. These limitations hinder the possibilities of lowering the cost of drug discovery. The Open Source Drug Discovery project, initiated by the Council of Scientific and Industrial Research, India, has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open sharing, multi-faceted approaches, and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives for the continued pursuit of new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trials in a non-exclusive manner, with participation by multiple companies and majority funding from Open Source Drug Discovery. This will ensure the availability of drugs through a lower-cost, community-driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully, what Linux and the World Wide Web have done for information technology, Open Source Drug Discovery will do for drug discovery. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. State-of-the-practice and lessons learned on implementing open data and open source policies.

    DOT National Transportation Integrated Search

    2012-05-01

    This report describes the current government, academic, and private sector practices associated with open data and open source application development. These practices are identified; and the potential uses with the ITS Programs Data Capture and M...

  1. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  2. All-source Information Management and Integration for Improved Collective Intelligence Production

    DTIC Science & Technology

    2011-06-01

    Intelligence (ELINT) • Open Source Intelligence (OSINT) • Technical Intelligence (TECHINT) These intelligence disciplines produce... intelligence, measurement and signature intelligence, signals intelligence, and open-source data, in the production of intelligence. All-source intelligence... All-Source Information Integration and Management) R&D Project 3 All-Source Intelligence

  3. Science on Drupal: An evaluation of CMS Technologies

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Gonzalez, A.; Pinto, A.; Pascuzzi, F.; Gerard, A.

    2011-12-01

    We conducted an extensive evaluation of various Content Management System (CMS) technologies for implementing different websites supporting interdisciplinary science data and information. We chose two products, Drupal and Bluenog/Hippo CMS, to meet our specific needs and requirements. Drupal is an open source product that is quick and easy to set up and use. It is a very mature, stable, and widely used product. It has rich functionality supported by a large and active user base and developer community. There are many plugins available that provide additional features for managing citations, map galleries, semantic search, digital repositories (Fedora), scientific workflows, collaborative authoring, social networking, and other functions. All of these work very well within the Drupal framework if minimal customization is needed. We have successfully implemented Drupal for multiple projects such as: 1) the Haiti Regeneration Initiative (http://haitiregeneration.org/); 2) the Consortium on Climate Risk in the Urban Northeast (http://beta.ccrun.org/); and 3) the Africa Soils Information Service (http://africasoils.net/). We are also developing two other websites, the Côte Sud Initiative (CSI) and Emerging Infectious Diseases, using Drupal. We are testing the Drupal multi-site install for managing different websites with one install to streamline maintenance. In addition, paid support and consultancy for Drupal website development are available at affordable prices. All of these features make Drupal very attractive for implementing state-of-the-art scientific websites that do not have complex requirements. One of our major websites, the NASA Socioeconomic Data and Applications Center (SEDAC), has a very complex set of requirements. It has to easily re-purpose content across multiple web pages and sites with different presentations. It has to serve the content via REST or similar standard interfaces so that external client applications can access content in the CMS repository. This means the content repository and structure should be completely separated from the content presentation and site structure. In addition to the CMS repository, the front-end website has to be able to consume, integrate, and display diverse content flexibly from multiple back-end systems, including custom and legacy systems, such as Oracle, Geoserver, Flickr, Fedora, and other web services. We needed the ability to customize the workflow to author, edit, approve, and publish content based on different content types and project requirements. In addition, we required the ability to use the existing active directory for user management, with support for roles, groups, and permissions using the Access Control List (ACL) model. The ability to version and lock content was also important. We determined that most of these capabilities are difficult to implement with Drupal and would need significant customization. The Bluenog eCMS (enterprise CMS) product satisfied most of these requirements. Bluenog eCMS is based on an open source product called Hippo, with customizations and support provided by the vendor Bluenog. Our newly redesigned and recently released SEDAC website, http://sedac.ciesin.columbia.edu, is implemented using Bluenog eCMS. Other products we evaluated include WebLogic Portal, Magnolia, Liferay Portal, and Alfresco.
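    The separation of repository and presentation described here implies that an external client only ever sees structured content over REST. A minimal sketch of the client-side step follows; the payload shape and field names are hypothetical, not Bluenog/Hippo's actual API.

```python
import json

# Hypothetical JSON payload a CMS repository might return over REST;
# presentation details (site structure, templates) are deliberately absent.
payload = json.dumps({
    "id": "doc-42",
    "type": "dataset-description",
    "fields": {"title": "SEDAC dataset", "body": "Gridded population data."},
})

def extract_content(raw):
    """Pull presentation-independent fields from a repository response."""
    doc = json.loads(raw)
    return doc["type"], doc["fields"]["title"]

doc_type, title = extract_content(payload)
```

Because the client consumes only typed fields, the same repository content can be re-purposed across multiple pages and sites with different presentations.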

  4. Open source EMR software: profiling, insights and hands-on analysis.

    PubMed

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    The use of open source software in health informatics is increasingly advocated by authors in the literature. Although there is no clear evidence of the superiority of current open source applications in the healthcare field, the number of open source applications available online is growing and they are gaining greater prominence. This repertoire of open source options is of great value for any future planner interested in adopting an electronic medical/health record system, whether selecting an existing application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects. The objective of this study is to provide more comprehensive guidance from an implementer perspective toward the available alternatives of open source healthcare software, particularly in the field of electronic medical/health records. The design of this study is twofold. In the first part, we profile the published literature on a sample of existing and active open source software in the healthcare area. The purpose of this part is to provide a summary of the available guides and studies relative to the sampled systems, and to identify any gaps in the published literature with respect to our research questions. In the second part, we investigate those alternative systems relative to a set of metrics, by actually installing the software and reporting a hands-on experience of the installation process, usability, as well as other factors. The literature covers many aspects of open source software implementation and utilization in healthcare practice. Roughly, those aspects can be distilled into a basic taxonomy, making the literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit potential implementers of electronic health/medical record systems. The number of examined systems and the measures by which to compare them vary across studies, but rewarding insights are starting to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field still lag far behind the highly acknowledged open source products in other domains, e.g. in operating system market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  5. Getting Open Source Software into Schools: Strategies and Challenges

    ERIC Educational Resources Information Center

    Hepburn, Gary; Buley, Jan

    2006-01-01

    In this article Gary Hepburn and Jan Buley outline different approaches to implementing open source software (OSS) in schools; they also address the challenges that open source advocates should anticipate as they try to convince educational leaders to adopt OSS. With regard to OSS implementation, they note that schools have a flexible range of…

  6. Open Source Library Management Systems: A Multidimensional Evaluation

    ERIC Educational Resources Information Center

    Balnaves, Edmund

    2008-01-01

    Open source library management systems have improved steadily in the last five years. They now present a credible option for small to medium libraries and library networks. An approach to their evaluation is proposed that takes account of three additional dimensions that only open source can offer: the developer and support community, the source…

  7. Open Source as Appropriate Technology for Global Education

    ERIC Educational Resources Information Center

    Carmichael, Patrick; Honour, Leslie

    2002-01-01

    Economic arguments for the adoption of "open source" software in business have been widely discussed. In this paper we draw on personal experience in the UK, South Africa and Southeast Asia to forward compelling reasons why open source software should be considered as an appropriate and affordable alternative to the currently prevailing…

  8. Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software

    ERIC Educational Resources Information Center

    Hemphill, Thomas A.

    2005-01-01

    This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…

  9. Open Source Communities in Technical Writing: Local Exigence, Global Extensibility

    ERIC Educational Resources Information Center

    Conner, Trey; Gresham, Morgan; McCracken, Jill

    2011-01-01

    By offering open-source software (OSS)-based networks as an affordable technology alternative, we partnered with a nonprofit community organization. In this article, we narrate the client-based experiences of this partnership, highlighting the ways in which OSS and open-source culture (OSC) transformed our students' and our own expectations of…

  10. Personal Electronic Devices and the ISR Data Explosion: The Impact of Cyber Cameras on the Intelligence Community

    DTIC Science & Technology

    2015-06-01

    ground.aspx?p=1 Texas Tech Security Group, “Automated Open Source Intelligence (OSINT) Using APIs.” RaiderSec, Sunday 30 December 2012, http... Open Source Intelligence (OSINT) Using APIs,” RaiderSec, Sunday 30 December 2012, http://raidersec.blogspot.com/2012/12/automated-open-source

  11. Open-Source Unionism: New Workers, New Strategies

    ERIC Educational Resources Information Center

    Schmid, Julie M.

    2004-01-01

    In "Open-Source Unionism: Beyond Exclusive Collective Bargaining," published in fall 2002 in the journal Working USA, labor scholars Richard B. Freeman and Joel Rogers use the term "open-source unionism" to describe a form of unionization that uses Web technology to organize in hard-to-unionize workplaces. Rather than depend on the traditional…

  12. Perceptions of Open Source versus Commercial Software: Is Higher Education Still on the Fence?

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2007-01-01

    This exploratory study investigated the perceptions of technology and academic decision-makers about open source benefits and risks versus commercial software applications. The study also explored reactions to a concept for outsourcing campus-wide deployment and maintenance of open source. Data collected from telephone interviews were analyzed,…

  13. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    ERIC Educational Resources Information Center

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  14. Open-Source Learning Management Systems: A Predictive Model for Higher Education

    ERIC Educational Resources Information Center

    van Rooij, S. Williams

    2012-01-01

    The present study investigated the role of pedagogical, technical, and institutional profile factors in an institution of higher education's decision to select an open-source learning management system (LMS). Drawing on the results of previous research that measured patterns of deployment of open-source software (OSS) in US higher education and…

  15. An Embedded Systems Course for Engineering Students Using Open-Source Platforms in Wireless Scenarios

    ERIC Educational Resources Information Center

    Rodriguez-Sanchez, M. C.; Torrado-Carvajal, Angel; Vaquero, Joaquin; Borromeo, Susana; Hernandez-Tamames, Juan A.

    2016-01-01

    This paper presents a case study analyzing the advantages and disadvantages of using project-based learning (PBL) combined with collaborative learning (CL) and industry best practices, integrated with information communication technologies, open-source software, and open-source hardware tools, in a specialized microcontroller and embedded systems…

  16. Technology collaboration by means of an open source government

    NASA Astrophysics Data System (ADS)

    Berardi, Steven M.

    2009-05-01

    The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.

  17. 77 FR 8818 - Publication of FY 2011 Service Contract Inventory

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-15

    ... the inventory on the Bureau's Open Government homepage at the following link: http://www.consumerfinance.gov/open/ , specifically at http://www.consumerfinance.gov/wp-content/uploads/2012/01/Appendix-C-FY2011-Inventory-Data-Summary.pdf and http://www.consumerfinance.gov/wp-content/uploads/2012/01/Appendix...

  18. Essays in Information Economics

    ERIC Educational Resources Information Center

    Chiao, Hak Fung

    2010-01-01

    I study two economic responses to the challenges of copyright infringements and spam brought about by the birth of the Internet. These responses are anti-spam mechanisms and open contents. I derive conditions under which distribution and care level taken to avoid damages in open contents are socially efficient or inefficient. Then I report…

  19. [Learning strategies of autonomous medical students].

    PubMed

    Márquez U, Carolina; Fasce H, Eduardo; Ortega B, Javiera; Bustamante D, Carolina; Pérez V, Cristhian; Ibáñez G, Pilar; Ortiz M, Liliana; Espinoza P, Camila; Bastías V, Nancy

    2015-12-01

    Understanding how autonomous students are capable of regulating their own learning process is essential to develop self-directed teaching methods. To understand how self-directed medical students approach learning in medical schools at the University of Concepción, Chile. A qualitative, descriptive study was performed according to Grounded Theory guidelines, following Strauss & Corbin. Twenty medical students were selected by the maximum variation sampling method. Data were collected through semi-structured thematic interviews. Students were interviewed by researchers after an informed consent procedure. Data were analyzed by the open coding method using Atlas-ti 7.5.2 software. Self-directed learners were characterized by being good planners and managing their time correctly. Students performed a diligent selection of contents to study based on reliable literature sources, theoretical relevance and type of evaluation. They also emphasized the discussion of clinical cases, where theoretical contents can be applied. This modality allows them to gain a global view of theoretical contents, to verbalize knowledge and to obtain learning feedback. The learning process of autonomous students is intentional and planned.

  20. Climate and root proximity as dominant drivers of enzyme activity and C and N isotopic signature in soil

    NASA Astrophysics Data System (ADS)

    Stock, Svenja; Köster, Moritz; Dippold, Michaela; Boy, Jens; Matus, Francisco; Merino, Carolina; Nájera, Francisco; Spielvogel, Sandra; Gorbushina, Anna; Kuzyakov, Yakov

    2017-04-01

    The Chilean ecosystems provide a unique study area to investigate biotic controls on soil organic matter (SOM) decomposition and mineral weathering depending on climate (from hyper-arid to temperate humid). Microorganisms play a crucial role in SOM decomposition, nutrient release and cycling. By means of extracellular enzymes, microorganisms break down organic compounds and provide nutrients for plants. Soil moisture (an abiotic factor) and root carbon (a biotic factor providing an easily available energy source for microorganisms) are important factors for microbial decomposition of SOM and show strong gradients along the investigated climatic gradient. A high input of root carbon increases microbial activity and enzyme production, and facilitates SOM breakdown and nutrient release. The aim of this study was to determine the potential enzymatic SOM decomposition and nutrient release depending on root proximity and precipitation. C and N contents, δ13C and δ15N values, and kinetics (Vmax, Km) of six extracellular enzymes, responsible for the C, N, and P cycles, were quantified in vertical (soil depth) and horizontal (from roots to bulk soil) gradients in two climatic regions: within a humid temperate forest and a semiarid open forest. The greater productivity of the temperate forest was reflected by higher C and N contents compared to the semiarid forest. Regression lines between δ13C and -[ln(%C)] showed a stronger isotopic fractionation from top- to subsoil at the semiarid open forest, indicating a faster SOM turnover compared to the humid temperate forest. This is the result of more favorable soil conditions (esp. temperature and smaller C/N ratios) in the semiarid forest. Depth trends of δ15N values indicated N limitation in both soils, though the limitation at the temperate site was stronger. The activity of enzymes degrading cellulose and hemicellulose increased with C content. Activity of enzymes involved in the C, N and P cycles decreased from top- to subsoil and with distance to roots. Chitinase and acid phosphatase activities increased with increasing C contents and indicated a faster substrate turnover in soil under the temperate forest compared to the semiarid forest. In contrast, tyrosine-aminopeptidase activities indicated a faster substrate turnover under the semiarid forest than the temperate forest, and strongly increased with increasing N content. We conclude that N availability and SOM turnover under the semiarid open forest are higher than under the humid temperate forest. The enzyme activities depend on depth only indirectly and are driven mainly by soil C content, which is directly affected by root carbon input.
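    The enzyme kinetic parameters quantified in this study (Vmax, Km) belong to the Michaelis-Menten rate law v = Vmax·[S] / (Km + [S]). A minimal numeric sketch with illustrative parameter values (not measurements from this study):

```python
def michaelis_menten(s, vmax, km):
    """Reaction rate v at substrate concentration s (Michaelis-Menten)."""
    return vmax * s / (km + s)

# Illustrative values only, not data from the Chilean sites.
vmax, km = 10.0, 2.0

# At s == Km the rate is exactly half of Vmax, by definition of Km.
half = michaelis_menten(km, vmax, km)
# At high substrate concentration the rate saturates toward Vmax.
near_max = michaelis_menten(1e6, vmax, km)
```

Fitting Vmax and Km per enzyme and sampling point is what allows comparing potential decomposition rates along the depth and root-distance gradients.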

  1. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also aiming to provide rich web applications for ease of access, built on modern web technologies and open source software. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS). As the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data. The main purpose of this application is public outreach. NASA World Wind Java SDK was used for its development. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations. It uses Highcharts to draw graphs on web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS, licensed under the BSD 3-Clause License. The SPICE Toolkit is essential to compile Flow. The SPICE Toolkit is also open source software, developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.
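    The KADIAS map display rests on standard WMS GetMap requests issued by OpenLayers to MapServer. A sketch of constructing such a request follows; the base URL and layer name are placeholders, not the real DARTS endpoint.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(512, 512)):
    """Build a WMS 1.1.1 GetMap request URL.

    bbox is (min_lon, min_lat, max_lon, max_lat); parameters follow the
    OGC WMS 1.1.1 specification that OpenLayers and MapServer both speak.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{base}?{urlencode(params)}"

# Placeholder endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.org/wms", "kaguya_dem", (-180, -90, 180, 90))
```

Because GetMap is a plain HTTP GET, any client (a browser, OpenLayers, or a script) can fetch the same rendered tile, which is what makes the WMS/MapServer stack easy to integrate.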

  2. Embracing Open Source for NASA's Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin

    2017-01-01

    The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when it results from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate and analyze them. This talk focuses on the challenges of open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies, running the gamut from tracking issues and properly documenting build processes to engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions and our contributions back to the community. Finally, we will introduce the most recent OSS contributions from NASA's Earth Science program and promote these projects for wider community review and adoption.

  3. Open source Modeling and optimization tools for Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, S.

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  4. Limitations of Phased Array Beamforming in Open Rotor Noise Source Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Csaba; Envia, Edmane; Podboy, Gary G.

    2013-01-01

    Phased array beamforming results of the F31/A31 historical baseline counter-rotating open rotor blade set were investigated for measurement data taken on the NASA Counter-Rotating Open Rotor Propulsion Rig in the 9- by 15-Foot Low-Speed Wind Tunnel of NASA Glenn Research Center as well as data produced using the LINPROP open rotor tone noise code. The planar microphone array was positioned broadside and parallel to the axis of the open rotor, roughly 2.3 rotor diameters away. The results provide insight as to why the apparent noise sources of the blade passing frequency tones and interaction tones appear at their nominal Mach radii instead of at the actual noise sources, even if those locations are not on the blades. Contour maps corresponding to the sound fields produced by the radiating sound waves, taken from the simulations, are used to illustrate how the interaction patterns of circumferential spinning modes of rotating coherent noise sources interact with the phased array, often giving misleading results, as the apparent sources do not always show where the actual noise sources are located. This suggests that a more sophisticated source model would be required to accurately locate the sources of each tone. The results of this study also have implications with regard to the shielding of open rotor sources by airframe empennages.
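    The conventional processing behind such a planar microphone array can be illustrated with a one-dimensional delay-and-sum sketch: steering delays align the channels for a hypothesized far-field direction, so the steered output power peaks near the true source angle. The array geometry, frequency, and sound speed below are illustrative, not the wind-tunnel rig's actual parameters.

```python
import math

C = 343.0  # speed of sound in air [m/s]; illustrative value

def delay_and_sum(mic_positions, signals, theta, fs):
    """Conventional (delay-and-sum) beamformer steered to far-field angle theta.

    signals[i] is the sampled time series at microphone i; delays are rounded
    to whole samples, the simplest form of phased-array steering. Indexing is
    circular, which is harmless here because the test signals are periodic.
    """
    n = min(len(s) for s in signals)
    out = [0.0] * n
    for x, sig in zip(mic_positions, signals):
        d = round(x * math.sin(theta) / C * fs)  # steering delay in samples
        for t in range(n):
            out[t] += sig[(t + d) % len(sig)]
    return [v / len(signals) for v in out]

# Synthesize a plane wave arriving from theta0 at an 8-mic line array.
fs, f = 48000, 2000.0
mics = [i * 0.05 for i in range(8)]          # 5 cm spacing
theta0 = math.radians(30)                    # true source direction
sigs = [[math.sin(2 * math.pi * f * (t / fs - x * math.sin(theta0) / C))
         for t in range(256)] for x in mics]

def power(y):
    return sum(v * v for v in y) / len(y)

on_target = power(delay_and_sum(mics, sigs, theta0, fs))
off_target = power(delay_and_sum(mics, sigs, math.radians(-30), fs))
```

For a single incoherent source this peak sits at the source location; the abstract's point is that rotating, circumferentially coherent mode patterns violate that assumption, so the apparent peak migrates to the nominal Mach radius instead.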

  5. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    A direct geo-referencing system uses remote sensing technology to rapidly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to calculate position properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate images, coordinates, and camera position; however, it is very expensive, and users cannot use the results immediately because the position information is not embedded in the images. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can calculate positions with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate all of these into the open source GIS software, Quantum GIS. In this way, a complete data collection and processing system can be constructed.
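    The synchronization step described in the abstract amounts to matching each camera timestamp against the GPS track. A minimal sketch of that matching, with hypothetical data layout and function name (not taken from the paper's code):

    ```python
    from bisect import bisect_left

    def interpolate_position(track, t):
        """Linearly interpolate (lat, lon) from a time-sorted GPS track.

        track: list of (timestamp, lat, lon) tuples, sorted by timestamp.
        t: camera shutter timestamp to geotag.
        """
        times = [p[0] for p in track]
        i = bisect_left(times, t)
        if i == 0:               # before the first fix: clamp
            return track[0][1:]
        if i == len(track):      # after the last fix: clamp
            return track[-1][1:]
        (t0, lat0, lon0), (t1, lat1, lon1) = track[i - 1], track[i]
        f = (t - t0) / (t1 - t0)
        return (lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0))

    track = [(0.0, 25.00, 121.00), (10.0, 25.01, 121.02)]
    print(interpolate_position(track, 5.0))  # midpoint of the two fixes
    ```

    A real pipeline would then write the interpolated coordinates into the image's EXIF GPS tags before loading it into Quantum GIS.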

  6. Changes in nutrient and antinutrient composition of Vigna racemosa flour in open and controlled fermentation.

    PubMed

    Difo, V H; Onyike, E; Ameh, D A; Njoku, G C; Ndidi, U S

    2015-09-01

    This study was conducted to investigate the effect of open and controlled fermentation on the proximate composition, mineral elements, antinutritional factors and flatulence-causing oligosaccharides in Vigna racemosa. The open fermentation was carried out using the microorganisms present in the atmosphere, while the controlled fermentation was carried out using Aspergillus niger as a starter. The proximate composition of the Vigna racemosa, some anti-nutrients and the mineral elements were analyzed using standard procedures. The protein content increased by 12.41 ± 1.73 % during open fermentation, while it decreased by 29.42 ± 0.1 % during controlled fermentation. The lipid, carbohydrate, crude fibre and ash contents were all reduced in both types of fermentation, except the moisture content, which increased in controlled fermentation. Apart from calcium, the other elements (Fe, Na, Mg, Zn, and K) were reduced in both types of fermentation. The phytate, tannin, alkaloid, hydrogen cyanide, lectin, trypsin inhibitor and oxalate contents were all drastically reduced in both types of fermentation. Open and controlled fermentation reduced the levels of both raffinose and stachyose. For the antinutrients studied, the percentage reductions from controlled fermentation were higher than those from open fermentation. Fermentation is an efficient method for detoxifying the antinutrients in the Vigna racemosa studied in this work.

  7. Development of an Open Source, Air-Deployable Weather Station

    NASA Astrophysics Data System (ADS)

    Krejci, A.; Lopez Alcala, J. M.; Nelke, M.; Wagner, J.; Udell, C.; Higgins, C. W.; Selker, J. S.

    2017-12-01

    We created a packaged weather station intended to be deployed in the air on tethered systems. The device incorporates lightweight sensors and parts and runs for up to 24 hours off of lithium polymer batteries, allowing the entire package to be supported by a thin fiber. As the fiber does not provide a stable platform, attitude data (pitch and roll) are determined using an embedded inertial measurement unit, in addition to the typical weather parameters (e.g. temperature, pressure, humidity, wind speed, and wind direction). All designs are open sourced, including electronics, CAD drawings, and descriptions of assembly, and can be found on the OPEnS lab website at http://www.open-sensing.org/lowcost-weather-station/. The Openly Published Environmental Sensing Lab (OPEnS: Open-Sensing.org) expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. New OPEnS labs are now being established in India, France, Switzerland, the Netherlands, and Ghana.
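    A common way to recover pitch and roll from a quasi-static accelerometer reading is shown below. This is a sketch of the general technique under one common axis convention, not the OPEnS firmware, which may instead use the IMU's fused orientation output:

    ```python
    import math

    def pitch_roll(ax, ay, az):
        """Estimate pitch and roll in degrees from a 3-axis accelerometer
        reading (in g), assuming the sensor is approximately static so the
        only measured acceleration is the reaction to gravity."""
        pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
        roll = math.degrees(math.atan2(ay, az))
        return pitch, roll

    # A level sensor sees gravity entirely on the z axis: both angles are zero.
    print(pitch_roll(0.0, 0.0, 1.0))
    ```

    On a swinging tethered platform the static assumption only holds approximately, which is one reason a full inertial unit (gyroscope plus accelerometer) is preferable to an accelerometer alone.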

  8. Software for Real-Time Analysis of Subsonic Test Shot Accuracy

    DTIC Science & Technology

    2014-03-01

    used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming...video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to...DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains

  9. Open source electronic health records and chronic disease management.

    PubMed

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-02-01

    To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status for homeless patients. The ability to modify the open source EHR to adapt to the CHC environment, and to leverage the ecosystem of providers and users to assist in this process, provided significant advantages in chronic care management. Improvements in diabetes management, controlled hypertension and increases in tuberculosis vaccinations were assisted through the use of these open source systems. The flexibility and adaptability of open source EHR demonstrated its utility and viability in the provision of necessary and needed chronic disease care among populations served by CHC.

  10. What an open source clinical trial community can learn from hackers

    PubMed Central

    Dunn, Adam G.; Day, Richard O.; Mandl, Kenneth D.; Coiera, Enrico

    2014-01-01

    Summary Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. Since a similar gap has already been addressed in the software industry by the open source software movement, we examine how the social and technical principles of the movement can be used to guide the growth of an open source clinical trial community. PMID:22553248

  11. The Open Microscopy Environment: open image informatics for the biological sciences

    NASA Astrophysics Data System (ADS)

    Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.

    2016-07-01

    Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).

  12. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, Madeline Louise; McMath, Garrett Earl

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed each source. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6 1.2 Beta with TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters, in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.
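    The "nominal plutonium content" criticized in the abstract is a simple yield-to-mass conversion. The sketch below shows the arithmetic only; the specific-yield constant is purely illustrative (the true value depends on isotopics, 241Am in-growth, and source geometry, which is exactly why the nominal estimate is unreliable):

    ```python
    def nominal_pu_mass_g(measured_yield, specific_yield):
        """Nominal Pu mass (g) implied by a measured neutron yield (n/s),
        assuming a pure-239Pu specific yield (n/s per gram). This is the
        naive estimate the abstract describes as highly inaccurate."""
        return measured_yield / specific_yield

    # Purely illustrative numbers, not evaluated nuclear data:
    print(round(nominal_pu_mass_g(1.0e6, 6.5e4), 2))
    ```

    The MCNP6 approach inverts this: model the declared mass and composition, simulate the (α, n) yield, and compare against the measurement.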

  13. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE PAGES

    Lockhart, Madeline Louise; McMath, Garrett Earl

    2017-10-26

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed each source. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6 1.2 Beta with TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters, in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  14. An Evaluation of Open Source Learning Management Systems According to Administration Tools and Curriculum Design

    ERIC Educational Resources Information Center

    Ozdamli, Fezile

    2007-01-01

    Distance education is becoming more important in the universities and schools. The aim of this research is to evaluate the current existing Open Source Learning Management Systems according to Administration tool and Curriculum Design. For this, seventy two Open Source Learning Management Systems have been subjected to a general evaluation. After…

  15. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    ERIC Educational Resources Information Center

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  16. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    ERIC Educational Resources Information Center

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  17. Open Source Meets Virtual Reality--An Instructor's Journey Unearths New Opportunities for Learning, Community, and Academia

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.

    2015-01-01

    Opening with the history, recent advances, and emerging ways to use avatar-based virtual reality, an instructor who has used virtual environments since 2007 shares how these environments bring more options to community building, teaching, and education. With the open-source movement, where the source code for virtual environments was made…

  18. The Implications of Incumbent Intellectual Property Strategies for Open Source Software Success and Commercialization

    ERIC Educational Resources Information Center

    Wen, Wen

    2012-01-01

    While open source software (OSS) emphasizes open access to the source code and avoids the use of formal appropriability mechanisms, there has been little understanding of how the existence and exercise of formal intellectual property rights (IPR) such as patents influence the direction of OSS innovation. This dissertation seeks to bridge this gap…

  19. Migrations of the Mind: The Emergence of Open Source Education

    ERIC Educational Resources Information Center

    Glassman, Michael; Bartholomew, Mitchell; Jones, Travis

    2011-01-01

    The authors describe an Open Source approach to education. They define Open Source Education (OSE) as a teaching and learning framework where the use and presentation of information is non-hierarchical, malleable, and subject to the needs and contributions of students as they become "co-owners" of the course. The course transforms itself into an…

  20. Prepare for Impact

    ERIC Educational Resources Information Center

    Waters, John K.

    2010-01-01

    Open source software is poised to make a profound impact on K-12 education. For years industry experts have been predicting the widespread adoption of open source tools by K-12 school districts. They're about to be proved right. The impact may not yet have been profound, but it's fair to say that some open source systems and non-proprietary…

  1. 7 Questions to Ask Open Source Vendors

    ERIC Educational Resources Information Center

    Raths, David

    2012-01-01

    With their budgets under increasing pressure, many campus IT directors are considering open source projects for the first time. On the face of it, the savings can be significant. Commercial emergency-planning software can cost upward of six figures, for example, whereas the open source Kuali Ready might run as little as $15,000 per year when…

  2. Cognitive Readiness Assessment and Reporting: An Open Source Mobile Framework for Operational Decision Support and Performance Improvement

    ERIC Educational Resources Information Center

    Heric, Matthew; Carter, Jenn

    2011-01-01

    Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…

  3. WIRM: An Open Source Toolkit for Building Biomedical Web Applications

    PubMed Central

    Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.

    2002-01-01

    This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108

  4. TOPSAN: a dynamic web database for structural genomics.

    PubMed

    Ellrott, Kyle; Zmasek, Christian M; Weekes, Dana; Sri Krishna, S; Bakolitsa, Constantina; Godzik, Adam; Wooley, John

    2011-01-01

    The Open Protein Structure Annotation Network (TOPSAN) is a web-based collaboration platform for exploring and annotating structures determined by structural genomics efforts. Characterization of those structures presents a challenge since the majority of the proteins themselves have not yet been characterized. Responding to this challenge, the TOPSAN platform facilitates collaborative annotation and investigation via a user-friendly web-based interface pre-populated with automatically generated information. Semantic web technologies expand and enrich TOPSAN's content through links to larger sets of related databases, and thus, enable data integration from disparate sources and data mining via conventional query languages. TOPSAN can be found at http://www.topsan.org.

  5. Extending the Territory: From Open Educational Resources to Open Educational Practices

    ERIC Educational Resources Information Center

    Ehlers, Ulf-Daniel

    2011-01-01

    This article examines the findings of the recent OPAL report "Beyond OER: Shifting Focus from Resources to Practices". In doing so, it defines current understanding of open educational resources and open educational practices, and highlights the shift from open content to open practice. The article includes a framework for supporting…

  6. Re-utilization of Industrial CO 2 for Algae Production Using a Phase Change Material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, Brian

    This is the final report of a 36-month Phase II cooperative agreement. Under this project, Touchstone Research Laboratory (Touchstone) investigated the merits of incorporating a Phase Change Material (PCM) into an open-pond algae production system that can capture and re-use the CO 2 from a coal-fired flue gas source located in Wooster, OH. The primary objective of the project was to design, construct, and operate a series of open algae ponds that accept a slipstream of flue gas from a coal-fired source and convert a significant portion of the CO 2 to liquid biofuels, electricity, and specialty products, while demonstrating the merits of the PCM technology. Construction of the pilot facility and shakedown of the facility in Wooster, OH, was completed during the first two years, and the focus of the last year was on operations and the cultivation of algae. During this Phase II effort, a large-scale algae concentration unit from OpenAlgae was installed and utilized to continuously harvest algae from indoor raceways. An Algae Lysing Unit and Oil Recovery Unit were also received and installed. Initial parameters for lysing Nannochloropsis were tested. Conditions were established that showed the lysing operation was effective at killing the algae cells. Continuous harvesting activities yielded over 200 kg algae dry weight for Ponds 1, 2 and 4. Studies were conducted to determine the effect of anaerobic digestion effluent as a nutrient source and the resulting lipid productivity of the algae. Lipid content and total fatty acids were unaffected by culture system and nutrient source, indicating that open raceway ponds fed diluted anaerobic digestion effluent can obtain lipid productivities similar to those of open raceway ponds using commercial nutrients. Data were also collected with respect to the performance of the PCM material on the pilot-scale raceway ponds. Parameters such as evaporative water loss, temperature differences, and growth/productivity were tracked. The pond with the PCM material was consistently 2 to 5°C warmer than the control pond. This difference did not seem to increase significantly over time. During phase transitions of the PCM, the magnitude of the difference between the daily minimum and maximum temperatures decreased, resulting in smaller daily temperature fluctuations. A thin layer of PCM material reduced overall water loss by 74% and consistently provided algae densities that were 80% greater than the control pond.

  7. FOIA

    Science.gov Websites


  8. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth, high-latency communications links to test how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The tested open source IPsec software did not meet all of the requirements. Software changes were suggested to meet the requirements.

  9. Upon the Shoulders of Giants: Open-Source Hardware and Software in Analytical Chemistry.

    PubMed

    Dryden, Michael D M; Fobel, Ryan; Fobel, Christian; Wheeler, Aaron R

    2017-04-18

    Isaac Newton famously observed that "if I have seen further it is by standing on the shoulders of giants." We propose that this sentiment is a powerful motivation for the "open-source" movement in scientific research, in which creators provide everything needed to replicate a given project online, as well as providing explicit permission for users to use, improve, and share it with others. Here, we write to introduce analytical chemists who are new to the open-source movement to best practices and concepts in this area and to survey the state of open-source research in analytical chemistry. We conclude by considering two examples of open-source projects from our own research group, with the hope that a description of the process, motivations, and results will provide a convincing argument about the benefits that this movement brings to both creators and users.

  10. Open-Source 3-D Platform for Low-Cost Scientific Instrument Ecosystem.

    PubMed

    Zhang, C; Wijnen, B; Pearce, J M

    2016-08-01

    The combination of open-source software and hardware provides technically feasible methods to create low-cost, highly customized scientific research equipment. Open-source 3-D printers have proven useful for fabricating scientific tools. Here the capabilities of an open-source 3-D printer are expanded to become a highly flexible scientific platform. An automated low-cost 3-D motion control platform is presented that has the capacity to perform scientific applications, including (1) 3-D printing of scientific hardware; (2) laboratory auto-stirring, measuring, and probing; (3) automated fluid handling; and (4) shaking and mixing. The open-source 3-D platform not only facilitates routine research while radically reducing the cost, but also inspires the creation of a diverse array of custom instruments that can be shared and replicated digitally throughout the world to drive down the cost of research and education further. © 2016 Society for Laboratory Automation and Screening.

  11. [Distribution and source of particulate organic carbon and particulate nitrogen in the Yangtze River Estuary in summer 2012].

    PubMed

    Xing, Jian-Wei; Xian, Wei-Wei; Sheng, Xiu-Zhen

    2014-07-01

    Based on data from a cruise carried out in August 2012 in the Yangtze River Estuary and its adjacent waters, the spatial distributions of particulate organic carbon (POC) and particulate nitrogen (PN) and their relationships with environmental factors were studied, and the source of POC and the contribution of phytoplankton to POC were analyzed in combination with the n(C)/n(N) ratio and chlorophyll a (Chl a) in the Yangtze River Estuary in summer 2012. The results showed that POC concentrations in the Yangtze River Estuary ranged from 0.68 mg·L(-1) to 34.80 mg·L(-1) in summer, with an average content of 3.74 mg·L(-1), and PN contents varied between 0.03 mg·L(-1) and 9.13 mg·L(-1), with an average value of 0.57 mg·L(-1). For both, concentrations in bottom layers were higher than those at the surface. POC and PN, as well as total suspended matter (TSM), showed very similar horizontal distributions: the highest values appeared near the river mouth and in the southwest of the survey waters and decreased rapidly toward the open sea, with higher contents in coastal zones and lower contents offshore. There was a fairly good positive linear relationship between POC and PN, which indicated that they had the same source. POC and PN showed significantly positive correlations with TSM and chemical oxygen demand (COD), but relatively weak correlations with salinity and chlorophyll a, which demonstrated that terrestrial inputs had a strong influence on the distribution of POC and PN, and that phytoplankton production was not the major source of organic matter in the Yangtze River Estuary. Both the n(C)/n(N) ratio and the POC/Chl a analysis showed that the main source of POC was terrestrial inputs, and organic debris was the main form in which POC existed. Quantitative analysis showed that phytoplankton biomass contributed on average only 2.54% of POC in the Yangtze River Estuary in summer, with non-living POC overwhelmingly dominant.
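    The n(C)/n(N) diagnostic used in the abstract converts the mass concentrations to a molar ratio; values well above the Redfield ratio (~6.6) are commonly read as pointing to terrestrial or detrital rather than fresh phytoplankton material. A sketch using the survey-average concentrations quoted above:

    ```python
    M_C, M_N = 12.011, 14.007  # molar masses of C and N, g/mol

    def molar_cn_ratio(poc_mg_l, pn_mg_l):
        """Molar n(C)/n(N) ratio from POC and PN mass concentrations (mg/L)."""
        return (poc_mg_l / M_C) / (pn_mg_l / M_N)

    # Survey averages from the abstract: POC 3.74 mg/L, PN 0.57 mg/L.
    print(round(molar_cn_ratio(3.74, 0.57), 1))  # → 7.7
    ```

    Note this uses only the basin-wide averages; the paper's station-by-station ratios would be needed to map the terrestrial influence spatially.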

  12. OpenSesame: an open-source, graphical experiment builder for the social sciences.

    PubMed

    Mathôt, Sebastiaan; Schreij, Daniel; Theeuwes, Jan

    2012-06-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality, such as support for eyetrackers, input devices, and video playback, is available through plug-ins. OpenSesame can be used in combination with existing software for creating experiments.

  13. Preliminary data on polybrominated diphenyl ethers (PBDEs) in farmed fish tissues (Salmo salar) and fish feed in Southern Chile.

    PubMed

    Montory, Mónica; Barra, Ricardo

    2006-05-01

    Polybrominated diphenyl ethers (PBDEs) have become an issue of global concern. Recent studies have shown that farmed salmon can accumulate high levels of brominated compounds in their tissues, and consequently there is growing concern about their industrial and public health impacts. Little information is found in the international literature on PBDEs in the biotic compartment of the Southern Hemisphere. This paper reports the levels of several PBDE congeners found in the tissues of farmed fish from five different farming areas of Southern Chile. PBDEs were analyzed by HRGC-MS. Additional analytical data were obtained by analyzing these same pollutants in fish feed. Our results indicate a general trend of PBDE levels averaging 1.46 ng g(-1) wet weight (wwt). The observed congener distribution was quite similar to data previously reported in the open literature. PBDE profiles were found to be dominated by BDE 47. No correlation was observed between the levels found in the tissues and the lipid content of those tissues, although a high correlation with the fish feed data was observed, indicating that feed is probably the main PBDE entry source into fish, although other sources cannot be excluded. Even though the samples were obtained from different geographical areas, they presented fairly similar profiles, indicating a potential common source. We conclude that PBDE levels in farmed Chilean salmon are in the low average range of values published in the open literature.

  14. The Privacy and Security Implications of Open Data in Healthcare.

    PubMed

    Kobayashi, Shinji; Kane, Thomas B; Paton, Chris

    2018-04-22

    The International Medical Informatics Association (IMIA) Open Source Working Group (OSWG) initiated a group discussion on current privacy and security issues in the open data movement in the healthcare domain, from the perspective of the OSWG membership. Working group members independently reviewed the recent academic and grey literature and sampled a number of current large-scale open data projects to inform the working group discussion. This paper presents an overview of open data repositories and a series of short case reports to highlight relevant issues present in the recent literature concerning the adoption of open approaches to sharing healthcare datasets. Important themes that emerged included data standardisation, the inter-connected nature of the open source and open data movements, and how publishing open data can impact the ethics, security, and privacy of informatics projects. The open data and open source movements in healthcare share many common philosophies and approaches, including developing international collaborations across multiple organisations and domains of expertise. Both movements aim to reduce the costs of advancing scientific research and improving healthcare provision for people around the world by adopting open intellectual property licence agreements and codes of practice. Implications of the increased adoption of open data in healthcare include the need to balance the security and privacy challenges of opening data sources with the potential benefits of open data for improving research and healthcare delivery.

  15. Simulation of partially coherent light propagation using parallel computing devices

    NASA Astrophysics Data System (ADS)

    Magalhães, Tiago C.; Rebordão, José M.

    2017-08-01

    Light acquires or loses coherence as it propagates, and coherence is one of the few optical observables. Spectra can be derived from coherence functions, and understanding any interferometric experiment also relies upon them. Beyond the two limiting cases (full coherence or incoherence), the coherence of light is always partial and changes with propagation. We have implemented a code to compute the propagation of partially coherent light from the source plane to the observation plane using parallel computing devices (PCDs). In this paper, we restrict the propagation to free space only. To this end, we used the Open Computing Language (OpenCL) and the open-source toolkit PyOpenCL, which gives access to OpenCL parallel computation through Python. To test our code, we chose two coherence source models: an incoherent source and a Gaussian Schell-model source. In the former case, we considered two different source shapes: circular and rectangular. The results were compared to the theoretical values. Our implemented code allows one to choose between the PyOpenCL implementation and a standard one, i.e. using the CPU only. To test the computation time of each implementation (PyOpenCL and standard), we used several computer systems with different CPUs and GPUs. We used powers of two for the dimensions of the cross-spectral density matrix (e.g. 32^4, 64^4), and a significant speed increase is observed in the PyOpenCL implementation compared to the standard one. This can be an important tool for studying new source models.
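The Gaussian Schell-model source mentioned above has a compact closed form; as a hedged illustration (a NumPy sketch of the source model only, not the paper's PyOpenCL propagation code), the cross-spectral density W(x1, x2) = sqrt(S(x1) S(x2)) mu(x1 - x2) of a 1-D source can be built as follows. The function name and the widths sigma_s and sigma_mu are assumptions for the example.

```python
import numpy as np

def gaussian_schell_csd(x, sigma_s=1.0, sigma_mu=0.5):
    """Cross-spectral density W(x1, x2) of a 1-D Gaussian Schell-model source.

    S(x)   = exp(-x^2 / (2 sigma_s^2))    (spectral density)
    mu(dx) = exp(-dx^2 / (2 sigma_mu^2))  (spectral degree of coherence)
    W(x1, x2) = sqrt(S(x1) S(x2)) * mu(x1 - x2)
    """
    x1, x2 = np.meshgrid(x, x, indexing="ij")
    s = np.exp(-(x1**2 + x2**2) / (4 * sigma_s**2))       # sqrt(S(x1) S(x2))
    mu = np.exp(-((x1 - x2) ** 2) / (2 * sigma_mu**2))    # coherence term
    return s * mu

x = np.linspace(-3, 3, 64)
W = gaussian_schell_csd(x)
```

By construction W is symmetric (W(x1, x2) = W(x2, x1)) and its diagonal reduces to the spectral density S(x), which is a quick sanity check on any discretized source model before propagating it.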

  16. An Open Source Simulation System

    NASA Technical Reports Server (NTRS)

    Slack, Thomas

    2005-01-01

    This document is an investigation into the current state of the art of open source real-time programming practices. It covers which technologies are available, how easy they are to obtain, configure, and use, and reports some performance measurements made on the different systems. A matrix of vendors and their products is included as part of this investigation, but it is not an exhaustive list and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches are investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, source-available proprietary software provided by the vendor for our evaluation.

  17. BASKET on-board software library

    NASA Astrophysics Data System (ADS)

    Luntzer, Armin; Ottensamer, Roland; Kerschbaum, Franz

    2014-07-01

    The University of Vienna is a provider of on-board data processing software with a focus on data compression, such as that used on board the highly successful Herschel/PACS instrument as well as in the small BRITE-Constellation fleet of cube-sats. Current contributions are made to CHEOPS, SAFARI and PLATO. The effort was made to review the various functions developed for Herschel and provide a consolidated software library to facilitate the work for future missions. This library is a shopping basket of algorithms. Its contents are separated into four classes: auxiliary functions (e.g. circular buffers), preprocessing functions (e.g. for calibration), lossless data compression (arithmetic or Rice coding) and lossy reduction steps (ramp fitting etc.). BASKET has all the functionality needed to create an on-board data processing chain. All sources are written in C, supplemented by optimized versions in assembly targeting popular CPU architectures for space applications. BASKET is open source and constantly growing.
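Among the lossless coders named above is Rice coding. As a rough illustration of the technique itself (a Python sketch, not BASKET's C implementation), a minimal Rice encoder/decoder for non-negative integers might look like this:

```python
def rice_encode(values, k):
    """Rice-code non-negative integers with parameter k (divisor 2**k).

    Each value n is split into a quotient q = n >> k, written in unary
    (q ones followed by a terminating zero), and a remainder written
    as k plain bits, most significant bit first.
    """
    bits = []
    for n in values:
        q, r = n >> k, n & ((1 << k) - 1)
        bits.extend([1] * q + [0])                               # unary quotient
        bits.extend((r >> i) & 1 for i in range(k - 1, -1, -1))  # k-bit remainder
    return bits

def rice_decode(bits, count, k):
    """Inverse of rice_encode: read `count` values back from the bit list."""
    out, i = [], 0
    for _ in range(count):
        q = 0
        while bits[i] == 1:      # count the unary quotient
            q += 1
            i += 1
        i += 1                   # skip the terminating 0
        r = 0
        for _ in range(k):       # read k remainder bits
            r = (r << 1) | bits[i]
            i += 1
        out.append((q << k) | r)
    return out
```

Rice coding is cheap on spaceflight CPUs because it needs only shifts and masks, which is consistent with the library's emphasis on optimized assembly versions.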

  18. ChoiceKey: a real-time speech recognition program for psychology experiments with a small response set.

    PubMed

    Donkin, Christopher; Brown, Scott D; Heathcote, Andrew

    2009-02-01

    Psychological experiments often collect choice responses using buttonpresses. However, spoken responses are useful in many cases: for example, when working with special clinical populations, when a paradigm demands vocalization, or when accurate response time measurements are desired. In these cases, spoken responses are typically collected using a voice key, which usually involves manual coding by experimenters in a tedious and error-prone manner. We describe ChoiceKey, an open-source speech recognition package for MATLAB. It can be optimized by training for small response sets and different speakers. We show ChoiceKey to be reliable with minimal training for most participants in experiments with two different responses. Problems presented by individual differences and occasional atypical responses are examined, and extensions to larger response sets are explored. The ChoiceKey source files and instructions may be downloaded as supplemental materials for this article from brm.psychonomic-journals.org/content/supplemental.
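Classifying an utterance against a small response set is, at its simplest, template matching. The toy sketch below illustrates that general idea in Python (it is not ChoiceKey's actual MATLAB algorithm; `classify_utterance` and the normalized-correlation scoring are assumptions for the example): each stored response template is scored by normalized correlation with the incoming signal and the best match wins.

```python
import numpy as np

def classify_utterance(signal, templates):
    """Assign a recorded utterance to the closest of a small set of templates.

    templates: dict mapping a response label to a reference waveform of the
    same length as `signal`. Scores are normalized correlations in [-1, 1].
    """
    def normalize(v):
        v = v - v.mean()
        return v / (np.linalg.norm(v) + 1e-12)

    s = normalize(np.asarray(signal, dtype=float))
    scores = {label: float(s @ normalize(np.asarray(t, dtype=float)))
              for label, t in templates.items()}
    return max(scores, key=scores.get)
```

Because the scores are normalized, the classifier is insensitive to overall loudness, which mirrors why small-vocabulary systems can work with minimal per-speaker training.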

  19. BioCIDER: a Contextualisation InDEx for biological Resources discovery

    PubMed Central

    Horro, Carlos; Cook, Martin; Attwood, Teresa K.; Brazas, Michelle D.; Hancock, John M.; Palagi, Patricia; Corpas, Manuel; Jimenez, Rafael

    2017-01-01

    Summary: The vast, uncoordinated proliferation of bioinformatics resources (databases, software tools, training materials etc.) makes it difficult for users to find them. To facilitate their discovery, various services are being developed to collect such resources into registries. We have developed BioCIDER, which, rather like online shopping ‘recommendations’, provides a contextualization index to help identify biological resources relevant to the content of the sites in which it is embedded. Availability and Implementation: BioCIDER (www.biocider.org) is an open-source platform. Documentation is available online (https://goo.gl/Klc51G), and source code is freely available via GitHub (https://github.com/BioCIDER). The BioJS widget that enables websites to embed contextualization is available from the BioJS registry (http://biojs.io/). All code is released under an MIT licence. Contact: carlos.horro@earlham.ac.uk or rafael.jimenez@elixir-europe.org or manuel@repositive.io PMID:28407033

  20. A clinic compatible, open source electrophysiology system.

    PubMed

    Hermiz, John; Rogers, Nick; Kaestner, Erik; Ganji, Mehran; Cleary, Dan; Snider, Joseph; Barba, David; Dayeh, Shadi; Halgren, Eric; Gilja, Vikash

    2016-08-01

    Open source electrophysiology (ephys) recording systems have several advantages over commercial systems, such as customization and affordability, enabling more researchers to conduct ephys experiments. Notable open source ephys systems include Open-Ephys, NeuroRighter and, more recently, Willow, all of which have high channel counts (64+), scalability, and advanced software to develop on top of. However, little work has been done to build an open source ephys system that is clinic compatible, particularly in the operating room where acute human electrocorticography (ECoG) research is performed. We developed an affordable (under $10,000) and open system for research purposes that features power isolation for patient safety, compact and water-resistant enclosures, and 256 recording channels sampled at up to 20 ksamples/s with 16-bit resolution. The system was validated by recording ECoG with a high-density, thin-film device during an acute, awake craniotomy study in the UC San Diego Thornton Hospital operating room.

  1. Freeing Worldview's development process: Open source everything!

    NASA Astrophysics Data System (ADS)

    Gunnoe, T.

    2016-12-01

    Freeing your code and your project are important steps for creating an inviting environment for collaboration, with the added side effect of keeping a good relationship with your users. NASA Worldview's codebase was released with the open source NOSA (NASA Open Source Agreement) license in 2014, but this is only the first step. We also have to free our ideas, empower our users by involving them in the development process, and open channels that lead to the creation of a community project. There are many highly successful examples of Free and Open Source Software (FOSS) projects of which we can take note: the Linux kernel, Debian, GNOME, etc. These projects owe much of their success to having a passionate mix of developers/users with a great community and a common goal in mind. This presentation will describe the scope of this openness and how Worldview plans to move forward with a more community-inclusive approach.

  2. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes (including hydrological processes), biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines and developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing one to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and the connectivity of landscape objects, iii) run and explore simulations in many ways: using the OpenFLUID software interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modellers and developers, OpenFLUID provides a dedicated environment for model development, based on an open source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modellers.
    OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, and modelling of surface-subsurface water exchanges. At the LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID website: http://www.openfluid-project.org
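The idea of representing a landscape as a graph of connected spatial units, with fluxes routed along the connections, can be sketched minimally as follows. This is a conceptual Python illustration only, not the OpenFLUID C++ API; `SpatialUnit` and `route_flux` are invented names for the example.

```python
class SpatialUnit:
    """One landscape object (e.g. a plot, reach, or ditch segment)."""
    def __init__(self, uid):
        self.uid = uid
        self.inflow = 0.0      # flux received from upstream units
        self.downstream = []   # units this one drains into

def route_flux(sources, topo_order):
    """Push each unit's local input plus received inflow downstream.

    `sources` maps unit id -> locally generated flux; `topo_order` is the
    units in topological (upstream-to-downstream) order, outlet last.
    Returns the total flux arriving at the outlet.
    """
    for unit in topo_order:
        total = sources.get(unit.uid, 0.0) + unit.inflow
        for nxt in unit.downstream:
            nxt.inflow += total / len(unit.downstream)  # even split downstream
    outlet = topo_order[-1]
    return sources.get(outlet.uid, 0.0) + outlet.inflow
```

Processing units in topological order is what makes a single pass sufficient; a hierarchical graph as in the platform described above would add nested units and per-connection exchange models on top of this skeleton.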

  3. Interim Open Source Software (OSS) Policy

    EPA Pesticide Factsheets

    This interim Policy establishes a framework to implement the requirements of the Office of Management and Budget's (OMB) Federal Source Code Policy to achieve efficiency, transparency and innovation through reusable and open source software.

  4. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  5. Open Source Software Development Experiences on the Students' Resumes: Do They Count?--Insights from the Employers' Perspectives

    ERIC Educational Resources Information Center

    Long, Ju

    2009-01-01

    Open Source Software (OSS) is a major force in today's Information Technology (IT) landscape. Companies are increasingly using OSS in mission-critical applications. The transparency of the OSS technology itself with openly available source codes makes it ideal for students to participate in the OSS project development. OSS can provide unique…

  6. Open Source Initiative Powers Real-Time Data Streams

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.

  7. Xtreme Learning Control: Examples of the Open Source Movement's Impact on Our Educational Practice in a University Setting.

    ERIC Educational Resources Information Center

    Dunlap, Joanna C.; Wilson, Brent G.; Young, David L.

    This paper describes how Open Source philosophy, a movement that has developed in opposition to the proprietary software industry, has influenced educational practice in the pursuit of scholarly freedom and authentic learning activities for students and educators. This paper provides a brief overview of the Open Source movement, and describes…

  8. Adopting Open-Source Software Applications in U. S. Higher Education: A Cross-Disciplinary Review of the Literature

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2009-01-01

    Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…

  9. Assessing the Impact of Security Behavior on the Awareness of Open-Source Intelligence: A Quantitative Study of IT Knowledge Workers

    ERIC Educational Resources Information Center

    Daniels, Daniel B., III

    2014-01-01

    There is a lack of literature linking end-user behavior to the availability of open-source intelligence (OSINT). Most OSINT literature has been focused on the use and assessment of open-source intelligence, not the proliferation of personally or organizationally identifiable information (PII/OII). Additionally, information security studies have…

  10. Looking toward the Future: A Case Study of Open Source Software in the Humanities

    ERIC Educational Resources Information Center

    Quamen, Harvey

    2006-01-01

    In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…

  11. Preparing a scientific manuscript in Linux: Today's possibilities and limitations.

    PubMed

    Tchantchaleishvili, Vakhtang; Schmitto, Jan D

    2011-10-22

    An increasing number of scientists are enthusiastic about using free, open source software for their research. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow the preparation of a submission-ready scientific manuscript without the need to use proprietary software. Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes the key steps for preparing a publication-ready scientific manuscript in a Linux-based operating system and discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux.

  12. Open Source Service Agent (OSSA) in the intelligence community's Open Source Architecture

    NASA Technical Reports Server (NTRS)

    Fiene, Bruce F.

    1994-01-01

    The Community Open Source Program Office (COSPO) has developed an architecture for the intelligence community's new Open Source Information System (OSIS). The architecture is a multi-phased program featuring connectivity, interoperability, and functionality. OSIS is based on a distributed architecture concept. The system is designed to function as a virtual entity. OSIS will be a restricted (non-public), user configured network employing Internet communications. Privacy and authentication will be provided through firewall protection. Connection to OSIS can be made through any server on the Internet or through dial-up modems provided the appropriate firewall authentication system is installed on the client.

  13. Exploring the Role of Value Networks for Software Innovation

    NASA Astrophysics Data System (ADS)

    Morgan, Lorraine; Conboy, Kieran

    This paper describes research in progress that aims to explore the applicability and implications of open innovation practices in two firms: one that employs agile development methods and another that utilizes open source software. The open innovation paradigm has much in common with open source and agile development methodologies. A particular strength of agile approaches is that they move away from 'introverted' development, involving only the development personnel, and intimately involve the customer in all areas of software creation, supposedly leading to the development of a more innovative and hence more valuable information system. Open source software (OSS) development also shares two key elements of the open innovation model, namely the collaborative development of the technology and shared rights to the use of the technology. However, one shortfall of agile development in particular is its narrow focus on a single customer representative. In response to this, we argue that current thinking regarding innovation needs to be extended to include multiple stakeholders both across and outside the organization. Additionally, for firms utilizing open source, it has been found that their position in a network of potential complementors determines the amount of superior value they create for their customers. Thus, this paper aims to develop a better understanding of the applicability and implications of open innovation practices in firms that employ open source and agile development methodologies. In particular, a conceptual framework is derived for further testing.

  14. Design and Deployment of a General Purpose, Open Source LoRa to Wi-Fi Hub and Data Logger

    NASA Astrophysics Data System (ADS)

    DeBell, T. C.; Udell, C.; Kwon, M.; Selker, J. S.; Lopez Alcala, J. M.

    2017-12-01

    Methods and technologies facilitating internet connectivity and near-real-time status updates for in situ environmental sensor data are of increasing interest in Earth science. However, open source, do-it-yourself technologies that enable plug-and-play functionality for web-connected sensors and devices remain largely inaccessible to typical researchers in our community. The Openly Published Environmental Sensing Lab at Oregon State University (OPEnS Lab) constructed an open source 900 MHz Long Range Radio (LoRa) receiver hub with SD card data logger, Ethernet and Wi-Fi shield, and 3D printed enclosure that dynamically uploads transmissions from multiple wirelessly connected environmental sensing devices. Data transmissions may be received from devices up to 20 km away. The hub time-stamps all transmissions, saves them to SD card, and uploads them to a Google Drive spreadsheet to be accessed in near-real time by researchers and GeoVisualization applications (such as ArcGIS) for access, visualization, and analysis. This research expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. This poster details our methods and evaluates the use of 3D printing, the Arduino Integrated Development Environment (IDE), Adafruit's open-hardware Feather development boards, and the WIZNET5500 Ethernet shield in designing this open-source, general-purpose LoRa to Wi-Fi data logger.
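The hub's receive, timestamp, log, upload flow can be sketched as follows. This is a Python illustration of the data flow only (the actual device firmware targets Arduino-compatible hardware); `log_packets` and the CSV row layout are assumptions for the example.

```python
import csv
import datetime
import io

def log_packets(packets, sink):
    """Timestamp each incoming (node_id, payload) pair and append it as a
    CSV row to `sink`, mirroring a receive -> timestamp -> log flow.
    `sink` stands in for the SD-card file or upload buffer; returns the
    number of rows written.
    """
    writer = csv.writer(sink)
    rows = 0
    for node_id, payload in packets:
        # UTC timestamp applied on arrival, as the hub does for each packet
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        writer.writerow([stamp, node_id, payload])
        rows += 1
    return rows
```

Writing timestamped rows to an append-only log before any upload step is what makes the device robust to intermittent connectivity: the SD card copy survives even if the spreadsheet upload fails.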

  15. Analysis of hopanes and steranes in single oil-bearing fluid inclusions using time-of-flight secondary ion mass spectrometry (ToF-SIMS).

    PubMed

    Siljeström, S; Lausmaa, J; Sjövall, P; Broman, C; Thiel, V; Hode, T

    2010-01-01

    Steranes and hopanes are organic biomarkers used as indicators of the first appearance of eukaryotes and cyanobacteria on Earth. Oil-bearing fluid inclusions may provide a contamination-free source of Precambrian biomarkers, as the oil has been secluded from the environment since the formation of the inclusion. However, analysis of biomarkers in single oil-bearing fluid inclusions, which is often necessary due to the presence of different generations of inclusions, has not been possible due to the small size of most inclusions. Here, we have used time-of-flight secondary ion mass spectrometry (ToF-SIMS) to monitor in real time the opening of individual inclusions trapped in hydrothermal veins of fluorite and calcite and containing oil from Ordovician source rocks. Opening of the inclusions was performed using a focused C60+ ion beam, and the in situ content was precisely analysed for C27-C29 steranes and C29-C32 hopanes using Bi3+ as primary ions. The capacity to unambiguously detect these biomarkers in the picoliter amount of crude oil from a single, normal-sized (15-30 μm in diameter) inclusion makes the approach promising in the search for organic biomarkers of life's early evolution on Earth.

  16. Migrating a lecture in nursing informatics to a blended learning format--A bottom-up approach to implement an open-source web-based learning management system.

    PubMed

    Schrader, Ulrich

    2006-01-01

    At a university of applied sciences in Germany, a learning management system has been implemented. The migration of classic courses to a web-enhanced curriculum can be categorized into three phases, independent of the technology used. The first two phases, "dedicated website" and "database-supported content management system", are mainly concerned with bringing the learning material and current information online and making it available to the students; the goal here is to make the maintenance of the learning material easier. The third phase, characterized by the use of a learning management system, supports more modern didactic principles such as social constructionism and problem-oriented learning. In this paper, the phases as they occurred during the migration of a nursing informatics course are described and the experiences discussed. The absence of institutional goals associated with the use of a learning management system led to a bottom-up approach, triggered by faculty activities, that can be described by a promoter model rather than by a process management model. The use of an open source learning management system made this process easier to realize, since no financial commitment is required up front.

  17. Pollination effects on antioxidant content of Perilla frutescens seeds analysed by NMR spectroscopy.

    PubMed

    Ferrazzi, Paola; Vercelli, Monica; Chakir, Amina; Romane, Abderrahmane; Mattana, Monica; Consonni, Roberto

    2017-12-01

    The effects of Perilla frutescens pollination on the content of seed antioxidants were analysed by agronomical and pollination trials, comparing seeds produced from bagged plants in 2013 (A) to prevent access to pollinating insects, and seeds from open-pollinated plants in 2013 (B) and 2015 (C). The seeds of open-pollinated plants were significantly more numerous and heavier than those of self-pollinated plants. 1H NMR analysis of the seeds showed a higher presence of phenolic compounds in open-pollinated seeds, mainly rosmarinic acid and the flavonoids apigenin and luteolin. Flavonoids were present in the glucosylated form in seeds (A) and (C), and in the aglycone form in seeds from (B) plants. Saturated and unsaturated fatty acids (palmitic, linoleic and linolenic) were more abundant in seeds from self-pollinated flowers. Pollination, performed almost exclusively by the honeybee, notably increased the antioxidant content in perilla seeds and gave rise to a reduction in the fatty acid content.

  18. The use of open source electronic health records within the federal safety net.

    PubMed

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    To conduct a federally funded study that examines the acquisition, implementation and operation of open source electronic health records (EHR) within safety net medical settings, such as federally qualified health centers (FQHC). The study was conducted by the National Opinion Research Center (NORC) at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to West Virginia, California and Arizona FQHC that were currently using an open source EHR. Five of the six sites that were chosen as part of the study found a number of advantages in the use of their open source EHR system, such as utilizing a large community of users and developers to modify their EHR to fit the needs of their provider and patient communities, and lower acquisition and implementation costs as compared to a commercial system. Despite these advantages, many of the informants and site visit participants felt that widespread dissemination and use of open source software was restrained by a negative connotation attached to this type of software. In addition, a number of participants stated that a certain level of technical acumen is needed within the FQHC to make an open source EHR effective. An open source EHR provides advantages for FQHC that have limited resources to acquire and implement an EHR, but additional study is needed to evaluate its overall effectiveness.

  19. Open source electronic health records and chronic disease management

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective: To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). Methods and Materials: The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Results: Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status on homeless patients. Discussion: The ability to modify the open-source EHR to adapt to the CHC environment and leverage the ecosystem of providers and users to assist in this process provided significant advantages in chronic care management. Improvements in diabetes management and controlled hypertension, and increases in tuberculosis vaccinations, were assisted through the use of these open source systems. Conclusions: The flexibility and adaptability of open source EHR demonstrated their utility and viability in the provision of necessary and needed chronic disease care among populations served by CHC. PMID:23813566

  20. GC-Content Normalization for RNA-Seq Data

    PubMed Central

    2011-01-01

    Background: Transcriptome sequencing (RNA-Seq) has become the assay of choice for high-throughput studies of gene expression. However, as is the case with microarrays, major technology-related artifacts and biases affect the resulting expression measures. Normalization is therefore essential to ensure accurate inference of expression levels and subsequent analyses thereof. Results: We focus on biases related to GC-content and demonstrate the existence of strong sample-specific GC-content effects on RNA-Seq read counts, which can substantially bias differential expression analysis. We propose three simple within-lane gene-level GC-content normalization approaches and assess their performance on two different RNA-Seq datasets, involving different species and experimental designs. Our methods are compared to state-of-the-art normalization procedures in terms of bias and mean squared error for expression fold-change estimation and in terms of Type I error and p-value distributions for tests of differential expression. The exploratory data analysis and normalization methods proposed in this article are implemented in the open-source Bioconductor R package EDASeq. Conclusions: Our within-lane normalization procedures, followed by between-lane normalization, reduce GC-content bias and lead to more accurate estimates of expression fold-changes and tests of differential expression. Such results are crucial for the biological interpretation of RNA-Seq experiments, where downstream analyses can be sensitive to the supplied lists of genes. PMID:22177264
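The within-lane idea can be illustrated with a simplified, binned variant: group genes into GC-content bins and rescale each bin so its median count matches the overall median, removing the sample-specific GC effect. This is an assumption-laden NumPy sketch of the concept, not EDASeq's actual implementation, and `gc_normalize` is an invented name.

```python
import numpy as np

def gc_normalize(counts, gc, n_bins=10):
    """Within-lane, gene-level GC-content normalization (binned sketch).

    counts: per-gene read counts for one lane; gc: per-gene GC fraction.
    Genes are grouped into GC-quantile bins, and each bin is rescaled so
    its median count equals the lane-wide median count.
    """
    counts = np.asarray(counts, dtype=float)
    # Quantile bin edges keep the bins roughly equally populated
    bins = np.quantile(gc, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(gc, bins[1:-1]), 0, n_bins - 1)
    target = np.median(counts)
    out = counts.copy()
    for b in range(n_bins):
        mask = idx == b
        med = np.median(counts[mask])
        if med > 0:
            out[mask] *= target / med  # flatten the GC trend bin by bin
    return out
```

After such a step, the median count no longer trends with GC-content, so fold-changes between samples with different GC effects are compared on a common footing; a between-lane normalization would then follow, as in the abstract above.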

  1. Visualizing petroleum systems with a combination of GIS and multimedia technologies: An example from the West Siberia Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walsh, D.B.; Grace, J.D.

    1996-12-31

    Petroleum system studies provide an ideal application for the combination of Geographic Information System (GIS) and multimedia technologies. GIS technology is used to build and maintain the spatial and tabular data within the study region. Spatial data may comprise the zones of active source rocks and potential reservoir facies. Similarly, tabular data include the attendant source rock parameters (e.g. pyrolysis results, organic carbon content) and field-level exploration and production histories for the basin. Once the spatial and tabular database has been constructed, GIS technology is useful in finding favorable exploration trends, such as zones of high organic content and mature source rocks in positions adjacent to sealed, high-porosity reservoir facies. Multimedia technology provides powerful visualization tools for petroleum system studies. The components of petroleum system development, most importantly generation, migration and trap development, typically span periods of tens to hundreds of millions of years. The ability to animate spatial data over time provides an insightful alternative for studying the development of processes which are only captured in "snapshots" by static maps. New multimedia-authoring software provides this temporal dimension. The ability to record these data on CD-ROMs and allow user interactivity further leverages the combination of spatial databases, tabular databases and time-based animations. The example used for this study was the Bazhenov-Neocomian petroleum system of West Siberia.

  2. Visualizing petroleum systems with a combination of GIS and multimedia technologies: An example from the West Siberia Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walsh, D.B.; Grace, J.D.

    1996-01-01

Petroleum system studies provide an ideal application for the combination of Geographic Information System (GIS) and multimedia technologies. GIS technology is used to build and maintain the spatial and tabular data within the study region. Spatial data may comprise the zones of active source rocks and potential reservoir facies. Similarly, tabular data include the attendant source rock parameters (e.g. pyrolysis results, organic carbon content) and field-level exploration and production histories for the basin. Once the spatial and tabular data base has been constructed, GIS technology is useful in finding favorable exploration trends, such as zones of high organic content, mature source rocks in positions adjacent to sealed, high porosity reservoir facies. Multimedia technology provides powerful visualization tools for petroleum system studies. The components of petroleum system development, most importantly generation, migration and trap development, typically span periods of tens to hundreds of millions of years. The ability to animate spatial data over time provides an insightful alternative for studying the development of processes which are only captured in "snapshots" by static maps. New multimedia-authoring software provides this temporal dimension. The ability to record this data on CD-ROMs and allow user-interactivity further leverages the combination of spatial data bases, tabular data bases and time-based animations. The example used for this study was the Bazhenov-Neocomian petroleum system of West Siberia.

  3. Quantification of Organic richness through wireline logs: a case study of Roseneath shale formation, Cooper basin, Australia

    NASA Astrophysics Data System (ADS)

    Ahmad, Maqsood; Iqbal, Omer; Kadir, Askury Abd

    2017-10-01

The late Carboniferous-Middle Triassic, intracratonic Cooper basin in northeastern South Australia and southwestern Queensland is Australia's foremost onshore hydrocarbon producing region. The basin comprises Permian carbonaceous shales, such as the lacustrine Roseneath and Murteree shale formations, which act as both source and reservoir rocks. Source rock can be distinguished from non-source intervals by lower density, higher transit time, higher gamma ray values, and higher porosity and resistivity with increasing organic content. In the current study we compare different empirical approaches, based on density relations and the ΔLogR method, through three overlays of sonic/resistivity, neutron/resistivity and density/resistivity, to quantify the total organic content (TOC) of the Permian lacustrine Roseneath shale formation using open hole wireline log data (DEN, GR, CNL, LLD) from the Encounter 1 well. The TOC calculated from fourteen density relations over the depth interval 3174.5-3369 meters averaged 0.56%, while TOC from the sonic/resistivity, neutron/resistivity and density/resistivity overlays yielded average values of 3.84%, 3.68% and 4.40%, respectively. The TOC averaged over the three overlay methods is 3.98%. According to a geochemical report in PIRSA, the Roseneath shale formation has a TOC of 1-5 wt%. Poor correlations were observed between TOC calculated from the fourteen density relations and TOC measured on samples, whereas TOC from the average of the three overlays using the ΔLogR method showed good correlation with the measured values.
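The ΔLogR overlay technique referenced in the abstract is commonly computed with Passey's (1990) relations; the sketch below is a generic illustration under that assumption, with all baseline and maturity (LOM) values purely illustrative rather than taken from the Encounter 1 well.

```python
from math import log10

def delta_log_r(resistivity, sonic_dt, r_baseline, dt_baseline):
    """Curve separation between deep resistivity and sonic transit time,
    scaled so that 50 us/ft of sonic separation spans one resistivity
    decade (the 0.02 factor of Passey et al., 1990)."""
    return log10(resistivity / r_baseline) + 0.02 * (sonic_dt - dt_baseline)

def toc_passey(dlogr, lom):
    """Convert the DeltaLogR separation into TOC (wt%) using the level
    of organic metamorphism (LOM); predicted TOC decreases as thermal
    maturity (LOM) increases."""
    return dlogr * 10 ** (2.297 - 0.1688 * lom)
```

For example, a one-decade resistivity separation plus 10 us/ft of sonic separation gives ΔlogR = 1.2, which maps to progressively lower TOC as LOM increases.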

  4. Online Library Accessibility Support: A Case Study within the Open University Library

    ERIC Educational Resources Information Center

    Mears, Wendy; Clough, Helen

    2015-01-01

The Open University (OU) is the UK's largest distance education provider and has a large and growing disabled student population. Disabled user support presents particular challenges for an online library service in the distance learning environment. The OU introduced guidelines for working with non-OU-authored content (external content) in 2011…

  5. Wide Open Spaces: Wikis Ready or Not

    ERIC Educational Resources Information Center

    Lamb, Brian

    2004-01-01

    Remember when the Internet was about opening up access to information and breaking down the barriers between content creators and content consumers? Think back to when spam was just a meatlike substance. To those heady days when Timothy Leary was predicting that the PC would be the LSD of the nineties. Before the DMCA. Before eBay. Back when the…

  6. Research Library Issues: A Bimonthly Report from ARL, CNI, and SPARC--A Special Issue on Strategies for Opening up Content. RLI 269

    ERIC Educational Resources Information Center

    Barrett, G. Jaia, Ed.

    2010-01-01

    "Research Library Issues" ("RLI") is a bimonthly report from ARL (Association of Research Libraries), CNI (Coalition of Networked Information), and SPARC (Scholarly Publishing and Academic Resources Coalition). This issue includes the following articles: (1) Strategies for Opening Up Content: Laying the Groundwork for an Open…

  7. A Near-Reality Approach to Improve the e-Learning Open Courseware

    ERIC Educational Resources Information Center

    Yu, Pao-Ta; Liao, Yuan-Hsun; Su, Ming-Hsiang

    2013-01-01

    The open courseware proposed by MIT with single streaming video has been widely accepted by most of the universities as their supplementary learning contents. In this streaming video, a digital video camera is used to capture the speaker's gesture and his/her PowerPoint presentation at the same time. However, the blurry content of PowerPoint…

  8. Lipid and fatty acid composition microalgae Chlorella vulgaris using photobioreactor and open pond

    NASA Astrophysics Data System (ADS)

    Jay, M. I.; Kawaroe, M.; Effendi, H.

    2018-03-01

Microalgae contain lipids and fatty acids that can serve as raw materials for biofuel. Previous studies have used cultivation systems to obtain biomass of C. vulgaris, from which lipid and fatty acid content can be extracted. Observations were made over ten days in a photobioreactor and an open pond; biomass was harvested using NaOH, lipids were extracted using hexane and methanol, and fatty acids were analysed using gas chromatography. Lipid content of the microalgae biomass in the photobioreactor and open pond was 2.26 ± 0.51% and 3.18 ± 0.80%, respectively. Fatty acid content ranged between 0.7-22.8% and 0.9-22.6%, and the dominant fatty acid in both cultivation systems was palmitic acid.

  9. OpenCFU, a new free and open-source software to count cell colonies and other circular objects.

    PubMed

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net.
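OpenCFU's actual pipeline is considerably more sophisticated (it scores candidate regions for circularity and intensity), but the core counting step can be sketched as connected-component labelling of a thresholded binary image; well-separated colonies then each form one component. Everything below is a generic illustration, not OpenCFU code.

```python
def count_blobs(grid):
    """Count connected foreground regions (4-connectivity) in a binary
    image given as a list of rows of 0/1 values, using an iterative
    flood fill to avoid recursion limits on large images."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                      # found a new, unvisited blob
                stack = [(r, c)]
                seen[r][c] = True
                while stack:                    # flood-fill this blob
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and grid[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count
```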

  10. Using Open Source Software in Visual Simulation Development

    DTIC Science & Technology

    2005-09-01

    increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a

  11. Open-Source Intelligence in the Czech Military: Knowledge System and Process Design

    DTIC Science & Technology

    2002-06-01

Open-Source Intelligence (OSINT), as one of the intelligence disciplines, bears some of the general problems of the intelligence "business"... ADAPTING KNOWLEDGE MANAGEMENT THEORY TO THE CZECH MILITARY INTELLIGENCE: Knowledge work is the core business of military intelligence... NAVAL POSTGRADUATE SCHOOL, Monterey, California. THESIS. Approved for public release; distribution is unlimited. OPEN-SOURCE INTELLIGENCE IN THE

  12. Writing in the Disciplines versus Corporate Workplaces: On the Importance of Conflicting Disciplinary Discourses in the Open Source Movement and the Value of Intellectual Property

    ERIC Educational Resources Information Center

    Ballentine, Brian D.

    2009-01-01

    Writing programs and more specifically, Writing in the Disciplines (WID) initiatives have begun to embrace the use of and the ideology inherent to, open source software. The Conference on College Composition and Communication has passed a resolution stating that whenever feasible educators and their institutions consider open source applications.…

  13. Anatomy of BioJS, an open source community for the life sciences.

    PubMed

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-07-08

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.

  14. Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library

    ERIC Educational Resources Information Center

    Fagan, Jody Condit; Keach, Jennifer A.

    2010-01-01

    When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…

  15. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    ERIC Educational Resources Information Center

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…

  16. A Framework for the Systematic Collection of Open Source Intelligence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pouchard, Line Catherine; Trien, Joseph P; Dobson, Jonathan D

    2009-01-01

Following legislative directions, the Intelligence Community has been mandated to make greater use of Open Source Intelligence (OSINT). Efforts are underway to increase the use of OSINT but there are many obstacles. One of these obstacles is the lack of tools helping to manage the volume of available data and ascertain its credibility. We propose a unique system for selecting, collecting and storing Open Source data from the Web and the Open Source Center. Some data management tasks are automated, document source is retained, and metadata containing geographical coordinates are added to the documents. Analysts are thus empowered to search, view, store, and analyze Web data within a single tool. We present ORCAT I and ORCAT II, two implementations of the system.

  17. Endozoicomonas dominates the gill and intestinal content microbiomes of Mytilus edulis from Barnegat Bay, New Jersey

    USGS Publications Warehouse

    Schill, William B.; Iwanowicz, Deborah; Adams, Cynthia

    2017-01-01

Blue mussels, Mytilus edulis Linnaeus 1758, from southern Barnegat Bay, New Jersey were examined to determine the make-up of the normal blue mussel microbiome. Sequencing of 16S ribosomal DNA amplicons from gill and intestinal content microbiomes using the Illumina® MiSeq platform yielded 1,276,161 paired end sequence reads from the gill libraries and 1,092,333 paired end sequence reads from the intestinal content libraries. General bioinformatic analyses were conducted with the open-source packages Qiime and Mothur. Phylotype assignments to the genus level were made using the commercial One Codex platform. This resulted in 1,697,852 gill and 988,436 intestinal content sequences being classified to genus. A majority of these (67.6% and 37.2% respectively) were assigned to a single operational taxonomic unit (Mytilus edulis Symbiont, MeS) that has homologies with other recently described Endozoicomonas pathogens and symbionts of marine invertebrates. MeS shares 98% identity with an uncultured bacterium from the gill tissue of an invasive Indo-Pacific oyster and with HQE1 and HQE2 isolated from the sea squirt, Styela clava. Other than MeS, most of the detected bacterial species are known from marine sediments and seawater.

  18. The open-source neutral-mass spectrometer on Atmosphere Explorer-C, -D, and -E.

    NASA Technical Reports Server (NTRS)

    Nier, A. O.; Potter, W. E.; Hickman, D. R.; Mauersberger, K.

    1973-01-01

    The open-source mass spectrometer will be used to obtain the number densities of the neutral atmospheric gases in the mass range 1 to 48 amu at the satellite location. The ion source has been designed to allow gas particles to enter the ionizing region with the minimum practicable number of prior collisions with surfaces. This design minimizes the loss of atomic oxygen and other reactive species due to reactions with the walls of the ion source. The principal features of the open-source spectrometer and the laboratory calibration system are discussed.

  19. Source identification and mass balance studies of mercury in Lake An-dong, S. Korea

    NASA Astrophysics Data System (ADS)

    Han, J.; Byeon, M.; Yoon, J.; Park, J.; Lee, M.; Huh, I.; Na, E.; Chung, D.; Shin, S.; Kim, Y.

    2009-12-01

In this study, mercury and methylmercury were measured in atmospheric, tributary, open-lake water column, sediment, plankton and fish samples in the catchment area of Lake An-dong, S. Korea. Lake An-dong, an artificial freshwater lake, is located upstream on the River Nak-dong. It has 51.5 km2 of open surface water and a hydraulic residence time of 1.33 years. It is a source of drinking water for 0.3 million S. Koreans. Recently, the possibility of mercury contamination became an issue, since recent studies showed that the lake had much higher mercury levels in sediment and in certain freshwater fish species than any other lake in S. Korea. The catchment area may have a history of mercury pollution, as it hosts more than 50 abandoned gold mines and the Young-poong zinc smelter. The objective of this study was to develop a mercury mass balance and identify possible mercury sources in the lake. The results are thus expected to offer valuable insights into the sources of mercury loading through the watershed. In order to estimate the mercury flux, TGM, RGM and particulate mercury were measured using a TEKRAN 2537 at five sites surrounding Lake An-dong from May 2009, together with wet and dry deposition. The fate and transport of mercury in the water body were predicted using EFDC (Environmental Fluid Dynamics Code) and the mercury module in WASP7 (Water Quality Analysis Simulation Program), after subsequent distribution into the water body and sediments, followed by bioaccumulation and ultimate uptake by humans. The mercury mass balance at the Young-poong zinc smelter was also pre-estimated by measuring mercury content in zinc ores, emission gases, sludge, wastewater and products.

  20. OpenFresco | OpenFresco

    Science.gov Websites

OpenFresco provides examples and tools for staff and research students learning about hybrid simulation and starting to use this experimental technique; it is developed by the Pacific Earthquake Engineering Research Center (PEER) and others.

  1. Pork as a Source of Omega-3 (n-3) Fatty Acids

    PubMed Central

    Dugan, Michael E.R.; Vahmani, Payam; Turner, Tyler D.; Mapiye, Cletos; Juárez, Manuel; Prieto, Nuria; Beaulieu, Angela D.; Zijlstra, Ruurd T.; Patience, John F.; Aalhus, Jennifer L.

    2015-01-01

Pork is the most widely eaten meat in the world, but typical feeding practices give it a high omega-6 (n-6) to omega-3 (n-3) fatty acid ratio and make it a poor source of n-3 fatty acids. Feeding pigs n-3 fatty acids can increase their content in pork, and in countries where label claims are permitted, claims can be met with limited feeding of n-3 fatty acid enriched feedstuffs, provided contributions of both fat and muscle are included in pork servings. Pork enriched with n-3 fatty acids is, however, not widely available. Producing and marketing n-3 fatty acid enriched pork requires regulatory approval, incurs development and quality control costs, may increase production costs, and enriched pork has to be tracked to retail and sold at a premium. Mandatory labelling of the n-6/n-3 ratio and the n-3 fatty acid content of pork may help drive production of n-3 fatty acid enriched pork, and open the door to population-based disease prevention policies (i.e., food taxes to provide incentives to improve production practices). A shift from the status quo, however, will require stronger signals along the value chain indicating that production of n-3 fatty acid enriched pork is an industry priority. PMID:26694475

  2. Pork as a Source of Omega-3 (n-3) Fatty Acids.

    PubMed

    Dugan, Michael E R; Vahmani, Payam; Turner, Tyler D; Mapiye, Cletos; Juárez, Manuel; Prieto, Nuria; Beaulieu, Angela D; Zijlstra, Ruurd T; Patience, John F; Aalhus, Jennifer L

    2015-12-16

Pork is the most widely eaten meat in the world, but typical feeding practices give it a high omega-6 (n-6) to omega-3 (n-3) fatty acid ratio and make it a poor source of n-3 fatty acids. Feeding pigs n-3 fatty acids can increase their content in pork, and in countries where label claims are permitted, claims can be met with limited feeding of n-3 fatty acid enriched feedstuffs, provided contributions of both fat and muscle are included in pork servings. Pork enriched with n-3 fatty acids is, however, not widely available. Producing and marketing n-3 fatty acid enriched pork requires regulatory approval, incurs development and quality control costs, may increase production costs, and enriched pork has to be tracked to retail and sold at a premium. Mandatory labelling of the n-6/n-3 ratio and the n-3 fatty acid content of pork may help drive production of n-3 fatty acid enriched pork, and open the door to population-based disease prevention policies (i.e., food taxes to provide incentives to improve production practices). A shift from the status quo, however, will require stronger signals along the value chain indicating that production of n-3 fatty acid enriched pork is an industry priority.

  3. A Three-Dimensional Approach and Open Source Structure for the Design and Experimentation of Teaching-Learning Sequences: The case of friction

    NASA Astrophysics Data System (ADS)

    Besson, Ugo; Borghi, Lidia; De Ambrosis, Anna; Mascheretti, Paolo

    2010-07-01

We have developed a teaching-learning sequence (TLS) on friction based on a preliminary study involving three dimensions: an analysis of didactic research on the topic, an overview of usual approaches, and a critical analysis of the subject, considered also in its historical development. We found that most of the usual presentations do not take into account the complexity of friction as it emerges from scientific research, may reinforce some of students' inaccurate conceptions, and favour a limited vision of friction phenomena. The TLS we propose begins by considering a wide range of friction phenomena to favour an initial motivation and a broader view of the topic and then develops a path of interrelated observations, experiments, and theoretical aspects. It proposes the use of structural models, involving visual representations and stimulating intuition, aimed at helping students build mental models of friction mechanisms. To facilitate reproducibility in school contexts, the sequence is designed as an open source structure, with a core of contents, conceptual correlations and methodological choices, and a cloud of elements that can be re-designed by teachers. The sequence has been tested in teacher education and in upper secondary school, and has shown positive results in overcoming student difficulties and stimulating richer reasoning based on the structural models we suggested. The proposed path has modified the teachers' view of the topic, producing a motivation to change their traditional presentations. The open structure of the sequence has facilitated its implementation by teachers in schools, in coherence with the rationale of the proposal.

  4. Preparing a scientific manuscript in Linux: Today's possibilities and limitations

    PubMed Central

    2011-01-01

Background An increasing number of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow the preparation of a submission-ready scientific manuscript without the need to use proprietary software. Findings Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes the key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, and discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux. PMID:22018246

  5. Open source bioimage informatics for cell biology.

    PubMed

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what make an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery.

  6. Implementation, reliability, and feasibility test of an Open-Source PACS.

    PubMed

    Valeri, Gianluca; Zuccaccia, Matteo; Badaloni, Andrea; Ciriaci, Damiano; La Riccia, Luigi; Mazzoni, Giovanni; Maggi, Stefania; Giovagnoni, Andrea

    2015-12-01

The aim was to implement a hardware and software system able to perform the major functions of an Open-Source PACS, and to analyze it in a simulated real-world environment. A small home network was implemented, and the Open-Source operating system Ubuntu 11.10 was installed on a laptop running the Dcm4chee suite together with the required software components. The Open-Source PACS implemented is compatible with Linux OS, Microsoft OS, and Mac OS X; furthermore, it was used with the operating systems of portable devices (smartphones, tablets), Android and iOS. An OSS PACS is useful for running tutorials and workshops on post-processing techniques for educational and training purposes.

  7. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence have been identified that exist during various times of the data lifecycle. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and to improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  8. Opening up Education: The Collective Advancement of Education through Open Technology, Open Content, and Open Knowledge

    ERIC Educational Resources Information Center

    Iiyoshi, Toru, Ed.; Kumar, M. S. Vijay, Ed.

    2008-01-01

    Given the abundance of open education initiatives that aim to make educational assets freely available online, the time seems ripe to explore the potential of open education to transform the economics and ecology of education. Despite the diversity of tools and resources already available--from well-packaged course materials to simple games, for…

  9. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models exists that is capable of all these tasks: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach for interacting with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough that models can interact even if they are coded in different languages, represent processes from different domains or have different spatial and temporal resolutions. An open source framework that bridges OpenMI and OpenDA is presented. The framework provides a generic and easy means for any OpenMI-compliant model to assimilate observation measurements. An example test case will be presented using MIKE SHE, an OpenMI-compliant, fully coupled, integrated hydrological model that can accurately simulate the feedback dynamics of overland flow, the unsaturated zone and the saturated zone.
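The error-minimization principle described above reduces, in the simplest scalar case, to one Kalman-style analysis step: blend the model forecast and the observation with a gain set by their error variances. Toolboxes such as OpenDA generalize exactly this building block to state vectors and ensembles; the function below is a generic illustration, not OpenDA's API.

```python
def assimilate(forecast, forecast_var, obs, obs_var):
    """Scalar Kalman analysis step: the gain weights the observation by
    the relative confidence in the forecast, and the analysis error
    variance is always smaller than the forecast error variance."""
    gain = forecast_var / (forecast_var + obs_var)
    analysis = forecast + gain * (obs - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var
```

With equal forecast and observation variances, the analysis falls exactly halfway between the two values and its error variance halves; a more confident observation pulls the analysis toward itself.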

  10. The effect of crumb rubber particle size to the optimum binder content for open graded friction course.

    PubMed

    Ibrahim, Mohd Rasdan; Katman, Herda Yati; Karim, Mohamed Rehan; Koting, Suhana; Mashaan, Nuha S

    2014-01-01

The main objective of this paper is to investigate the relations among rubber size, rubber content, and binder content in determining the optimum binder content for open graded friction course (OGFC). Mix gradation type B, as specified in the Specification for Porous Asphalt produced by the Road Engineering Association of Malaysia (REAM), was used in this study. Marshall specimens were prepared with four different sizes of rubber, namely, 20 mesh [0.841 mm], 40 mesh [0.42 mm], 80 mesh [0.177 mm], and 100 mesh [0.149 mm], with different concentrations of rubberised bitumen (4%, 8%, and 12%) and different percentages of binder content (4%-7%). The appropriate optimum binder content is then selected according to the results of the air voids, binder draindown, and abrasion loss tests. Test results show that crumb rubber particle size can affect the optimum binder content for OGFC.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.

When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
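What these solvers do at scale can be shown in miniature. For a bounded two-variable LP (maximize c·x subject to a·x ≤ b, x ≥ 0), an optimum lies at a vertex of the feasible polygon, so enumerating pairwise intersections of constraint boundaries and keeping the best feasible point reproduces the textbook answer. This brute force is illustrative only; CLP, GLPK, and CPLEX use simplex or interior-point methods.

```python
from itertools import combinations

def solve_lp_2d(c, constraints):
    """Maximize c[0]*x + c[1]*y subject to a1*x + a2*y <= b for each
    ((a1, a2), b) in constraints, plus x >= 0 and y >= 0, by checking
    every intersection of two constraint boundaries for feasibility.
    Returns (objective value, x, y) or None if no feasible vertex."""
    cons = list(constraints) + [((-1.0, 0.0), 0.0), ((0.0, -1.0), 0.0)]
    best = None
    for ((a1, a2), b1), ((a3, a4), b2) in combinations(cons, 2):
        det = a1 * a4 - a2 * a3
        if abs(det) < 1e-12:
            continue  # parallel boundaries never meet in a vertex
        # Cramer's rule for the 2x2 intersection
        x = (b1 * a4 - b2 * a2) / det
        y = (a1 * b2 - a3 * b1) / det
        if all(a[0] * x + a[1] * y <= b + 1e-9 for a, b in cons):
            value = c[0] * x + c[1] * y
            if best is None or value > best[0]:
                best = (value, x, y)
    return best
```

For example, maximizing 3x + 2y subject to x + y ≤ 4 and x ≤ 2 gives the optimum 10 at the vertex (2, 2).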

  12. Open source OCR framework using mobile devices

    NASA Astrophysics Data System (ADS)

    Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan

    2008-02-01

    Mobile phones have evolved from passive one-to-one communication devices into powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet, and much more. Exciting new social applications are emerging on the mobile landscape, such as business card readers, sign detectors, and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. However, despite these advancements, very little open source software is available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none are available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community. This includes a popular open source OCR engine named Tesseract for text detection and recognition, and the Flite speech synthesis module for adding text-to-speech ability.
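    A pipeline of this shape can be sketched by chaining the two command-line tools the authors name. The file names and the `run_pipeline` helper below are illustrative, not part of the authors' framework; the commands follow the standard Tesseract and Flite CLI conventions, and nothing is executed unless the tools are present.

```python
import shutil
import subprocess

def build_ocr_command(image_path, output_base, lang="eng"):
    """Tesseract CLI invocation: writes recognized text to <output_base>.txt."""
    return ["tesseract", image_path, output_base, "-l", lang]

def build_tts_command(text_file, wav_file):
    """Flite CLI invocation: synthesizes the text file to a WAV file."""
    return ["flite", "-f", text_file, "-o", wav_file]

def run_pipeline(image_path):
    """Chain OCR and speech synthesis, mirroring the framework's two stages."""
    for cmd in (build_ocr_command(image_path, "out"),
                build_tts_command("out.txt", "out.wav")):
        if shutil.which(cmd[0]) is None:
            print(f"{cmd[0]} not installed; would run: {' '.join(cmd)}")
        else:
            subprocess.run(cmd, check=True)

# Show the commands the pipeline would issue for a hypothetical input image.
print(" ".join(build_ocr_command("card.png", "out")))
print(" ".join(build_tts_command("out.txt", "out.wav")))
```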

  13. Open-source colorimeter.

    PubMed

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-04-19

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.
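    The core computation of any colorimeter, open-source or commercial, is the Beer-Lambert absorbance of the sample relative to a blank, mapped to concentration through a linear calibration curve. A minimal sketch (the sensor counts and calibration values are illustrative, not taken from the paper):

```python
import math

def absorbance(sample_intensity, blank_intensity):
    """Beer-Lambert absorbance A = -log10(I / I0) from raw sensor readings."""
    return -math.log10(sample_intensity / blank_intensity)

def concentration(a, slope, intercept=0.0):
    """Map absorbance to concentration via a linear calibration curve."""
    return (a - intercept) / slope

# Blank reading of 1000 counts, sample reading of 500 counts:
a = absorbance(500, 1000)
print(round(a, 3))  # 0.301
```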

  14. Open-Source Colorimeter

    PubMed Central

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032

  15. OpenMebius: an open source software for isotopically nonstationary 13C-based metabolic flux analysis.

    PubMed

    Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi

    2014-01-01

    The in vivo measurement of metabolic flux by (13)C-based metabolic flux analysis ((13)C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a (13)C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas (13)C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary (13)C metabolic flux analysis (INST-(13)C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms under stationary phase). Here, the development of a novel open source software for INST-(13)C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) provides the function of autogenerating metabolic models for simulating isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-(13)C-MFA. Confidence intervals determined by INST-(13)C-MFA were less than those determined by conventional methods, indicating the potential of INST-(13)C-MFA for precise metabolic flux analysis. OpenMebius is the open source software for the general application of INST-(13)C-MFA.
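    OpenMebius itself is driven by a user-defined configuration worksheet, but the nonstationary labeling dynamics it simulates can be sketched for the simplest case: one well-mixed metabolite pool of size X fed at flux v by a fully labeled substrate, giving dx/dt = (v/X)(x_in - x). The Euler integration below illustrates that dynamic only; it is not OpenMebius code, and all parameter values are illustrative.

```python
def simulate_enrichment(v, pool_size, x_substrate=1.0, dt=0.01, t_end=5.0):
    """Euler integration of dx/dt = (v / X) * (x_in - x) for a single
    metabolite pool fed by a fully labeled substrate.
    Returns (times, enrichments)."""
    x, t = 0.0, 0.0
    times, xs = [t], [x]
    while t < t_end:
        x += dt * (v / pool_size) * (x_substrate - x)
        t += dt
        times.append(t)
        xs.append(x)
    return times, xs

times, xs = simulate_enrichment(v=2.0, pool_size=1.0)
print(round(xs[-1], 3))  # approaches 1.0 (fully labeled) as t grows
```

Fitting measured enrichment time courses against such simulations for every pool simultaneously is what turns the labeling data into flux estimates.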

  16. The Isonzo front in the First World War: glass ampoules found in the vicinity of the village Kred.

    PubMed

    Krbavcic, Ales

    2015-01-01

    To identify the contents of ampoules stored at the WWI Kobarid Museum, Slovenia. Sources and methods: Analysis of ampoules from the Kobarid Museum using pharmacopoeial methods. The contents of the unlabelled ampoules were identified as calcium hypochlorite, a decontaminant for mustard gas (Yperite). The Isonzo front/Soška fronta was opened on May 24, 1915 by the Kingdom of Italy in accordance with the secret Treaty of London. In exchange for the opening of this front, the Kingdom of Italy would be granted large tracts of territory in the western provinces of the Austro-Hungarian Empire and along the Adriatic coast. The ensuing trench warfare during the eleven Isonzo battles ended with the 12th battle, known as the Kobarid/Karfreit/Caporetto breakthrough, in October 1917. The joint German and Austro-Hungarian forces launched a massive gas attack with dichloroarsine and phosgene, which was later disclosed as the horrifying overture to the general disordered retreat of the Italian troops to the Piave. The possibility of a chemical attack was underestimated by the Italian high command, as shown by the ineffective gas masks issued to the troops. However, the recent find of ampoules with calcium hypochlorite at the village of Kred, now exhibited at the Kobarid WWI Museum, leads to the conclusion that the Italian IVth army's command, located in Kred, considered decontamination measures against Yperite necessary.

  17. Simulation for Dynamic Situation Awareness and Prediction III

    DTIC Science & Technology

    2010-03-01

    source Java™ library for capturing and sending network packets; 4) Groovy – an open source, Java-based scripting language (version 1.6 or newer). Open...DMOTH Analyzer application. Groovy is an open source dynamic scripting language for the Java Virtual Machine. It is consistent with Java syntax...between temperature, pressure, wind and relative humidity, and 3) a precipitation editing algorithm. The Editor can be used to prepare scripted changes

  18. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  19. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative of implementing an institutional repository using the DSpace open source software.
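    The article's stack is MySQL and PHP; the same database-backed pattern can be sketched in Python with the standard library's sqlite3 module, where page content lives in a table rather than in static HTML files (the table and page content here are illustrative):

```python
import sqlite3

# In-memory database standing in for the library's MySQL backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (slug TEXT PRIMARY KEY, title TEXT, body TEXT)")
conn.execute("INSERT INTO pages VALUES (?, ?, ?)",
             ("about", "About the Library", "Opening hours and contact details."))

def render_page(slug):
    """Build an HTML page from a database row instead of a static file."""
    row = conn.execute("SELECT title, body FROM pages WHERE slug = ?",
                       (slug,)).fetchone()
    if row is None:
        return "<h1>404 Not Found</h1>"
    title, body = row
    return f"<h1>{title}</h1>\n<p>{body}</p>"

print(render_page("about"))
```

Because the content is data, edits happen through the database rather than by hand-editing HTML, which is the advantage the paper attributes to this approach.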

  20. Open source tools for fluorescent imaging.

    PubMed

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  2. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    PubMed

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

    There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).

  3. The use of open source electronic health records within the federal safety net

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective To conduct a federally funded study that examines the acquisition, implementation and operation of open source electronic health records (EHR) within safety net medical settings, such as federally qualified health centers (FQHC). Methods and materials The study was conducted by the National Opinion Research Center (NORC) at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to West Virginia, California and Arizona FQHC that were currently using an open source EHR. Results Five of the six sites that were chosen as part of the study found a number of advantages in the use of their open source EHR system, such as utilizing a large community of users and developers to modify their EHR to fit the needs of their provider and patient communities, and lower acquisition and implementation costs as compared to a commercial system. Discussion Despite these advantages, many of the informants and site visit participants felt that widespread dissemination and use of open source was restrained due to a negative connotation regarding this type of software. In addition, a number of participants stated that there is a necessary level of technical acumen needed within the FQHC to make an open source EHR effective. Conclusions An open source EHR provides advantages for FQHC that have limited resources to acquire and implement an EHR, but additional study is needed to evaluate its overall effectiveness. PMID:23744787

  4. Library Subject Guides: A Content Management Case Study at the Open University, UK

    ERIC Educational Resources Information Center

    Wales, Tim

    2005-01-01

    Purpose: To share the experiences and challenges faced by the Open University Library (OUL) in developing a content management (CM) system for its subject guides. Design/methodology/approach: A summary of multi-format subject guide production at the OUL is provided to justify the decision to develop a new system for their production using a…

  5. APPLYING OPEN-PATH OPTICAL SPECTROSCOPY TO HEAVY-DUTY DIESEL EMISSIONS

    EPA Science Inventory

    Non-dispersive infrared absorption has been used to measure gaseous emissions for both stationary and mobile sources. Fourier transform infrared spectroscopy has been used for stationary sources as both extractive and open-path methods. We have applied the open-path method for bo...

  6. Virtual Patients on the Semantic Web: A Proof-of-Application Study

    PubMed Central

    Dafli, Eleni; Antoniou, Panagiotis; Ioannidis, Lazaros; Dombros, Nicholas; Topps, David

    2015-01-01

    Background Virtual patients are interactive computer simulations that are increasingly used as learning activities in modern health care education, especially in teaching clinical decision making. A key challenge is how to retrieve and repurpose virtual patients as unique types of educational resources between different platforms because of the lack of standardized content-retrieving and repurposing mechanisms. Semantic Web technologies provide the capability, through structured information, for easy retrieval, reuse, repurposing, and exchange of virtual patients between different systems. Objective An attempt to address this challenge has been made through the mEducator Best Practice Network, which provisioned frameworks for the discovery, retrieval, sharing, and reuse of medical educational resources. We have extended the OpenLabyrinth virtual patient authoring and deployment platform to facilitate the repurposing and retrieval of existing virtual patient material. Methods A standalone Web distribution and Web interface, which contains an extension for the OpenLabyrinth virtual patient authoring system, was implemented. This extension was designed to semantically annotate virtual patients to facilitate intelligent searches, complex queries, and easy exchange between institutions. The OpenLabyrinth extension enables OpenLabyrinth authors to integrate and share virtual patient case metadata within the mEducator3.0 network. Evaluation included 3 successive steps: (1) expert reviews; (2) evaluation of the ability of health care professionals and medical students to create, share, and exchange virtual patients through specific scenarios in extended OpenLabyrinth (OLabX); and (3) evaluation of the repurposed learning objects that emerged from the procedure. Results We evaluated 30 repurposed virtual patient cases. The evaluation, with a total of 98 participants, demonstrated the system’s main strength: the core repurposing capacity. 
The extensive metadata schema presentation facilitated user exploration and filtering of resources. Usability weaknesses were primarily related to standard computer applications’ ease of use provisions. Most evaluators provided positive feedback regarding educational experiences on both content and system usability. Evaluation results replicated across several independent evaluation events. Conclusions The OpenLabyrinth extension, as part of the semantic mEducator3.0 approach, is a virtual patient sharing approach that builds on a collection of Semantic Web services and federates existing sources of clinical and educational data. It is an effective sharing tool for virtual patients and has been merged into the next version of the app (OpenLabyrinth 3.3). Such tool extensions may enhance the medical education arsenal with capacities of creating simulation/game-based learning episodes, massive open online courses, curricular transformations, and a future robust infrastructure for enabling mobile learning. PMID:25616272

  7. Virtual patients on the semantic Web: a proof-of-application study.

    PubMed

    Dafli, Eleni; Antoniou, Panagiotis; Ioannidis, Lazaros; Dombros, Nicholas; Topps, David; Bamidis, Panagiotis D

    2015-01-22

    Virtual patients are interactive computer simulations that are increasingly used as learning activities in modern health care education, especially in teaching clinical decision making. A key challenge is how to retrieve and repurpose virtual patients as unique types of educational resources between different platforms because of the lack of standardized content-retrieving and repurposing mechanisms. Semantic Web technologies provide the capability, through structured information, for easy retrieval, reuse, repurposing, and exchange of virtual patients between different systems. An attempt to address this challenge has been made through the mEducator Best Practice Network, which provisioned frameworks for the discovery, retrieval, sharing, and reuse of medical educational resources. We have extended the OpenLabyrinth virtual patient authoring and deployment platform to facilitate the repurposing and retrieval of existing virtual patient material. A standalone Web distribution and Web interface, which contains an extension for the OpenLabyrinth virtual patient authoring system, was implemented. This extension was designed to semantically annotate virtual patients to facilitate intelligent searches, complex queries, and easy exchange between institutions. The OpenLabyrinth extension enables OpenLabyrinth authors to integrate and share virtual patient case metadata within the mEducator3.0 network. Evaluation included 3 successive steps: (1) expert reviews; (2) evaluation of the ability of health care professionals and medical students to create, share, and exchange virtual patients through specific scenarios in extended OpenLabyrinth (OLabX); and (3) evaluation of the repurposed learning objects that emerged from the procedure. We evaluated 30 repurposed virtual patient cases. The evaluation, with a total of 98 participants, demonstrated the system's main strength: the core repurposing capacity. 
The extensive metadata schema presentation facilitated user exploration and filtering of resources. Usability weaknesses were primarily related to standard computer applications' ease of use provisions. Most evaluators provided positive feedback regarding educational experiences on both content and system usability. Evaluation results replicated across several independent evaluation events. The OpenLabyrinth extension, as part of the semantic mEducator3.0 approach, is a virtual patient sharing approach that builds on a collection of Semantic Web services and federates existing sources of clinical and educational data. It is an effective sharing tool for virtual patients and has been merged into the next version of the app (OpenLabyrinth 3.3). Such tool extensions may enhance the medical education arsenal with capacities of creating simulation/game-based learning episodes, massive open online courses, curricular transformations, and a future robust infrastructure for enabling mobile learning.

  8. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.

    PubMed

    Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.
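    The modularity described here typically means the acquisition loop depends only on a small camera interface, so hardware drivers can be swapped without touching control logic. The sketch below illustrates that pattern with a mock camera; it is not Tormenta's actual API, and all names are illustrative.

```python
import random

class MockCamera:
    """Stands in for a hardware camera driver behind a minimal interface.
    Swapping hardware means swapping this class; the loop below is unchanged."""
    def __init__(self, width=4, height=4):
        self.width, self.height = width, height

    def grab_frame(self):
        """Return one frame as a list of pixel rows (random data here)."""
        return [[random.randint(0, 255) for _ in range(self.width)]
                for _ in range(self.height)]

def acquire(camera, n_frames):
    """Collect frames and report the mean intensity of each."""
    means = []
    for _ in range(n_frames):
        frame = camera.grab_frame()
        flat = [px for row in frame for px in row]
        means.append(sum(flat) / len(flat))
    return means

means = acquire(MockCamera(), n_frames=3)
print(len(means))  # 3
```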

  9. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy

    NASA Astrophysics Data System (ADS)

    Barabas, Federico M.; Masullo, Luciano A.; Stefani, Fernando D.

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  10. OpenCFU, a New Free and Open-Source Software to Count Cell Colonies and Other Circular Objects

    PubMed Central

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net. PMID:23457446
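    OpenCFU's detection is more sophisticated (it filters for circularity and handles touching colonies), but the core of any colony counter is labeling connected regions of above-threshold pixels. A minimal flood-fill sketch on a toy image, not OpenCFU's algorithm:

```python
from collections import deque

# Toy 8x8 "plate" image: nonzero pixels belong to colonies.
image = [
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0, 7, 0],
    [0, 9, 9, 0, 0, 7, 7, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 8, 8, 0, 0, 0],
    [0, 0, 0, 8, 8, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
]

def count_colonies(img, threshold=1):
    """Count 4-connected components of above-threshold pixels via BFS."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] >= threshold and not seen[y][x]:
                count += 1
                queue = deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

print(count_colonies(image))  # 3
```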

  11. Utilization of open source electronic health record around the world: A systematic review.

    PubMed

    Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam

    2014-01-01

    Many projects on developing Electronic Health Record (EHR) systems have been carried out in many countries. The current study was conducted to review the published data on the utilization of open source EHR systems in different countries all over the world. Using free text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in several stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. This creates opportunities to improve the national healthcare level, especially in developing countries with minimal financial resources. Open source technology is a solution to overcome the problems of high costs and inflexibility associated with proprietary health information systems.

  12. Semiotic foundation for multisensor-multilook fusion

    NASA Astrophysics Data System (ADS)

    Myler, Harley R.

    1998-07-01

    This paper explores the application of semiotic principles to the design of a multisensor-multilook fusion system. Semiotics is an approach to analysis that attempts to process media in a unified way using qualitative methods as opposed to quantitative ones. The term semiotic refers to signs, or signatory data that encapsulates information. Semiotic analysis involves the extraction of signs from information sources and the subsequent processing of the signs into meaningful interpretations of the information content of the source. The multisensor fusion problem predicated on a semiotic system structure and incorporating semiotic analysis techniques is explored, and the design of a multisensor system as an information fusion system is presented. Semiotic analysis opens the possibility of using non-traditional sensor sources and modalities in the fusion process, such as verbal and textual intelligence derived from human observers. Examples of how multisensor/multimodality data might be analyzed semiotically are shown, and how a semiotic system for multisensor fusion could be realized is outlined. The architecture of a semiotic multisensor fusion processor that can accept situational awareness data is described, although an implementation has not yet been constructed.

  13. Graduate entry nurses' initial perspectives on nursing: Content analysis of open-ended survey questions.

    PubMed

    McKenna, Lisa; Brooks, Ingrid; Vanderheide, Rebecca

    2017-02-01

    Graduate entry nursing courses offer individuals with prior degrees the opportunity to gain nursing qualifications and facilitate career change. While it is known that accelerated graduate entry courses are increasingly popular, the perceptions of nursing held by such individuals, and the influences on those seeking to enter the profession, are less clearly understood. To explore graduate entry nursing students' perceptions of nursing on entering their pre-registration course. A descriptive design utilising a cross-sectional survey with two open-ended questions: "What do you believe the role of the nurse is?" and "What things have influenced that view?". Demographic data were analysed using descriptive frequencies, while the two open-ended questions were analysed using summative content analysis. The setting was one university-based postgraduate graduate entry nursing course in Australia; participants were eight cohorts (n=286) of commencing students with prior degrees other than nursing. The course attracts students from diverse backgrounds. Exposure to nursing and nurses, either as a consumer of health care or in another health care role, plays a primary role in influencing career change. However, similar to findings with school leavers, there remains much misinformation about nurses' roles among students in these courses. Most identify the role of caring in nursing. For some, media representations are the only information sources. Graduate entry courses offer opportunities to attract new nurses and contribute to addressing workforce shortages. However, there is still a lack of knowledge of nursing roles among students on entry. More work is required by the profession to ensure nursing is accurately and positively represented to the community. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  14. Limits to Open Class Performance?

    NASA Technical Reports Server (NTRS)

    Bowers, Albion H.

    2007-01-01

    This viewgraph presentation describes the limits to open class performance. The contents include: 1) Standard Class; 2) 15m/Racing Class; 3) Open Class; and 4) Design Solutions associated with assumptions, limiting parameters, airfoil performance, current trends, and analysis.

  15. A New Architecture for Visualization: Open Mission Control Technologies

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2017-01-01

    Open Mission Control Technologies (MCT) is a new architecture for visualisation of mission data. Driven by requirements for new mission capabilities, including distributed mission operations, access to data anywhere, customization by users, synthesis of multiple data sources, and flexibility for multi-mission adaptation, Open MCT provides users with an integrated customizable environment. Developed at NASA's Ames Research Center (ARC), in collaboration with NASA's Advanced Multimission Operations System (AMMOS) and NASA's Jet Propulsion Laboratory (JPL), Open MCT is getting its first mission use on the Jason 3 Mission, and is also available in the testbed for the Mars 2020 Rover and for development use for NASA's Resource Prospector Lunar Rover. The open source nature of the project provides for use outside of space missions, including open source contributions from a community of users. The defining features of Open MCT for mission users are data integration, end user composition and multiple views. Data integration provides access to mission data across domains in one place, making data such as activities, timelines, telemetry, imagery, event timers and procedures available in one place, without application switching. End user composition provides users with layouts, which act as a canvas to assemble visualisations. Multiple views provide the capability to view the same data in different ways, with live switching of data views in place. Open MCT is browser based, and works on the desktop as well as tablets and phones, providing access to data anywhere. An early use case for mobile data access took place on the Resource Prospector (RP) Mission Distributed Operations Test, in which rover engineers in the field were able to view telemetry on their phones. We envision this capability providing decision support to on-console operators from off-duty personnel. The plug-in architecture also allows for adaptation for different mission capabilities. 
Different data types and capabilities may be added or removed using plugins. An API provides a means to write new capabilities and to create data adaptors. Data plugins exist for mission data sources for NASA missions. Adaptors have been written by international and commercial users. Open MCT is open source. Open source enables collaborative development across organizations and also makes the product available outside of the space community, providing a potential source of usage and ideas to drive product design and development. The combination of open source with an Apache 2 license, and distribution on GitHub, has enabled an active community of users and contributors. The spectrum of users for Open MCT is, to our knowledge, unprecedented for mission software. In addition to our NASA users, we have, through open source, had users and inquiries on projects ranging from Internet of Things, to radio hobbyists, to farming projects. We have an active community of contributors, enabling a flow of ideas inside and outside of the space community.
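    Open MCT itself is JavaScript, but the plugin pattern described here, where a plugin is a function that receives the host application and registers capabilities, can be sketched in a few lines of Python (the registry class and the Jason 3 adaptor name below are purely illustrative, not Open MCT's API):

```python
class Registry:
    """Minimal plugin host: plugins are callables that receive the host
    application and register capabilities, e.g. telemetry data adaptors."""
    def __init__(self):
        self.telemetry_adapters = {}

    def install(self, plugin):
        plugin(self)

def jason3_adapter(app):
    """Hypothetical data adaptor plugin contributing a telemetry source."""
    app.telemetry_adapters["jason3"] = lambda key: f"value for {key}"

app = Registry()
app.install(jason3_adapter)
print(app.telemetry_adapters["jason3"]("battery.voltage"))
```

The design choice is that the host never imports mission-specific code; capabilities flow in through installed plugins, which is what makes multi-mission adaptation and outside contributions practical.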

  16. An open-source solution for advanced imaging flow cytometry data analysis using machine learning.

    PubMed

    Hennig, Holger; Rees, Paul; Blasi, Thomas; Kamentsky, Lee; Hung, Jane; Dao, David; Carpenter, Anne E; Filby, Andrew

    2017-01-01

    Imaging flow cytometry (IFC) enables the high-throughput collection of morphological and spatial information from hundreds of thousands of single cells. This high-content, information-rich image data can in theory resolve important biological differences among complex, often heterogeneous biological samples. However, data analysis is often performed in a highly manual and subjective manner using very limited image analysis techniques in combination with conventional flow cytometry gating strategies. This approach is not scalable to the hundreds of available image-based features per cell and thus makes use of only a fraction of the spatial and morphometric information. As a result, the quality, reproducibility and rigour of results are limited by the skill, experience and ingenuity of the data analyst. Here, we describe a pipeline using open-source software that leverages the rich information in digital imagery using machine learning algorithms. Raw image files (.rif) from an imaging flow cytometer are compensated and corrected into the proprietary .cif file format, then imported into the open-source software CellProfiler, where an image processing pipeline identifies cells and subcellular compartments, allowing hundreds of morphological features to be measured. This high-dimensional data can then be analysed using cutting-edge machine learning and clustering approaches on "user-friendly" platforms such as CellProfiler Analyst. Researchers can train an automated cell classifier to recognize different cell types, cell cycle phases, drug treatment/control conditions, etc., using supervised machine learning. This workflow should enable the scientific community to leverage the full analytical power of IFC-derived data sets. It will help to reveal otherwise unappreciated populations of cells based on features that may be hidden to the human eye, including subtle measured differences in label-free detection channels such as bright-field and dark-field imagery.
Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
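
    The supervised-classification step described above can be illustrated with a toy sketch: a nearest-centroid classifier over a synthetic per-cell feature table of (area, eccentricity). The feature values, class labels and separations are invented for illustration; the workflow in the abstract uses CellProfiler-measured features and CellProfiler Analyst's own machine learning tools.

```python
import random
from statistics import mean

random.seed(0)

# Hypothetical per-cell feature table: (area, eccentricity) per cell.
# "mitotic" cells are drawn smaller and rounder than "interphase" cells.
def make_cells(n, area_mu, ecc_mu, label):
    return [((random.gauss(area_mu, 5.0), random.gauss(ecc_mu, 0.05)), label)
            for _ in range(n)]

train = make_cells(50, 100.0, 0.8, "interphase") + make_cells(50, 60.0, 0.3, "mitotic")

def centroids(cells):
    # One mean feature vector per class label.
    out = {}
    for label in {lab for _, lab in cells}:
        feats = [f for f, lab in cells if lab == label]
        out[label] = tuple(mean(dim) for dim in zip(*feats))
    return out

def classify(feat, cents):
    # Assign the label whose centroid is nearest (squared Euclidean distance).
    return min(cents, key=lambda lab: sum((a - b) ** 2 for a, b in zip(feat, cents[lab])))

cents = centroids(train)
held_out = make_cells(20, 100.0, 0.8, "interphase") + make_cells(20, 60.0, 0.3, "mitotic")
accuracy = mean(1.0 if classify(f, cents) == lab else 0.0 for f, lab in held_out)
print(f"held-out accuracy: {accuracy:.2f}")
```

    Real IFC feature tables have hundreds of columns rather than two, which is exactly why the abstract argues for automated classifiers over manual gating.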

  17. An Integrated SNP Mining and Utilization (ISMU) Pipeline for Next Generation Sequencing Data

    PubMed Central

    Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M.; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A. V. S. K.; Varshney, Rajeev K.

    2014-01-01

    Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, a daunting combination for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline, Integrated SNP Mining and Utilization (ISMU), has been developed by integrating several open source next generation sequencing (NGS) tools behind a graphical user interface, for SNP discovery and for utilizing SNPs in genotyping assays. The pipeline features pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value, with flanking sequences, for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. The pipeline is particularly useful to the plant genetics and breeding community with no computational expertise, enabling SNP discovery and use in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets.
It has been developed in Java language and is available at http://hpc.icrisat.cgiar.org/ISMU as a standalone free software. PMID:25003610
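
    The "polymorphism information content value" mentioned above has a standard definition in terms of allele frequencies (Botstein et al.); a minimal sketch of that formula (not ISMU code) follows:

```python
from itertools import combinations

def pic(freqs):
    """Polymorphism information content of a marker:
    PIC = 1 - sum_i p_i^2 - sum_{i<j} 2 * p_i^2 * p_j^2,
    where p_i are allele frequencies summing to 1."""
    assert abs(sum(freqs) - 1.0) < 1e-9
    homozygosity = sum(p * p for p in freqs)
    double_het = sum(2.0 * p * p * q * q for p, q in combinations(freqs, 2))
    return 1.0 - homozygosity - double_het

# Biallelic SNP with equal allele frequencies reaches the biallelic maximum:
print(round(pic([0.5, 0.5]), 3))  # 0.375
```

    Because a SNP is typically biallelic, its PIC tops out at 0.375; multi-allelic markers can score higher, which is why PIC is reported alongside a confidence score when choosing SNPs for genotyping assays.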

  18. Trace metal depositional patterns from an open pit mining activity as revealed by archived avian gizzard contents.

    PubMed

    Bendell, L I

    2011-02-15

    Archived samples of blue grouse (Dendragapus obscurus) gizzard contents, inclusive of grit, collected yearly between 1959 and 1970 were analyzed for cadmium, lead, zinc, and copper content. Approximately halfway through the 12-year sampling period, an open-pit copper mine began activities, then ceased operations 2 years later. Thus the archived samples provided a unique opportunity to determine if avian gizzard contents, inclusive of grit, could reveal patterns in the anthropogenic deposition of trace metals associated with mining activities. Gizzard concentrations of cadmium and copper closely tracked the opening and closing of the pit mine. Gizzard zinc and lead demonstrated significant among-year variation; however, maximum concentrations did not correlate with mining activity. The archived gizzard contents did provide a useful tool for documenting trends in metal depositional patterns related to an anthropogenic activity. Further, blue grouse ingesting grit particles during the period of active mining would have been exposed to toxicologically significant levels of cadmium. Gizzard lead concentrations were also of toxicological significance, but were not related to mining activity. This type of "pulse" toxic metal exposure as a consequence of open-pit mining activity would not necessarily have been revealed through a "snap-shot" of soil, plant or avian tissue trace metal analysis post-mining. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools, or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to license, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into the risks associated with this approach.

  20. Open Source Hbim for Cultural Heritage: a Project Proposal

    NASA Astrophysics Data System (ADS)

    Diara, F.; Rinaudo, F.

    2018-05-01

    Current technologies are changing the ways Cultural Heritage is researched, analysed, conserved and developed, allowing new and innovative approaches. The possibility of integrating Cultural Heritage data, such as archaeological information, inside a three-dimensional environment system (like a Building Information Model) brings huge benefits for its management, monitoring and valorisation. Nowadays there are many commercial BIM solutions. However, these tools are designed and developed mostly for architectural design or technical installations. A better solution could be a dynamic and open platform that treats Cultural Heritage needs as a priority. Suitable solutions for better and more complete data usability and accessibility could be guaranteed by open source protocols. This choice would allow adapting the software to Cultural Heritage needs, and not the opposite, thus avoiding methodological stretches. This work focuses on the analysis of, and experimentation with, the specific characteristics of these kinds of open source software (DBMS, CAD, servers) applied to a Cultural Heritage example, in order to verify their flexibility and reliability and then create a dynamic HBIM open source prototype. Indeed, it might be a starting point for the future creation of a complete HBIM open source solution that could be adapted to other Cultural Heritage research and analysis.

  1. Open Source Clinical NLP - More than Any Single System.

    PubMed

    Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang

    2014-01-01

    The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.

  2. The Use of Open Source Software in the Global Land Ice Measurements From Space (GLIMS) Project, and the Relevance to Institutional Cooperation

    Treesearch

    Christopher W. Helm

    2006-01-01

    GLIMS is a NASA funded project that utilizes Open-Source Software to achieve its goal of creating a globally complete inventory of glaciers. The participation of many international institutions and the development of on-line mapping applications to provide access to glacial data have both been enhanced by Open-Source GIS capabilities and play a crucial role in the...

  3. Meteorological Error Budget Using Open Source Data

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7831 ● SEP 2016 ● US Army Research Laboratory. Meteorological Error Budget Using Open-Source Data, by J Cogan, J Smith, and P Haines.

  4. Open source bioimage informatics for cell biology

    PubMed Central

    Swedlow, Jason R.; Eliceiri, Kevin W.

    2009-01-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what make an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery. PMID:19833518

  5. bioGUID: resolving, discovering, and minting identifiers for biodiversity informatics

    PubMed Central

    Page, Roderic DM

    2009-01-01

    Background Linking together the data of interest to biodiversity researchers (including specimen records, images, taxonomic names, and DNA sequences) requires services that can mint, resolve, and discover globally unique identifiers (including, but not limited to, DOIs, HTTP URIs, and LSIDs). Results bioGUID implements a range of services, the core ones being an OpenURL resolver for bibliographic resources, and a LSID resolver. The LSID resolver supports Linked Data-friendly resolution using HTTP 303 redirects and content negotiation. Additional services include journal ISSN look-up, author name matching, and a tool to monitor the status of biodiversity data providers. Conclusion bioGUID is available at . Source code is available from . PMID:19900301
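
    The Linked Data-friendly resolution described above (HTTP 303 redirects plus content negotiation) hinges on choosing a representation from the client's Accept header. A toy sketch of that choice, with hypothetical route paths and q-values deliberately ignored:

```python
def negotiate(accept_header, available):
    """Pick the first representation from an HTTP Accept header that the
    server can provide. A toy version of the content negotiation an
    LSID/HTTP resolver performs; q-values are ignored (illustrative only)."""
    for item in accept_header.split(","):
        mime = item.split(";")[0].strip()
        if mime in available:
            return available[mime]
    # Fall back to any available representation if nothing matched.
    return next(iter(available.values()))

# Hypothetical 303 redirect targets for one identifier's metadata and page.
routes = {"application/rdf+xml": "/lsid/metadata.rdf",
          "text/html": "/lsid/page.html"}
print(negotiate("text/html,application/xhtml+xml", routes))  # /lsid/page.html
print(negotiate("application/rdf+xml;q=0.9", routes))        # /lsid/metadata.rdf
```

    In the real resolver, the chosen path would be returned to the client as the Location of an HTTP 303 See Other response rather than served directly.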

  6. Lunar rocks as a source of oxygen

    NASA Technical Reports Server (NTRS)

    Poole, H. G.

    1963-01-01

    A thermodynamic study of the thermal stability of conventional terrestrial minerals in a hypothetical lunar atmosphere has opened some interesting speculation. Much of the Earth's crust is composed of oxides of silicon, aluminum, magnesium, and related compounds. These crust components may be as much a product of the Earth's atmosphere as vegetation and animal life. Though inanimate and long considered imperishable, these materials are stable under conditions of an atmosphere equivalent to 34 ft of water at sea level and persist under adverse conditions of moisture and temperature to altitudes of roughly 29,000 ft above sea level. The oxygen content averages 21%, and the oxygen partial pressure would be roughly 1/5 atm.

  7. dendextend: an R package for visualizing, adjusting and comparing trees of hierarchical clustering

    PubMed Central

    2015-01-01

    Summary: dendextend is an R package for creating and comparing visually appealing tree diagrams. dendextend provides utility functions for manipulating dendrogram objects (their color, shape and content) as well as several advanced methods for comparing trees to one another (both statistically and visually). As such, dendextend offers a flexible framework for enhancing R's rich ecosystem of packages for performing hierarchical clustering of items. Availability and implementation: The dendextend R package (including detailed introductory vignettes) is available under the GPL-2 Open Source license and is freely available to download from CRAN at http://cran.r-project.org/package=dendextend. Contact: Tal.Galili@math.tau.ac.il PMID:26209431

  8. Fluorescently tuned nitrogen-doped carbon dots from carbon source with different content of carboxyl groups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hao; Wang, Yun; Dai, Xiao

    2015-08-01

    In this study, fluorescent nitrogen-doped carbon dots (NCDs) were tuned by varying carbon sources with different contents of carboxyl groups. Owing to the interaction between amino and carboxyl groups, more amino groups conjugate to the surface of NCDs produced from sources with more carboxyl groups. Correspondingly, the nitrogen content, fluorescence quantum yield and lifetime of the NCDs increase with the carboxyl-group content of the source. Furthermore, cytotoxicity assays and cell imaging tests indicate that the resultant NCDs possess low cytotoxicity and excellent biocompatibility.

  9. Free and Innovative Teaching Resources for STEM Educators

    NASA Astrophysics Data System (ADS)

    Weber, W. J.; McWhirter, J.; Dirks, D.

    2014-12-01

    The Unidata Program Center has implemented a teaching resource facility that allows educators to create, access, and share collections of resource material related to atmospheric, oceanic, and other earth system phenomena. While the facility can manage almost any type of electronic resource, it is designed with scientific data and products, teaching tools such as lesson plans and guided exercises, and tools for displaying data in mind. In addition to being very easy for educators and students to access, the facility makes it simple for other educators and scientists to contribute content related to their own areas of expertise to the collection. This allows existing teaching resources to grow in depth and breadth over time, enhancing their relevance and providing insights from multiple disciplines. Based on the open-source RAMADDA content/data management framework, the teaching resource facility provides a variety of built-in services to analyze and display data, as well as support for Unidata's rich 3D client, the Interactive Data Viewer (IDV).

  10. 48 CFR 1806.303-2 - Content.

    Code of Federal Regulations, 2010 CFR

    1997-10-01

    ... 48 Federal Acquisition Regulations System 6 1997-10-01 1997-10-01 false Content. 1806.303-2 Section 1806.303-2 COMPETITION AND ACQUISITION PLANNING COMPETITION REQUIREMENTS Other Than Full and Open Competition 1806.303-2 Content. ...

  11. The Use of Sahris as a State Sponsored Digital Heritage Repository and Management System in South Africa

    NASA Astrophysics Data System (ADS)

    Wiltshire, N. G.

    2013-07-01

    SAHRA has developed versions 1 and 2 of the South African Heritage Resources Information System (SAHRIS - http://www.sahra.org.za) in 2012 and 2013. The system has been rolled out since May 2012 to the national and provincial heritage authorities in South Africa in line with the National Heritage Resources Act (Act 25 of 1999). SAHRIS was developed using Drupal and Geoserver, both of which are free open source software packages. The three core functions of SAHRIS include: an online application system for developments that is integrated with a commenting module for public participation; a national sites archive of heritage sites; and a comprehensive collections management system for objects. With Geoserver, Openlayers and GMAP, users are provided with an online GIS platform that is integrated with most of the content types on SAHRIS. More than 21000 sites have already been migrated into SAHRIS along with over 4300 objects. The media and reports archive has already grown to 500 gigabytes, data storage is offered free of charge and so far 96 Terabytes of replicated storage have been installed. The distribution and dissemination of this content is facilitated by the adoption of The Creative Commons South Africa license. Lessons learnt from previous attempts to develop SAHRIS are covered briefly in light of the opportunities that have been opened up by the relatively recent maturation of FOSS content management systems. The current uptake of SAHRIS and the solutions to the challenges faced thus far are discussed before concluding with the implications for E-governance in South Africa.

  12. Numerical Simulation of Dispersion from Urban Greenhouse Gas Sources

    NASA Astrophysics Data System (ADS)

    Nottrott, Anders; Tan, Sze; He, Yonggang; Winkler, Renato

    2017-04-01

    Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source level dynamics, local measurements, and urban scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is open-source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low-cost computer-aided drawing and GIS, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model scalar emissions from various components of the natural gas distribution system, to study the impact of urban meteorology on mobile greenhouse gas measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of plumes, due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments. The Boussinesq approximation was applied to investigate the effects of canopy layer temperature gradients and convection on sensor footprints.
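
    For contrast with the building-resolving CFD described above, the textbook point-source model is the steady-state Gaussian plume; a minimal sketch (all parameter values invented, no building effects) shows the basic dilution behaviour that the urban simulations then complicate:

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) at crosswind
    offset y and height z (m), for a point source of strength q (g/s) at
    height h (m) in wind speed u (m/s). sigma_y and sigma_z are the
    lateral/vertical dispersion widths (m), which grow with downwind
    distance. Includes the standard ground-reflection term."""
    lateral = math.exp(-y * y / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-((z - h) ** 2) / (2.0 * sigma_z ** 2))
                + math.exp(-((z + h) ** 2) / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centreline concentrations at 2 m height for a 1 g/s ground-level leak,
# with dispersion widths typical of near vs. farther downwind positions:
near = gaussian_plume(1.0, 2.0, 0.0, 2.0, 0.0, sigma_y=5.0, sigma_z=3.0)
far = gaussian_plume(1.0, 2.0, 0.0, 2.0, 0.0, sigma_y=20.0, sigma_z=12.0)
print(near > far)  # the plume dilutes as it spreads downwind
```

    The abstract's point is precisely that this idealized picture breaks down in cities: building wakes can make two sources only 5-10 m apart disperse very differently, which is what motivates the CFD treatment.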

  13. Heating the Ice-Covered Lakes of the McMurdo Dry Valleys, Antarctica - Decadal Trends in Heat Content, Ice Thickness, and Heat Exchange

    NASA Astrophysics Data System (ADS)

    Gooseff, M. N.; Priscu, J. C.; Doran, P. T.; Chiuchiolo, A.; Obryk, M.

    2014-12-01

    Lakes integrate landscape processes and climate conditions. Most of the permanently ice-covered lakes in the McMurdo Dry Valleys, Antarctica are closed basin, receiving glacial melt water from streams for 10-12 weeks per year. Lake-level rises during the austral summer are balanced by sublimation of ice covers (year-round) and evaporation of open-water moats (summer only). Vertical profiles of water temperature have been measured in three lakes in Taylor Valley since 1988. Up to 2002, lake levels were dropping, ice covers were thickening, and total heat contents were decreasing. These lakes have been gaining heat since the mid-2000s, at rates as high as 19.5 × 10¹⁴ cal/decade. Since 2002, lake levels have risen substantially (as much as 2.5 m), and ice covers have thinned (1.5 m on average). Analyses of lake ice thickness, meteorological conditions, and stream water heat loads indicate that the main source of heat to these lakes is the latent heat released when ice covers form during the winter. An additional source of heat to the lakes is water inflow from streams and direct glacial melt. Mean lake temperatures in the past few years have stabilized or cooled, despite increases in lake level and total heat content, suggesting increased direct inflow of meltwater from glaciers. These results indicate that McMurdo Dry Valley lakes are sensitive indicators of climate processes in this polar desert landscape and demonstrate the importance of long-term data sets when addressing the effects of climate on ecosystem processes.
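
    The heat content trends discussed above reduce to a depth integral of temperature over the water column. A minimal sketch of that bookkeeping, with invented (not measured) temperature profiles:

```python
RHO_W = 1000.0   # water density, kg/m^3
CP_W = 4186.0    # specific heat of water, J/(kg K)

def heat_content(profile_degC, dz_m):
    """Depth-integrated heat content per unit lake area (J/m^2), relative
    to 0 degC, from a temperature profile sampled every dz_m metres."""
    return sum(RHO_W * CP_W * t * dz_m for t in profile_degC)

# Illustrative (not measured) profiles at 1 m spacing, surface to 5 m depth:
t_early = [0.5, 1.0, 2.0, 3.0, 3.5]
t_late = [0.8, 1.4, 2.5, 3.6, 4.0]
gain = heat_content(t_late, 1.0) - heat_content(t_early, 1.0)
print(f"heat gained: {gain:.3e} J/m^2")
```

    The monitoring program does this over the full water column and the whole lake area, which is how a warming of a few tenths of a degree per metre accumulates into the 10¹⁴ cal/decade figures quoted in the abstract.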

  14. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture, enabling us to abstract away from the underlying technology choices and focus on the interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was based in part on our use of Federal Geographic Data Committee (FGDC) Metadata, which is expressed in XML, and on the desire to keep it in its native form and exploit other technologies built on top of XML. Apache Solr, an open source search engine, was used to drive our search interface and as a way to store references to metadata and data exposed via REST endpoints. As was the case with Apache OODT, team experience with this component helped drive the choice. Lastly, OpenSSO, an open source single sign-on service, was used to secure and provide access constraints to our REST-based services. For this product there was little prior experience, but given our service-based approach it seemed a natural fit. Given our exposure to open source, we will discuss the tradeoffs and benefits received from the choices made. Moreover, we will dive into the context of how the software packages were used and the impact their design and extensibility had on the construction of the infrastructure. Finally, we will compare our experiences across the open source solutions and the attributes that shape the impression one gets. This comprehensive account of our endeavor should aid others in their assessment and use of open source.

  15. Building CHAOS: An Operating System for Livermore Linux Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garlick, J E; Dunlap, C M

    2003-02-21

    The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some are commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.

  16. Towards an open, collaborative, reusable framework for sharing hands-on bioinformatics training workshops

    PubMed Central

    Revote, Jerico; Suchecki, Radosław; Tyagi, Sonika; Corley, Susan M.; Shang, Catherine A.; McGrath, Annette

    2017-01-01

    Abstract There is a clear demand for hands-on bioinformatics training. The development of bioinformatics workshop content is both time-consuming and expensive. Therefore, enabling trainers to develop bioinformatics workshops in a way that facilitates reuse is becoming increasingly important. The most widespread practice for sharing workshop content is to make PDF, PowerPoint and Word documents available online. While this effort is to be commended, such content is usually not easy to reuse or repurpose and does not capture all the information required for a third party to rerun a workshop. We present an open, collaborative framework for developing and maintaining reusable and shareable hands-on training workshop content. PMID:26984618

  17. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  18. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  19. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  20. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  1. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  2. Toward Emotionally Accessible Massive Open Online Courses (MOOCs).

    PubMed

    Hillaire, Garron; Iniesto, Francisco; Rienties, Bart

    2017-01-01

    This paper outlines an approach to evaluating the emotional content of three Massive Open Online Courses (MOOCs) using the affective computing approach of prosody detection on two different text-to-speech voices, in conjunction with human raters judging the emotional content of course text. The intent of this work is to establish the potential variation in the emotional delivery of MOOC material through synthetic voices.

  3. Trends and Patterns in Massive Open Online Courses: Review and Content Analysis of Research on MOOCs (2008-2015)

    ERIC Educational Resources Information Center

    Bozkurt, Aras; Akgün-Özbek, Ela; Zawacki-Richter, Olaf

    2017-01-01

    To fully understand the phenomenon of massive open online courses (MOOCs), it is important to identify and map trends and patterns in research on MOOCs. This study does so by reviewing 362 empirical articles published in peer-reviewed journals from 2008 to 2015. For the purpose of this study, content analysis and discourse analysis were employed…

  4. Sharing Lessons-Learned on Effective Open Data, Open-Source Practices from OpenAQ, a Global Open Air Quality Community.

    NASA Astrophysics Data System (ADS)

    Hasenkopf, C. A.

    2017-12-01

    Increasingly, open data, open-source projects are unearthing rich datasets and tools, previously impossible for more traditional avenues to generate. These projects are possible, in part, because of the emergence of online collaborative and code-sharing tools, decreasing costs of cloud-based services to fetch, store, and serve data, and increasing interest of individuals to contribute their time and skills to 'open projects.' While such projects have generated palpable enthusiasm from many sectors, many of these projects face uncharted paths for sustainability, visibility, and acceptance. Our project, OpenAQ, is an example of an open-source, open data community that is currently forging its own uncharted path. OpenAQ is an open air quality data platform that aggregates and universally formats government and research-grade air quality data from 50 countries across the world. To date, we make available more than 76 million air quality (PM2.5, PM10, SO2, NO2, O3, CO and black carbon) data points through an open Application Programming Interface (API) and a user-customizable download interface at https://openaq.org. The goal of the platform is to enable an ecosystem of users to advance air pollution efforts from science to policy to the private sector. The platform is also an open-source project (https://github.com/openaq) and has only been made possible through the coding and data contributions of individuals around the world. In our first two years of existence, we have seen requests for data to our API skyrocket to more than 6 million datapoints per month, and use-cases as varied as ingesting data aggregated from our system into real-time models of wildfires to building open-source statistical packages (e.g. ropenaq and py-openaq) on top of the platform to creating public-friendly apps and chatbots. 
We will share a whirl-wind trip through our evolution and the many lessons learned so far related to platform structure, community engagement, organizational model type and sustainability.
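    As a sketch of how such an open API is typically consumed, the snippet below builds a measurements query and extracts values from an API-style JSON payload. The endpoint path and parameter names are assumptions for illustration, not taken from OpenAQ's documentation:

```python
import json
from urllib.parse import urlencode

# Assumed v1-style endpoint for illustration only
BASE = "https://api.openaq.org/v1/measurements"

def build_query(city, parameter="pm25", limit=100):
    """Build a measurements query URL (parameter names are assumed)."""
    return BASE + "?" + urlencode({"city": city, "parameter": parameter, "limit": limit})

def extract_values(payload):
    """Pull (location, value, unit) tuples out of an API-style JSON payload."""
    return [(m["location"], m["value"], m["unit"]) for m in payload.get("results", [])]

# Offline demonstration with a payload shaped like a typical API response:
sample = json.loads('{"results": [{"location": "Hanoi station", "value": 54.0, "unit": "µg/m³"}]}')
print(build_query("Hanoi"))
print(extract_values(sample))
```

    A real client would fetch the URL over HTTP and paginate; the parsing step is the same either way.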

  5. Comprehensive Routing Security Development and Deployment for the Internet

    DTIC Science & Technology

    2015-02-01

    feature enhancement and bug fixes. • MySQL : MySQL is a widely used and popular open source database package. It was chosen for database support in the...RPSTIR depends on several other open source packages. • MySQL : MySQL is used for the local RPKI database cache. • OpenSSL: OpenSSL is used for...cryptographic libraries for X.509 certificates. • ODBC mySql Connector: ODBC (Open Database Connectivity) is a standard programming interface (API) for

  6. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. 
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  7. Something for Everyone? The Different Approaches of Academic Disciplines to Open Educational Resources and the Effect on Widening Participation

    ERIC Educational Resources Information Center

    Coughlan, Tony; Perryman, Leigh-Anne

    2011-01-01

    This article explores the relationship between academic disciplines' representation in the United Kingdom Open University's (OU) OpenLearn open educational resources (OER) repository and in the OU's fee-paying curriculum. Becher's (1989) typology was used to subdivide the OpenLearn and OU fee-paying curriculum content into four disciplinary…

  8. Constraining the presence and abundance of an excess gas phase prior to the June 1991 eruption of Mt Pinatubo (Philippines) using S-isotopes

    NASA Astrophysics Data System (ADS)

    Bouvet de Maisonneuve, C.; Fiege, A.; Fabbro, G.; Kubo, A. I.

    2016-12-01

    Large explosive eruptions typically release orders of magnitude more S to the atmosphere than expected based on degassing of the erupted magma. To explain this, an excess, accumulated vapor phase is often proposed. Resolving the presence, composition, and source of such an exsolved volatile phase is essential, as such a phase can drive eruptions towards increased explosivity. Integration of melt inclusion (MI) volatile contents (H, C, S, Cl, F) with S isotope data on melt inclusions and sulfur-bearing minerals (anhydrite) can provide information on pre- and syn-eruptive degassing. The June 1991 eruption of Mt Pinatubo is an ideal candidate for such a study, as it injected >17 Mt of SO2 into the stratosphere, corresponding to an excess S release by a factor of close to 100. The erupted magma was oxidized (QFM+3) and should therefore yield a clear isotopic trend. Volatile contents in glassy but vesicular quartz-hosted MIs were measured by SIMS and yield <3 wt% H2O and <100 ppm S but up to 1500 ppm CO2, in agreement with previous measurements. The MIs with few but large vapor bubbles (avoided during analysis) have lower H2O and CO2 contents and smaller standard deviations. The MIs with many small bubbles have higher volatile contents and standard deviations because the gas phase was not avoided during analysis. We observed scattered S contents and highly variable S isotope compositions for all MIs, which could be due to the presence of submicron S phases. Thus, we homogenized a batch of MIs under P-T-fO2 conditions that best correspond to pre-eruptive conditions. The δ34S for quartz-hosted MIs ranges from -1 to +14 ‰, and δ34S vs. S-H-C content trends are used to infer open or closed system degassing processes. In the near future, anhydrites and melt inclusions in other mineral hosts (amphibole and plagioclase) will be investigated in order to reconstruct the degassing history of the 1991 Pinatubo magma and to trace the S source.

  9. GIS-Based Noise Simulation Open Source Software: N-GNOIS

    NASA Astrophysics Data System (ADS)

    Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh

    2015-12-01

    Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate noise scenarios due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These are addressed in the software through attenuation modules for atmosphere, vegetation and barriers. N-GNOIS is a user-friendly, platform-independent and Open Geospatial Consortium (OGC)-compliant software package. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features such as the cumulative impact of point and mobile sources, building structures, and honking due to traffic. Honking is a common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise, and acts as a decision-making tool for planning and management of the noise component in environmental impact assessment (EIA) studies.
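    The geometric-spreading component that such attenuation modules build on can be sketched with the standard point-source distance law (generic acoustics, not N-GNOIS code):

```python
import math

def spl_at_distance(spl_ref, r_ref, r):
    """Geometric spreading for a point source: sound pressure level drops
    6 dB per doubling of distance from a reference measurement."""
    return spl_ref - 20.0 * math.log10(r / r_ref)

# 80 dB measured 10 m from a point source, predicted at 40 m (two doublings):
print(round(spl_at_distance(80.0, 10.0, 40.0), 1))  # → 68.0
```

    A full model like the one described would subtract further terms for atmospheric absorption, vegetation, and barrier insertion loss from this free-field level.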

  10. Utilization of open source electronic health record around the world: A systematic review

    PubMed Central

    Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam

    2014-01-01

    Many projects on developing Electronic Health Record (EHR) systems have been carried out in many countries. The current study was conducted to review the published data on the utilization of open source EHR systems in different countries all over the world. Using free text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in a series of stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. This creates opportunities to improve national healthcare, especially in developing countries with minimal financial resources. Open source technology is a solution to overcome the problems of high costs and inflexibility associated with proprietary health information systems. PMID:24672566

  11. Bioclipse: an open source workbench for chemo- and bioinformatics.

    PubMed

    Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S

    2007-02-22

    There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is totally open for both open source plugins as well as commercial ones. Bioclipse is freely available at http://www.bioclipse.net.

  12. A Platform for Innovation and Standards Evaluation: a Case Study from the OpenMRS Open-Source Radiology Information System.

    PubMed

    Gichoya, Judy W; Kohli, Marc; Ivange, Larry; Schmidt, Teri S; Purkayastha, Saptarshi

    2018-05-10

    Open-source development can provide a platform for innovation by seeking feedback from community members as well as providing tools and infrastructure to test new standards. Vendors of proprietary systems may delay adoption of new standards until there are sufficient incentives such as legal mandates or financial incentives to encourage/mandate adoption. Moreover, open-source systems in healthcare have been widely adopted in low- and middle-income countries and can be used to bridge gaps that exist in global health radiology. Since 2011, the authors, along with a community of open-source contributors, have worked on developing an open-source radiology information system (RIS) across two communities: OpenMRS and LibreHealth. The main purpose of the RIS is to implement core radiology workflows, on which others can build and test new radiology standards. This work has resulted in three major releases of the system, with current architectural changes driven by changing technology, development of new standards in health and imaging informatics, and changing user needs. At their core, both these communities are focused on building general-purpose EHR systems, but based on user contributions from the fringes, we have been able to create an innovative system that has been used by hospitals and clinics in four different countries. We provide an overview of the history of the LibreHealth RIS, the architecture of the system, an overview of standards integration, the challenges of developing an open-source product, and future directions. Our goal is to attract more participation and involvement to further develop the LibreHealth RIS into an Enterprise Imaging System that can be used in other clinical imaging domains, including pathology and dermatology.

  13. Defending the Amazon: Conservation, Development and Security in Brazil

    DTIC Science & Technology

    2009-03-01

    against drugs is not 191 Nelson Jobim, interview by Empresa Brasil de Comunicação Radio, trans. Open Source Center, February 6, 2009, available from... Empresa Brasil de Comunicação Radio, trans. Open Source Center, February 6, 2009, available from http://www.ebc.com.br (accessed February 23, 2009...Institute of Peace, 1996. Jobim, Nelson. Interview by Empresa Brasil de Comunicação Radio. Translated by Open Source Center. February 6, 2009

  14. Open-Source web-based geographical information system for health exposure assessment

    PubMed Central

    2012-01-01

    This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that by using solely Open Source software it was possible to develop a customisable web based GIS application that provides functions necessary to convey health and environmental data to experts and non-experts alike without the requirement of proprietary software. PMID:22233606

  15. Open source 3D printers: an appropriate technology for building low cost optics labs for the developing communities

    NASA Astrophysics Data System (ADS)

    Gwamuri, J.; Pearce, Joshua M.

    2017-08-01

    The recent introduction of RepRap (self-replicating rapid prototyper) 3-D printers and the resultant open source technological improvements have resulted in affordable 3-D printing, enabling low-cost distributed manufacturing for individuals. This development and others, such as the rise of open source-appropriate technology (OSAT) and solar powered 3-D printing, are moving 3-D printing from an industry-based technology to one that could be used in the developing world for sustainable development. In this paper, we explore some specific technological improvements and how distributed manufacturing with open-source 3-D printing can be used to provide open-source 3-D printable optics components for developing world communities through the ability to print less expensive and customized products. This paper presents an open-source low-cost optical equipment library which enables relatively easily adapted customizable designs, with the potential of changing the way optics is taught in resource-constrained communities. The study shows that this method of scientific hardware development has the potential to enable a much broader audience to participate in optical experimentation, both on research and teaching platforms. Conclusions on the technical viability of 3-D printing to assist in development, and recommendations on how developing communities can fully exploit this technology to improve the learning of optics through hands-on methods, are outlined.

  16. Experimental assessment of theory for refraction of sound by a shear layer

    NASA Technical Reports Server (NTRS)

    Schlinker, R. H.; Amiet, R. K.

    1978-01-01

    The refraction angle and amplitude changes associated with sound transmission through a circular, open-jet shear layer were studied in a 0.91 m diameter open jet acoustic research tunnel. Free stream Mach number was varied from 0.1 to 0.4. Good agreement between refraction angle correction theory and experiment was obtained over the test Mach number, frequency and angle measurement range for all on-axis acoustic source locations. For off-axis source positions, good agreement was obtained at a source-to-shear layer separation distance greater than the jet radius. Measurable differences between theory and experiment occurred at a source-to-shear layer separation distance less than one jet radius. A shear layer turbulence scattering experiment was conducted at 90 deg to the open jet axis for the same free stream Mach numbers and axial source locations used in the refraction study. Significant discrete tone spectrum broadening and tone amplitude changes were observed at open jet Mach numbers above 0.2 and at acoustic source frequencies greater than 5 kHz. More severe turbulence scattering was observed for downstream source locations.

  17. Open Distribution of Virtual Containers as a Key Framework for Open Educational Resources and STEAM Subjects

    ERIC Educational Resources Information Center

    Corbi, Alberto; Burgos, Daniel

    2017-01-01

    This paper presents how virtual containers enhance the implementation of STEAM (science, technology, engineering, arts, and math) subjects as Open Educational Resources (OER). The publication initially summarizes the limitations of delivering open rich learning contents and corresponding assignments to students in college level STEAM areas. The…

  18. The Early Development of the Open University: Report of the Vice-Chancellor January 1969-December 1970.

    ERIC Educational Resources Information Center

    Open Univ., Walton, Bletchley, Bucks (England).

    This report concerns the establishment and development of the British Open University. Contents include descriptions of: the development of the institution; staffing the open university; development of the Milton Keynes Campus; undergraduate course development; regional organization; demand for open university courses; development, production,…

  19. The OpenCourseWare Model: High-Impact Open Educational Content

    ERIC Educational Resources Information Center

    Carson, Stephen

    2007-01-01

    OpenCourseWare (OCW) is one among several models for offering open educational resources (OER). This article explains the OCW model and its position within the broader OER context. OCW primarily represents publication of existing course materials already in use for teaching purposes. OCW projects are most often institutional, carrying the…

  20. Supporting Access to Open Online Courses for Learners of Developing Countries

    ERIC Educational Resources Information Center

    Nti, Kwame

    2015-01-01

    This paper examines how access to, and use of, open online courses may be enhanced for learners of developing countries from a learner perspective. Using analysis of the open education concept, factors that affect access to open educational resources content, and universal standards for delivering online learning, the author demonstrates that the…

  1. From Learning in Coffee Houses to Learning with Open Educational Resources

    ERIC Educational Resources Information Center

    Peter, Sandra; Farrell, Lesley

    2013-01-01

    What is "open" about Open Educational Resources? How does education become "open" when it is removed from the institutional housing of the school or the university and develops in public social settings? Has the Internet, in providing educational content without cost and free of copyright restrictions, provoked a unique and…

  2. An Open Source Model for Open Access Journal Publication

    PubMed Central

    Blesius, Carl R.; Williams, Michael A.; Holzbach, Ana; Huntley, Arthur C.; Chueh, Henry

    2005-01-01

    We describe an electronic journal publication infrastructure that allows a flexible publication workflow, academic exchange around different forms of user submissions, and the exchange of articles between publishers and archives using a common XML based standard. This web-based application is implemented on a freely available open source software stack. This publication demonstrates the Dermatology Online Journal's use of the platform for non-biased independent open access publication. PMID:16779183

  3. [GNU Pattern: open source pattern hunter for biological sequences based on SPLASH algorithm].

    PubMed

    Xu, Ying; Li, Yi-xue; Kong, Xiang-yin

    2005-06-01

    To construct a high-performance open source software engine based on the IBM SPLASH algorithm for later research on pattern discovery. Gpat, based on the SPLASH algorithm, was developed using open source software. The resulting GNU Pattern (Gpat) software efficiently implements the core part of the SPLASH algorithm. The full source code of Gpat is also available for other researchers to modify the program under the GNU license. Gpat is a successful implementation of the SPLASH algorithm and can be used as a basic framework for later research on pattern recognition in biological sequences.

  4. Passive rejection of heat from an isotope heat source through an open door

    NASA Technical Reports Server (NTRS)

    Burns, R. K.

    1971-01-01

    The isotope heat-source design for a Brayton power system includes a door in the thermal insulation through which the heat can be passively rejected to space when the power system is not operating. The results of an analysis to predict the heat-source surface temperature and the heat-source heat-exchanger temperature during passive heat rejection as a function of insulation door opening angle are presented. They show that for a door opening angle greater than 20 deg, the temperatures are less than the steady-state temperatures during power system operation.
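    The passive rejection described above is radiative; a rough sizing of such heat rejection to space can be sketched with the Stefan-Boltzmann law. The area, temperature, emissivity and view factor below are illustrative, not values from the report:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_power(area_m2, temp_k, emissivity=0.9, view_factor=1.0):
    """Gray-body radiated power to cold space. The view factor stands in for
    the effect of the door opening angle: a wider opening sees more of space."""
    return emissivity * view_factor * SIGMA * area_m2 * temp_k ** 4

# Illustrative numbers: 0.5 m^2 of radiating surface at 900 K
print(round(radiated_power(0.5, 900.0), 1))
```

    Because the radiated power scales with T^4, the surface settles at the temperature where this rejection balances the isotope's decay heat, which is why the report finds temperatures nearly independent of door angle beyond 20 deg.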

  5. DUAL HEATED ION SOURCE STRUCTURE HAVING ARC SHIFTING MEANS

    DOEpatents

    Lawrence, E.O.

    1959-04-14

    An ion source is presented for calutrons, particularly an electrode arrangement for the ion generator of a calutron ion source. The ion source arc chamber is heated and an exit opening with thermally conductive plates defines the margins of the opening. These plates are electrically insulated from the body of the ion source and are connected to a suitable source of voltage to serve as electrodes for shaping the ion beam egressing from the arc chamber.

  6. Open-Source Syringe Pump Library

    PubMed Central

    Wijnen, Bas; Hunt, Emily J.; Anzalone, Gerald C.; Pearce, Joshua M.

    2014-01-01

    This article explores a new open-source method for developing and manufacturing high-quality scientific equipment suitable for use in virtually any laboratory. A syringe pump was designed using freely available open-source computer aided design (CAD) software and manufactured using an open-source RepRap 3-D printer and readily available parts. The design, bill of materials and assembly instructions are globally available to anyone wishing to use them. Details are provided covering the use of the CAD software and the RepRap 3-D printer. The use of an open-source Raspberry Pi computer as a wireless control device is also illustrated. Performance of the syringe pump was assessed and the methods used for assessment are detailed. The cost of the entire system, including the controller and web-based control interface, is on the order of 5% or less than one would expect to pay for a commercial syringe pump having similar performance. The design should suit the needs of a given research activity requiring a syringe pump, including carefully controlled dosing of reagents, pharmaceuticals, and delivery of viscous 3-D printer media, among other applications. PMID:25229451
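    The dosing accuracy of such a pump comes from converting a target flow rate into stepper motor pulses; a sketch of that conversion follows. The drive geometry values (leadscrew lead, steps per revolution, microstepping) are assumed for illustration, not taken from the published design:

```python
import math

def steps_per_second(flow_ml_min, syringe_id_mm, steps_per_rev=200, lead_mm=8.0, microsteps=16):
    """Convert a target flow rate into a stepper pulse rate.

    flow_ml_min   : desired flow rate in mL/min
    syringe_id_mm : syringe inner diameter in mm
    The drive parameters are illustrative defaults, not the paper's values.
    """
    area_mm2 = math.pi * (syringe_id_mm / 2.0) ** 2          # plunger cross-section
    plunger_mm_s = (flow_ml_min * 1000.0 / 60.0) / area_mm2  # mL/min -> mm^3/s -> mm/s
    steps_per_mm = steps_per_rev * microsteps / lead_mm      # linear resolution of the drive
    return plunger_mm_s * steps_per_mm

# 1 mL/min through a syringe with a 14.5 mm bore:
print(round(steps_per_second(1.0, 14.5), 2))
```

    Finer microstepping or a smaller leadscrew lead raises the pulse rate for the same flow, which is the usual trade-off between resolution and maximum speed in these designs.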

  7. The Role of Free/Libre and Open Source Software in Learning Health Systems.

    PubMed

    Paton, C; Karopka, T

    2017-08-01

    Objective: To give an overview of the role of Free/Libre and Open Source Software (FLOSS) in the context of secondary use of patient data to enable Learning Health Systems (LHSs). Methods: We conducted an environmental scan of the academic and grey literature utilising the MedFLOSS database of open source systems in healthcare to inform a discussion of the role of open source in developing LHSs that reuse patient data for research and quality improvement. Results: A wide range of FLOSS is identified that contributes to the information technology (IT) infrastructure of LHSs including operating systems, databases, frameworks, interoperability software, and mobile and web apps. The recent literature around the development and use of key clinical data management tools is also reviewed. Conclusions: FLOSS already plays a critical role in modern health IT infrastructure for the collection, storage, and analysis of patient data. The nature of FLOSS systems to be collaborative, modular, and modifiable may make open source approaches appropriate for building the digital infrastructure for a LHS.

  8. OpenLMD, multimodal monitoring and control of LMD processing

    NASA Astrophysics Data System (ADS)

    Rodríguez-Araújo, Jorge; García-Díaz, Antón

    2017-02-01

    This paper presents OpenLMD, a novel open-source solution for on-line multimodal monitoring of Laser Metal Deposition (LMD). The solution is also applicable to a wider range of laser-based applications that require on-line control (e.g. laser welding). OpenLMD is a middleware that enables the orchestration and virtualization of a LMD robot cell, using several open-source frameworks (e.g. ROS, OpenCV, PCL). The solution also allows reconfiguration by easy integration of multiple sensors and processing equipment. As a result, OpenLMD delivers significant advantages over existing monitoring and control approaches, such as improved scalability, and multimodal monitoring and data sharing capabilities.

  9. "Do-It-Yourself" reliable pH-stat device by using open-source software, inexpensive hardware and available laboratory equipment

    PubMed Central

    Kragic, Rastislav; Kostic, Mirjana

    2018-01-01

    In this paper, we present the construction of a reliable and inexpensive pH stat device, by using open-source “OpenPhControl” software, inexpensive hardware (a peristaltic and a syringe pump, Arduino, a step motor…), readily available laboratory devices: a pH meter, a computer, a webcam, and some 3D printed parts. We provide a methodology for the design, development and test results of each part of the device, as well as of the entire system. In addition to dosing reagents by means of a low-cost peristaltic pump, we also present carefully controlled dosing of reagents by an open-source syringe pump. The upgrading of the basic open-source syringe pump is given in terms of pump control and application of a larger syringe. In addition to the basic functions of pH stat, i.e. pH value measurement and maintenance, an improvement allowing the device to be used for potentiometric titration has been made as well. We have demonstrated the device’s utility when applied for cellulose fibers oxidation with 2,2,6,6-tetramethylpiperidine-1-oxyl radical, i.e. for TEMPO-mediated oxidation. In support of this, we present the results obtained for the oxidation kinetics, the consumption of added reagent and experimental repeatability. Considering that the open-source scientific tools are available to everyone, and that researchers can construct and adjust the device according to their needs, as well as that the total cost of the open-source pH stat device, excluding the existing laboratory equipment (pH meter, computer and glassware), was less than 150 EUR, we believe that, at a small fraction of the cost of available commercial offers, our open-source pH stat can significantly improve experimental work where the use of pH stat is necessary. PMID:29509793
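    The pH-maintenance loop at the heart of such a device can be sketched as a simple on-off controller with a dead band. This is a generic sketch; the setpoint, dead band, dose size and the toy reaction model below are illustrative, not the paper's firmware:

```python
def ph_stat_step(ph, setpoint, dead_band=0.05, dose_ml=0.01):
    """One control cycle of an on-off pH stat: dose base only when pH has
    fallen below the setpoint by more than the dead band (as in TEMPO-mediated
    oxidation, where the reaction continuously releases acid)."""
    return dose_ml if ph < setpoint - dead_band else 0.0

# Toy simulation: the reaction drops pH by 0.03 per cycle;
# each 0.01 mL dose of base raises pH by 0.2 units in this toy model.
ph, setpoint, total_dosed = 10.0, 10.0, 0.0
for _ in range(20):
    ph -= 0.03                     # acid generated by the oxidation
    dose = ph_stat_step(ph, setpoint)
    total_dosed += dose
    ph += 20.0 * dose              # base neutralizes the acid
print(round(ph, 2), round(total_dosed, 3))
```

    Logging `total_dosed` against time is exactly how such a device reports reagent consumption and, from it, the oxidation kinetics.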

  10. Free and Open Source Software for Geospatial in the field of planetary science

    NASA Astrophysics Data System (ADS)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of OpenData and data from collaborative mapping projects has increased interest in tools, procedures and methods to handle spatially-related information. Free Open Source Software projects devoted to geospatial data handling are enjoying considerable success, as the use of interoperable formats and protocols allows the user to choose what pipeline of tools and libraries is needed to solve a particular task, adapting the software scene to his specific problem. In particular, the Free Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are distributed along with their source code, and the interaction between user and developer is often very close, creating a continuum between these two figures. A very widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free Open Source GIS, open GIS formats and network protocols allow existing tools and methods developed to solve Earth-based problems to be extended to the study of solar system bodies. 
A day in the working life of a researcher using Free Open Source Software for geospatial will be presented, as well as benefits and solutions to possible detriments coming from the effort required by using, supporting and contributing.
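    Part of what makes it feasible for libraries such as GDAL to add planetary format support is that PDS3 products carry plain-text labels of KEY = VALUE pairs. Purely as an illustration (this is not GDAL's implementation, and the label fragment is hypothetical), a minimal sketch of reading such a label in Python might look like:

```python
def parse_pds3_label(text):
    """Parse a minimal PDS3-style label (KEY = VALUE lines) into a dict.

    Illustrative sketch only, not a full PDS3 parser: it ignores OBJECT
    groups, units, multi-line values and quoting subtleties.
    """
    label = {}
    for line in text.splitlines():
        line = line.strip()
        if line == "END":            # PDS3 labels are terminated by END
            break
        if "=" not in line or line.startswith("/*"):
            continue                 # skip comments and non-assignments
        key, _, value = line.partition("=")
        label[key.strip()] = value.strip().strip('"')
    return label

# Hypothetical example label fragment
sample = """\
PDS_VERSION_ID = PDS3
RECORD_TYPE    = FIXED_LENGTH
LINES          = 1024
LINE_SAMPLES   = 2048
END
IGNORED = 1
"""
print(parse_pds3_label(sample))   # entries after END are not parsed
```

    A real reader would then use LINES and LINE_SAMPLES (plus the sample type) to interpret the binary image data that follows the label.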

  11. "Do-It-Yourself" reliable pH-stat device by using open-source software, inexpensive hardware and available laboratory equipment.

    PubMed

    Milanovic, Jovana Z; Milanovic, Predrag; Kragic, Rastislav; Kostic, Mirjana

    2018-01-01

    In this paper, we present the construction of a reliable and inexpensive pH-stat device using the open-source "OpenPhControl" software, inexpensive hardware (a peristaltic and a syringe pump, an Arduino, a stepper motor…) and readily available laboratory equipment: a pH meter, a computer, a webcam, and some 3D-printed parts. We provide the design methodology, development details and test results for each part of the device, as well as for the entire system. In addition to dosing reagents by means of a low-cost peristaltic pump, we also present carefully controlled dosing of reagents with an open-source syringe pump. The upgrading of the basic open-source syringe pump is described in terms of pump control and support for a larger syringe. Beyond the basic functions of a pH stat, i.e. measuring and maintaining the pH value, the device has also been extended for use in potentiometric titration. We demonstrate the device's utility by applying it to the oxidation of cellulose fibers with the 2,2,6,6-tetramethylpiperidine-1-oxyl radical, i.e. TEMPO-mediated oxidation. In support of this, we present the results obtained for the oxidation kinetics, the consumption of added reagent and experimental repeatability. Considering that open-source scientific tools are available to everyone, that researchers can build and adjust the device according to their needs, and that the total cost of the open-source pH-stat device, excluding existing laboratory equipment (pH meter, computer and glassware), was less than 150 EUR, we believe that, at a small fraction of the cost of available commercial offerings, our open-source pH stat can significantly improve experimental work wherever a pH stat is necessary.
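    The core of any pH stat is a feedback loop: read the pH, and dose titrant whenever the reading drifts outside a deadband around the setpoint. The following pure-Python toy simulation sketches that on/off dosing logic; all names, parameters and numbers are illustrative assumptions, not OpenPhControl's actual API or control scheme:

```python
def ph_stat_step(ph_reading, setpoint, deadband, dose_volume):
    """One control step of a simple on/off pH stat.

    Returns the volume of base (mL) to dose this cycle: dose only when
    the measured pH falls below the setpoint minus a deadband, which
    avoids chattering around the target value.
    """
    if ph_reading < setpoint - deadband:
        return dose_volume       # pump a fixed aliquot of reagent
    return 0.0                   # within band: do nothing

# Toy simulation: an acid-producing reaction (e.g. TEMPO-mediated
# oxidation) drifts the pH down each cycle, and each dosed aliquot of
# base raises it back. The factor 40 is a made-up pH rise per mL.
ph, setpoint = 7.0, 7.0
for _ in range(50):
    ph -= 0.05                   # reaction acidifies the medium
    ph += 40 * ph_stat_step(ph, setpoint, deadband=0.1, dose_volume=0.005)
print(round(ph, 2))              # pH stays near the setpoint
```

    In the real device, the pH reading would come from the pH meter and the dose would be executed by the peristaltic or syringe pump; logging the cumulative dosed volume over time is what yields the reagent-consumption curves reported in the paper.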

  12. Open Source Drug Discovery in Practice: A Case Study

    PubMed Central

    Årdal, Christine; Røttingen, John-Arne

    2012-01-01

    Background Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionately affect the poor. The concept borrows two principal aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to better understand how a much-discussed theoretical concept works in practice and how it may be implemented. Methodology/Principal Findings A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases: the CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews with participating members (n = 14), a survey of potential members (n = 61), an analysis of the projects' websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions mostly from Indian researchers and students. Its processes are formal, with each task reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Conclusions/Significance Technically, TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project.
    However, both have enabled high-quality research at low cost. The critical success factors appear to be clearly defined entry points, transparency and funding to cover core material costs. PMID:23029588

  13. 48 CFR 1506.303-2 - Content.

    Code of Federal Regulations, 2010 CFR

    2008-10-01

    ... 48 Federal Acquisition Regulations System 6 2008-10-01 2008-10-01 false Content. 1506.303-2... PLANNING COMPETITION REQUIREMENTS Other Than Full and Open Competition 1506.303-2 Content. The... incorporate the evaluation of responses to the synopsis in the JOFOC. (See 1506.371(d) for contents of the...

  14. 48 CFR 1506.303-2 - Content.

    Code of Federal Regulations, 2010 CFR

    2017-10-01

    ... 48 Federal Acquisition Regulations System 6 2017-10-01 2017-10-01 false Content. 1506.303-2... PLANNING COMPETITION REQUIREMENTS Other Than Full and Open Competition 1506.303-2 Content. The... incorporate the evaluation of responses to the synopsis in the JOFOC. (See 1506.371(d) for contents of the...

  15. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by a high administrative burden, data management pitfalls and high costs. This reduces the enthusiasm and commitment of the physicians involved and thus creates a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support multicentre clinical trials. Using a design science research approach, we developed a web-based, multicentre clinical trial management system on Drupal, an open source software distributed under the terms of the General Public License. The system was evaluated by user testing, has supported several completed and ongoing clinical trials well, and is available for free download. Open source clinical trial management systems are capable of supporting multicentre clinical trials by enhancing efficiency, the quality of data management and collaboration.

  16. Mapping DICOM to OpenDocument format

    NASA Astrophysics Data System (ADS)

    Yu, Cong; Yao, Zhihong

    2009-02-01

    In order to enhance the readability, extensibility and sharing of DICOM files, we previously introduced XML into the DICOM file system (SPIE Volume 5748)[1] and a multilayer tree structure into DICOM (SPIE Volume 6145)[2]. In this paper, we propose mapping DICOM to ODF (OpenDocument Format), since ODF is also based on XML. As a result, the new format realizes the separation of content (including text and images) from display style. Meanwhile, since OpenDocument files take the form of a ZIP-compressed archive, the new kind of DICOM file can benefit from ZIP's lossless compression to reduce file size. Moreover, this open format guarantees long-term access to data without legal or technical barriers, making medical images accessible to various fields.
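    The ZIP packaging the abstract relies on is easy to see from code: an OpenDocument file is a ZIP archive whose first entry is an uncompressed "mimetype" stream, alongside XML parts such as content.xml and styles.xml. The sketch below builds a skeletal ODF-style package in memory with Python's standard-library zipfile module; the XML bodies are placeholders for illustration, not a complete, valid ODF document:

```python
import io
import zipfile

# Build a minimal OpenDocument-style package in memory. Per ODF
# convention, "mimetype" is stored first and uncompressed (ZIP_STORED),
# while the XML parts are deflate-compressed (the lossless compression
# the abstract refers to).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as odf:
    odf.writestr(zipfile.ZipInfo("mimetype"),
                 "application/vnd.oasis.opendocument.text",
                 compress_type=zipfile.ZIP_STORED)
    odf.writestr("content.xml", "<office:document-content/>",
                 compress_type=zipfile.ZIP_DEFLATED)
    odf.writestr("styles.xml", "<office:document-styles/>",
                 compress_type=zipfile.ZIP_DEFLATED)

# Reading it back shows the content/style separation as distinct parts:
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as odf:
    names = odf.namelist()
    mimetype = odf.read("mimetype").decode()
print(names, mimetype)
```

    A DICOM-to-ODF mapping along the paper's lines would place the XML-encoded data set and the pixel data as further entries in such an archive, so any ZIP-aware tool can open the file even without DICOM software.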

  17. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. The open-source model encourages collaborative and transparent software development and permits unlimited free redistribution of source code to the public. Open-source development is good for science, as it reveals implementation details that are critical to scientific reproducibility but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface and subsurface environment. PFLOTRAN is a massively parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from laptops to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is built upon the well-established, open-source PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e., Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive yet domain-friendly programming language has greatly facilitated collaboration in the code's development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e., extensible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators.
    For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion, while the same code base can be used as a third-party library to provide hydrologic flow, energy transport and biogeochemical capability to the Community Land Model (CLM), part of the open-source Community Earth System Model (CESM) for climate. In this presentation, the advantages and disadvantages of open-source software development in support of geoscience research at government laboratories, universities and the private sector are discussed. Since the code is open source (i.e., transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.
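    The refactoring described above, top-level data structures as extensible classes so new process models can be added without touching the driver, follows a common pattern: a base process-model type defines the interface, and each physics module extends it. PFLOTRAN does this with Fortran 2003 extensible derived types; purely as an illustration, with entirely hypothetical names and placeholder physics, the same pattern looks like this in Python:

```python
class ProcessModel:
    """Base class: the interface every physics process model implements."""
    name = "base"
    def setup(self):
        raise NotImplementedError
    def evaluate_residual(self, state):
        raise NotImplementedError

class FlowModel(ProcessModel):
    """Hypothetical subsurface-flow process model."""
    name = "flow"
    def setup(self):
        self.compressibility = 1e-9             # illustrative parameter
    def evaluate_residual(self, state):
        # Placeholder physics: deviation from atmospheric pressure (Pa).
        return state["pressure"] - 101325.0

class TransportModel(ProcessModel):
    """Hypothetical solute-transport process model."""
    name = "transport"
    def setup(self):
        self.dispersivity = 0.5                 # illustrative parameter
    def evaluate_residual(self, state):
        return state["concentration"]

# The driver is written once against the base interface, so adding a new
# process model (or coupling an external simulator) needs no driver changes.
models = [FlowModel(), TransportModel()]
state = {"pressure": 101325.0, "concentration": 0.01}
for m in models:
    m.setup()
residuals = {m.name: m.evaluate_residual(state) for m in models}
print(residuals)
```

    In the Fortran original the equivalent mechanism is a base derived type with deferred type-bound procedures that each process model extends, which is what makes couplings such as E4D or CLM possible without modifying the core time-stepping loop.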

  18. Open Source Solutions for Libraries: ABCD vs Koha

    ERIC Educational Resources Information Center

    Macan, Bojan; Fernandez, Gladys Vanesa; Stojanovski, Jadranka

    2013-01-01

    Purpose: The purpose of this study is to present an overview of the two open source (OS) integrated library systems (ILS)--Koha and ABCD (ISIS family), to compare their "next-generation library catalog" functionalities, and to give a comparison of other important features available through ILS modules. Design/methodology/approach: Two open source…

  19. Development and evaluation of a lightweight sensor system for emission sampling from open area sources

    EPA Science Inventory

    A new sensor system for mobile and aerial emission sampling was developed for open area sources, such as open burning. The sensor system, termed “Kolibri”, consists of multiple low-cost air quality sensors measuring CO2, CO, and black carbon, samplers for particulate matter with ...

  20. Opening a New Door

    ERIC Educational Resources Information Center

    Waters, John K.

    2007-01-01

    A growing number of K-12 districts are taking the open source plunge, both to cope with tight budgets and to escape proprietary vendor lock-in and expensive upgrade cycles. With the potential for cost savings and a growing number of educational applications, open source software is proving to be an effective alternative for schools willing to make…
