Sample records for open source product

  1. The Commercial Open Source Business Model

    NASA Astrophysics Data System (ADS)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than is possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  2. All-source Information Management and Integration for Improved Collective Intelligence Production

    DTIC Science & Technology

    2011-06-01

    Intelligence (ELINT) • Open Source Intelligence (OSINT) • Technical Intelligence (TECHINT) These intelligence disciplines produce... intelligence, measurement and signature intelligence, signals intelligence, and open-source data, in the production of intelligence. All-source intelligence ...All-Source Information Integration and Management) R&D Project 3 All-Source Intelligence

  3. How Is Open Source Special?

    ERIC Educational Resources Information Center

    Kapor, Mitchell

    2005-01-01

    Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…

  4. An Analysis of Open Source Security Software Products Downloads

    ERIC Educational Resources Information Center

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap persists in the identification of factors related to the success of open source security software. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  5. Building CHAOS: An Operating System for Livermore Linux Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garlick, J E; Dunlap, C M

    2003-02-21

    The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some are commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.

  6. The case for open-source software in drug discovery.

    PubMed

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  7. Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.

    ERIC Educational Resources Information Center

    Newby, Gregory B.; Greenberg, Jane; Jones, Paul

    2003-01-01

    Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)

  8. Cyberscience and the Knowledge-Based Economy. Open Access and Trade Publishing: From Contradiction to Compatibility with Non-Exclusive Copyright Licensing

    ERIC Educational Resources Information Center

    Armbruster, Chris

    2008-01-01

    Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge…

  9. Learning from hackers: open-source clinical trials.

    PubMed

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-02

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  10. Open Source and Design Thinking at NASA: A Vision for Future Software

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2017-01-01

    NASA Mission Control software for the visualization of data has historically been closed, accessible only to small groups of flight controllers, often bound to a specific mission discipline such as flight dynamics, health and status, or mission planning. Open Mission Control Technologies (MCT) provides new capability for NASA mission controllers and, by being fully open source, opens up NASA software for the visualization of mission data to broader communities inside and outside of NASA. Open MCT is the product of a design thinking process within NASA, using participatory design and design sprints to build a product that serves users.

  11. Developing open-source codes for electromagnetic geophysics using industry support

    NASA Astrophysics Data System (ADS)

    Key, K.

    2017-12-01

    Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.

  12. The Open Source DataTurbine Initiative: Streaming Data Middleware for Environmental Observing Systems

    NASA Technical Reports Server (NTRS)

    Fountain, T.; Tilak, S.; Shin, P.; Hubbard, P.; Freudinger, L.

    2009-01-01

    The Open Source DataTurbine Initiative is an international community of scientists and engineers sharing a common interest in real-time streaming data middleware and applications. The technology base of the OSDT Initiative is the DataTurbine open source middleware. Key applications of DataTurbine include coral reef monitoring, lake monitoring and limnology, biodiversity and animal tracking, structural health monitoring and earthquake engineering, airborne environmental monitoring, and environmental sustainability. DataTurbine software emerged as a commercial product in the 1990s from collaborations between NASA and private industry. In October 2007, a grant from the USA National Science Foundation (NSF) Office of Cyberinfrastructure allowed us to transition DataTurbine from a proprietary software product into an open source software initiative. This paper describes the DataTurbine software and highlights key applications in environmental monitoring.

  13. What an open source clinical trial community can learn from hackers

    PubMed Central

    Dunn, Adam G.; Day, Richard O.; Mandl, Kenneth D.; Coiera, Enrico

    2014-01-01

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. Since a similar gap has already been addressed in the software industry by the open source software movement, we examine how the social and technical principles of the movement can be used to guide the growth of an open source clinical trial community. PMID:22553248

  14. Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library

    ERIC Educational Resources Information Center

    Fagan, Jody Condit; Keach, Jennifer A.

    2010-01-01

    When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…

  15. Sources of Information as Determinants of Product and Process Innovation.

    PubMed

    Gómez, Jaime; Salazar, Idana; Vargas, Pilar

    2016-01-01

    In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieving innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent.

  16. Sources of Information as Determinants of Product and Process Innovation

    PubMed Central

    2016-01-01

    In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieving innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent. PMID:27035456

  17. The successes and challenges of open-source biopharmaceutical innovation.

    PubMed

    Allarakhia, Minna

    2014-05-01

    Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human capacity development challenges associated with skill development around big data usage; and the data-material access challenge associated with data and material access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. It is the author's opinion that the assessment of when and how value creation will occur, through open-source biopharmaceutical innovation, is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of today's big-data-based open-source initiatives. Continued focus on early-stage value creation alone is not advisable. Instead, it would be more advisable to adopt an approach where stakeholders transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.

  18. Open Education and the Open Science Economy

    ERIC Educational Resources Information Center

    Peters, Michael A.

    2009-01-01

    Openness as a complex code word for a variety of digital trends and movements has emerged as an alternative mode of "social production" based on the growing and overlapping complexities of open source, open access, open archiving, open publishing, and open science. This paper argues that the openness movement with its reinforcing structure of…

  19. A Comprehensive review on the open source hackable text editor-ATOM

    NASA Astrophysics Data System (ADS)

    Sumangali, K.; Borra, Lokesh; Suraj Mishra, Amol

    2017-11-01

    This document presents a comprehensive study of “Atom”, one of the best open-source code editors available, with many built-in features that support a multitude of programming environments and provide a more productive toolset for developers.

  20. Deciding to Change OpenURL Link Resolvers

    ERIC Educational Resources Information Center

    Johnson, Megan; Leonard, Andrea; Wiswell, John

    2015-01-01

    This article will be of interest to librarians, particularly those in consortia that are evaluating OpenURL link resolvers. This case study contrasts WebBridge (an Innovative Interface product) and LinkSource (EBSCO's product). This study assisted us in the decision-making process of choosing an OpenURL link resolver that was sustainable to…

  1. The Efficient Utilization of Open Source Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baty, Samuel R.

    These are a set of slides on the efficient utilization of open source information. Open source information consists of a vast set of information from a variety of sources. Not only does the quantity of open source information pose a problem, the quality of such information can hinder efforts. To show this, two case studies are mentioned: Iran and North Korea, in order to see how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information has no guarantee of accuracy. Open source information can provide key insights either directly or indirectly: looking at supporting factors (flow of scientists, products and waste from mines, government budgets, etc.); direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows for a more complete picture to be formed. Overlapping sources allow for more precise bounds on times, weights, temperatures, yields or other issues of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual, but rather requires the utilization of a wide range of skill sets held by a team of people.

  2. Open source Matrix Product States: Opening ways to simulate entangled many-body quantum systems in one dimension

    NASA Astrophysics Data System (ADS)

    Jaschke, Daniel; Wall, Michael L.; Carr, Lincoln D.

    2018-04-01

    Numerical simulations are a powerful tool to study quantum systems beyond exactly solvable systems lacking an analytic expression. For one-dimensional entangled quantum systems, tensor network methods, amongst them Matrix Product States (MPSs), have attracted interest from different fields of quantum physics ranging from solid state systems to quantum simulators and quantum computing. Our open source MPS code provides the community with a toolset to analyze the statics and dynamics of one-dimensional quantum systems. Here, we present our open source library, Open Source Matrix Product States (OSMPS), of MPS methods implemented in Python and Fortran2003. The library includes tools for ground state calculation and excited states via the variational ansatz. We also support ground states for infinite systems with translational invariance. Dynamics are simulated with different algorithms, including three algorithms with support for long-range interactions. Convenient features include built-in support for fermionic systems and number conservation with rotational U(1) and discrete Z2 symmetries for finite systems, as well as data parallelism with MPI. We explain the principles and techniques used in this library along with examples of how to efficiently use the general interfaces to analyze the Ising and Bose-Hubbard models. This description includes the preparation of simulations as well as dispatching and post-processing of them.
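
    A common sanity check for any MPS code of the kind described above is to compare against exact diagonalization on a tiny system. The sketch below does this for the two-site transverse-field Ising model in plain NumPy; it does not use the OSMPS library or its API, and is only an illustration of the physics the abstract mentions.

```python
# Exact diagonalization of the two-site transverse-field Ising model,
# H = -J Z.Z - h (X.I + I.X), as a cross-check baseline for MPS results.
# Generic NumPy sketch; not the OSMPS interface.
import numpy as np

def tfim_ground_energy(J, h):
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])
    I = np.eye(2)
    H = (-J * np.kron(Z, Z)
         - h * (np.kron(X, I) + np.kron(I, X)))
    # eigvalsh returns eigenvalues in ascending order; [0] is the ground state
    return np.linalg.eigvalsh(H)[0]

print(tfim_ground_energy(1.0, 0.0))  # -1.0: ferromagnetic (h = 0) limit
print(tfim_ground_energy(0.0, 1.0))  # -2.0: free-spin (J = 0) limit
```

    An MPS ground-state solver run on the same Hamiltonian should reproduce these energies to within its variational tolerance.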

  3. A Dozen Years after Open Source's 1998 Birth, It's Time for "OpenTechComm"

    ERIC Educational Resources Information Center

    Still, Brian

    2010-01-01

    2008 marked the 10-year Anniversary of the Open Source movement, which has had a substantial impact on not only software production and adoption, but also on the sharing and distribution of information. Technical communication as a discipline has taken some advantage of the movement or its derivative software, but this article argues not as much…

  4. Open Source Surrogate Safety Assessment Model, 2017 Enhancement and Update: SSAM Version 3.0 [Tech Brief

    DOT National Transportation Integrated Search

    2016-11-17

    The ETFOMM (Enhanced Transportation Flow Open Source Microscopic Model) Cloud Service (ECS) is a software product sponsored by the U.S. Department of Transportation in conjunction with the Microscopic Traffic Simulation Models and Software... An Op...

  5. An Open Source Simulation System

    NASA Technical Reports Server (NTRS)

    Slack, Thomas

    2005-01-01

    An investigation into the current state of the art of open source real-time programming practices. This document covers what technologies are available, how easy it is to obtain, configure, and use them, and some performance measures taken on the different systems. A matrix of vendors and their products is included as part of this investigation, but this is not an exhaustive list, and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches are investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, source-available proprietary software provided by the vendor for our evaluation.

  6. Open-Source RTOS Space Qualification: An RTEMS Case Study

    NASA Technical Reports Server (NTRS)

    Zemerick, Scott

    2017-01-01

    NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors, notably (1) the diverse nature of RTOSs utilized across NASA, (2) the lack of a single NASA space-qualification criterion, of verification and validation (V&V) analysis, and of test beds, and (3) different RTOS heritages, specifically open-source RTOSs and closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort of the open source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case study, can be utilized as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for NASA agency-wide usage. NASA's involvement in space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.

  7. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
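
    The WMS layering described above boils down to issuing standard OGC GetMap requests against a MapServer endpoint. The sketch below builds such a request with the parameters defined by the WMS 1.1.1 specification; the host URL and layer name are hypothetical placeholders, not values from the article.

```python
# Construct an OGC WMS 1.1.1 GetMap request URL of the kind a
# MapServer CGI endpoint answers. Host and layer are hypothetical.
from urllib.parse import urlencode

def wms_getmap_url(base, layers, bbox, width=600, height=400):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),          # comma-separated layer list
        "SRS": "EPSG:4326",                  # lat/lon coordinate system
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("http://example.org/cgi-bin/mapserv",
                     ["health_districts"], (-10.0, 49.0, 2.0, 61.0))
```

    Because the parameters are standardized, the same request shape works against any remote WMS server, which is exactly what makes multi-source map layering possible.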

  8. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen was dependent on the targeted consumer of a given interface. We will discuss our varying experience using open source products; namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was in part based on our use of Federal Geographic Data Committee (FGDC) Metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on top of XML. Apache Solr, an open source search engine, was used to drive our search interface and as a way to store references to metadata and data exposed via REST endpoints. As was the case with Apache OODT, there was team experience with this component that helped drive this choice. Lastly, OpenSSO, an open source single sign-on service, was used to secure and provide access constraints to our REST-based services. For this product there was little past experience, but given our service-based approach it seemed a natural fit. Given our exposure to open source, we will discuss the tradeoffs and benefits received by the choices made. Moreover, we will dive into the context of how the software packages were used and the impact their design and extensibility had on the construction of the infrastructure. Finally, we will compare our experiences across open source solutions and the attributes that can vary the impression one gets. This comprehensive account of our endeavor should aid others in their assessment and use of open source.
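
    The search interface described above rests on Solr's standard HTTP select handler. The sketch below assembles such a query URL; the host, core name, and field name are hypothetical placeholders, not details from the LMMP deployment.

```python
# Build a query URL for Apache Solr's standard /select request handler.
# Host, core name ("lmmp_metadata"), and field ("title") are hypothetical.
from urllib.parse import urlencode

def solr_select_url(base, core, query, rows=10):
    params = {"q": query, "rows": rows, "wt": "json"}
    return "%s/solr/%s/select?%s" % (base, core, urlencode(params))

url = solr_select_url("http://localhost:8983", "lmmp_metadata",
                      "title:crater")
```

    Exposing search this way keeps the REST layer thin: clients only see HTTP endpoints, and the underlying engine could be swapped without changing the interface.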

  9. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
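
    A flavor of the syntax adaptation such a compiler must perform: MATLAB® slices are 1-based and inclusive, NumPy slices 0-based and exclusive. The helper below is a hypothetical illustration of that mapping, not OMPC's actual emulation layer.

```python
# Hypothetical illustration of MATLAB-to-Python index emulation:
# MATLAB's 1-based, inclusive A(r1:r2, c1:c2) mapped onto NumPy's
# 0-based, exclusive slicing. Not part of the real OMPC API.
import numpy as np

def mslice(a, rows, cols):
    """Emulate MATLAB A(r1:r2, c1:c2) with 1-based inclusive bounds."""
    r1, r2 = rows
    c1, c2 = cols
    return a[r1 - 1:r2, c1 - 1:c2]   # shift start, keep inclusive end

A = np.arange(1, 13).reshape(3, 4)   # a 3x4 test matrix
print(mslice(A, (1, 2), (2, 3)))     # equivalent of MATLAB A(1:2, 2:3)
```

    Multiplying such small per-expression rewrites across a whole codebase is what lets translated MATLAB® modules run on Python's numerical libraries alone.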

  10. Open-Source 3D-Printable Optics Equipment

    PubMed Central

    Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform is illustrated as the control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods. PMID:23544104

  11. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform is illustrated as the control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods.

  12. THE CDF ARCHIVE: HERSCHEL PACS AND SPIRE SPECTROSCOPIC DATA PIPELINE AND PRODUCTS FOR PROTOSTARS AND YOUNG STELLAR OBJECTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Green, Joel D.; Yang, Yao-Lun; Evans, Neal J., II

    2016-03-15

We present the COPS-DIGIT-FOOSH (CDF) Herschel spectroscopy data product archive, and related ancillary data products, along with data fidelity assessments, and a user-created archive in collaboration with the Herschel-PACS and SPIRE ICC groups. Our products include datacubes, contour maps, automated line fitting results, and best 1D spectra products for all protostellar and disk sources observed with PACS in RangeScan mode for two observing programs: the DIGIT Open Time Key Program (KPOT-nevans-1 and SDP-nevans-1; PI: N. Evans), and the FOOSH Open Time Program (OT1-jgreen02-2; PI: J. Green). In addition, we provide our best SPIRE-FTS spectroscopic products for the COPS Open Time Program (OT2-jgreen02-6; PI: J. Green) and FOOSH sources. We include details of data processing, descriptions of output products, and tests of their reliability for user applications. We identify the parts of the data set to be used with caution. The resulting absolute flux calibration has improved in almost all cases. Compared to previous reductions, the resulting rotational temperatures and numbers of CO molecules have changed substantially in some sources. On average, however, the rotational temperatures have not changed substantially (<2%), but the number of warm (Trot ∼ 300 K) CO molecules has increased by about 18%.

  13. The CDF Archive: Herschel PACS and SPIRE Spectroscopic Data Pipeline and Products for Protostars and Young Stellar Objects

    NASA Astrophysics Data System (ADS)

    Green, Joel D.; Yang, Yao-Lun; Evans, Neal J., II; Karska, Agata; Herczeg, Gregory; van Dishoeck, Ewine F.; Lee, Jeong-Eun; Larson, Rebecca L.; Bouwman, Jeroen

    2016-03-01

    We present the COPS-DIGIT-FOOSH (CDF) Herschel spectroscopy data product archive, and related ancillary data products, along with data fidelity assessments, and a user-created archive in collaboration with the Herschel-PACS and SPIRE ICC groups. Our products include datacubes, contour maps, automated line fitting results, and best 1D spectra products for all protostellar and disk sources observed with PACS in RangeScan mode for two observing programs: the DIGIT Open Time Key Program (KPOT_nevans1 and SDP_nevans_1; PI: N. Evans), and the FOOSH Open Time Program (OT1_jgreen02_2; PI: J. Green). In addition, we provide our best SPIRE-FTS spectroscopic products for the COPS Open Time Program (OT2_jgreen02_6; PI: J. Green) and FOOSH sources. We include details of data processing, descriptions of output products, and tests of their reliability for user applications. We identify the parts of the data set to be used with caution. The resulting absolute flux calibration has improved in almost all cases. Compared to previous reductions, the resulting rotational temperatures and numbers of CO molecules have changed substantially in some sources. On average, however, the rotational temperatures have not changed substantially (<2%), but the number of warm (Trot ∼ 300 K) CO molecules has increased by about 18%.

  14. Spatial rainfall data in open source environment

    NASA Astrophysics Data System (ADS)

    Schuurmans, Hanneke; Maarten Verbree, Jan; Leijnse, Hidde; van Heeringen, Klaas-Jan; Uijlenhoet, Remko; Bierkens, Marc; van de Giesen, Nick; Gooijer, Jan; van den Houten, Gert

    2013-04-01

Since January 2013, the Netherlands has had access to an innovative, high-quality rainfall data product for water managers. The product is innovative for two reasons: (i) it was developed in a 'golden triangle' construction, a cooperation between government, business and research; and (ii) the rainfall products are released under the open-source GPL license. The initiative comes from a group of water boards in the Netherlands that joined forces to fund the development of a new rainfall product. Not only data from Dutch radar stations (as is currently used by the Dutch meteorological organization KNMI) are used, but also data from radars in Germany and Belgium. After a radar composite is made, it is adjusted using data from rain gauges (ground truth). This results in 9 different rainfall products that give the best rainfall estimate for each moment. Specific knowledge is necessary to develop this kind of data, so a pool of experts (KNMI, Deltares and 3 universities) participated in the development. The developers' philosophy is that products like this should be developed as open source: this way knowledge is shared and the whole community is able to make suggestions for improvement. In our opinion this is the only way to make real progress in product development, and it also optimizes the financial resources of government organizations. More info (in Dutch): www.nationaleregenradar.nl
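A common way to adjust a radar composite with gauge "ground truth" is a mean-field bias correction: scale the whole field by the ratio of gauge totals to radar totals at the gauge locations. The sketch below is a generic illustration of that idea, not the actual algorithm of the national rainfall product, and all values are invented.

```python
# Mean-field bias adjustment of a radar rainfall field using rain gauges.
# Generic illustration only; not the national rainfall-radar algorithm.
radar_at_gauges = [2.0, 3.5, 1.0, 4.5]   # mm, radar estimates at gauge pixels
gauge_obs       = [2.4, 4.2, 1.2, 5.4]   # mm, co-located gauge measurements

# Single multiplicative factor that makes radar totals match gauge totals.
bias = sum(gauge_obs) / sum(radar_at_gauges)

radar_field = [0.0, 1.5, 3.0, 6.0]       # mm, a few pixels of the composite
adjusted = [bias * r for r in radar_field]
print(f"bias factor = {bias:.2f}")
print(adjusted)
```

In practice, operational products blend several such adjustments (mean-field, spatially varying, climatological), which is why the Dutch system publishes multiple rainfall products for each moment.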

  15. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB functions into Python programs. The imported MATLAB modules will run independently of MATLAB, relying on Python's numerical and scientific libraries. Python offers a stable and mature open-source platform that, in many respects, surpasses commonly used, expensive commercial closed-source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB. OMPC is available at http://ompc.juricap.com.
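The syntax-emulation idea behind such a compiler can be sketched in a few lines of Python. The `MArray` class below is a hypothetical illustration, not OMPC's actual runtime API: it lets translated code keep MATLAB's 1-based, call-style element access instead of rewriting every index.

```python
# Hypothetical sketch of syntax emulation for MATLAB-to-Python translation.
# MArray is an illustration only, not OMPC's real runtime API.

class MArray:
    """Wrap a Python list so translated code can use MATLAB-style 1-based access."""

    def __init__(self, data):
        self._data = list(data)

    def __call__(self, i):
        # MATLAB writes v(1) for the first element; map it to Python's v[0].
        return self._data[i - 1]

    def __len__(self):
        return len(self._data)

# MATLAB source            ->  emitted Python
#   v = [10 20 30];        ->  v = MArray([10, 20, 30])
#   first = v(1);          ->  first = v(1)
v = MArray([10, 20, 30])
first = v(1)
print(first)  # 10
```

Routing element access through a thin wrapper like this is what lets translated modules run on Python's numerical libraries without MATLAB being installed.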

  16. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, the earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS) and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies, including: Object Oriented Data Technology (OODT), an open source data catalog, archive, file management and data grid framework; OpenSSO, an open source access management and federation platform; Solr, an open source enterprise search platform; Redmine, an open source project collaboration and management framework; GDAL, an open source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
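An OGC WMS interface like the one the portal exposes is a plain HTTP API: a GetMap request is just a URL with standard query parameters. The sketch below builds one with Python's standard library; the endpoint and layer name are made-up placeholders, not LMMP's actual service.

```python
from urllib.parse import urlencode

# Build an OGC WMS 1.3.0 GetMap request URL. The endpoint and layer name
# are hypothetical placeholders, not the real LMMP service.
endpoint = "https://example.org/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "lunar_basemap",   # assumed layer name
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",   # WMS 1.3.0 uses lat/lon axis order for EPSG:4326
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}
url = endpoint + "?" + urlencode(params)
print(url)
```

Because the interface is standardized, the same URL pattern works from custom applications, mobile clients, or off-the-shelf GIS tools, which is exactly the platform independence the abstract describes.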

  17. Computer Forensics Education - the Open Source Approach

    NASA Astrophysics Data System (ADS)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  18. Open Source Software and Design-Based Research Symbiosis in Developing 3D Virtual Learning Environments: Examples from the iSocial Project

    ERIC Educational Resources Information Center

    Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla

    2014-01-01

    Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…

  19. Dynamic robustness of knowledge collaboration network of open source product development community

    NASA Astrophysics Data System (ADS)

    Zhou, Hong-Li; Zhang, Xiao-Dong

    2018-01-01

As an emergent innovative design style, open source product development communities are characterized by a self-organizing, mass-collaborative, networked structure. The robustness of the community is critical to its performance. Using the complex network modeling method, the knowledge collaboration network of the community is formulated, and the robustness of the network is systematically and dynamically studied. The characteristics of the network along the development period determine that its robustness should be studied from three time stages: the start-up, development and mature stages of the network. Five kinds of user-loss patterns are designed to assess the network's robustness under different situations in each of these three time stages. Two indexes, the largest connected component and the network efficiency, are used to evaluate the robustness of the community. The proposed approach is applied in an existing open source car design community. The results indicate that the knowledge collaboration networks show different levels of robustness in different stages and under different user-loss patterns. Such analysis can be applied to provide protection strategies for the key users involved in knowledge dissemination and knowledge contribution at different stages of the network, thereby promoting the sustainable and stable development of the open source community.
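Both robustness indexes can be computed directly from an adjacency list. The pure-Python sketch below uses a toy graph, not the paper's community data: it removes a hub user and recomputes the largest connected component and the network efficiency, here taken as the mean of 1/d(i,j) over ordered node pairs.

```python
from collections import deque

def bfs_distances(adj, src):
    """Hop distances from src to every reachable node."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def largest_component(adj):
    """Size of the largest connected component."""
    seen, best = set(), 0
    for node in adj:
        if node not in seen:
            comp = bfs_distances(adj, node)
            seen |= comp.keys()
            best = max(best, len(comp))
    return best

def efficiency(adj):
    """Global efficiency: mean of 1/d over all ordered node pairs."""
    n = len(adj)
    total = 0.0
    for u in adj:
        dist = bfs_distances(adj, u)
        total += sum(1.0 / d for v, d in dist.items() if v != u)
    return total / (n * (n - 1))

# Toy collaboration network: user 0 is a hub; users 1 and 2 also collaborate.
adj = {0: {1, 2, 3, 4}, 1: {0, 2}, 2: {0, 1}, 3: {0}, 4: {0}}
print(largest_component(adj), round(efficiency(adj), 3))    # 5 0.75

# Simulate losing the hub user: delete node 0 and its edges.
adj2 = {u: vs - {0} for u, vs in adj.items() if u != 0}
print(largest_component(adj2), round(efficiency(adj2), 3))  # 2 0.167
```

The sharp drop in both indexes after removing the hub mirrors the paper's motivation for protecting key users involved in knowledge dissemination.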

  20. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems. Volume 2 Understanding Open Architecture Software Systems: Licensing and Security Research and Recommendations

    DTIC Science & Technology

    2016-01-06

of- breed software components and software product lines (SPLs) that are subject to different IP license and cybersecurity requirements. The... commercially priced closed source software components, to be used in the design, implementation, deployment, and evolution of open architecture (OA... breed software components and software product lines (SPLs) that are subject to different IP license and cybersecurity requirements. The Department

  1. An Analysis of the President’s Budgetary Proposals for Fiscal Year 2006

    DTIC Science & Technology

    2005-03-01

    Domestic Product (Average percentage change from CBO’s baseline) Source: Congressional Budget Office. Notes: The “textbook” growth model is an...Global Insight Closed-Economy Life-Cycle Model Open-Economy Life-Cycle Model Textbook Model Memorandum: Gross National Product Open-Economy Life-Cycle...domestic product in the models . 2. Over time, however, increased investment will enlarge the capital stock, in turn reducing the pretax rate of return and

  2. Embracing Open Source for NASA's Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin

    2017-01-01

The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when resulting from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate and analyze them. This talk focuses on the challenges of open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies, running the gamut of tracking issues, properly documenting build processes, engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions, and our contributions back to the community. Finally, we will introduce the most recent OSS contributions from the NASA Earth Science program and promote these projects for wider community review and adoption.

  3. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yidong Xia; Mitch Plummer; Robert Podgorney

    2016-02-01

Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that the heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
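The flow-rate dependence noted in the sensitivity analysis follows from the basic heat-extraction relation P = mdot x cp x (T_prod - T_inj). The sketch below applies it with illustrative assumptions; none of these numbers come from the FALCON study.

```python
# Back-of-envelope thermal output of an EGS doublet: P = mdot * cp * dT.
# All values are illustrative assumptions, not results from the FALCON model.
cp_water = 4186.0      # J/(kg K), specific heat of water
t_production = 165.0   # degrees C, assumed production temperature
t_injection = 65.0     # degrees C, assumed reinjection temperature

for mdot in (20.0, 40.0, 80.0):   # kg/s, water mass flow rate
    power_mw = mdot * cp_water * (t_production - t_injection) / 1e6
    print(f"mdot = {mdot:4.0f} kg/s -> {power_mw:6.2f} MW thermal")
```

Doubling the mass flow rate doubles the instantaneous thermal power, but in a real reservoir it also cools the rock faster, which is why the abstract couples production rate and lifespan in the same sensitivity result.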

  4. Evaluation of selective vs. point-source perforating for hydraulic fracturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Underwood, P.J.; Kerley, L.

    1996-12-31

This paper is a case history comparing and evaluating the effects of fracturing the Reef Ridge Diatomite formation in the Midway-Sunset Field, Kern County, California, using "select-fire" and "point-source" perforating completions. A description of the reservoir, production history, and fracturing techniques used leading up to this study is presented. Fracturing treatment analysis and production history matching were used to evaluate the reservoir and fracturing parameters for both completion types. The work showed that single fractures were created with the point-source (PS) completions, and multiple fractures resulted from many of the select-fire (SF) completions. A good correlation was developed between productivity and the product of formation permeability, net fracture height, bottomhole pressure, and propped fracture length. Results supported the continued development of 10 wells using the PS concept with a more efficient treatment design, resulting in substantial cost savings.

  5. A New Architecture for Visualization: Open Mission Control Technologies

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2017-01-01

Open Mission Control Technologies (MCT) is a new architecture for visualization of mission data. Driven by requirements for new mission capabilities, including distributed mission operations, access to data anywhere, customization by users, synthesis of multiple data sources, and flexibility for multi-mission adaptation, Open MCT provides users with an integrated, customizable environment. Developed at NASA's Ames Research Center (ARC), in collaboration with NASA's Advanced Multimission Operations System (AMMOS) and NASA's Jet Propulsion Laboratory (JPL), Open MCT is getting its first mission use on the Jason 3 Mission, and is also available in the testbed for the Mars 2020 Rover and for development use for NASA's Resource Prospector Lunar Rover. The open source nature of the project provides for use outside of space missions, including open source contributions from a community of users. The defining features of Open MCT for mission users are data integration, end user composition and multiple views. Data integration provides access to mission data across domains in one place, making data such as activities, timelines, telemetry, imagery, event timers and procedures available in one place, without application switching. End user composition provides users with layouts, which act as a canvas to assemble visualizations. Multiple views provide the capability to view the same data in different ways, with live switching of data views in place. Open MCT is browser based, and works on the desktop as well as tablets and phones, providing access to data anywhere. An early use case for mobile data access took place on the Resource Prospector (RP) Mission Distributed Operations Test, in which rover engineers in the field were able to view telemetry on their phones. We envision this capability providing decision support to on-console operators from off-duty personnel. The plug-in architecture also allows for adaptation for different mission capabilities. Different data types and capabilities may be added or removed using plugins. An API provides a means to write new capabilities and to create data adaptors. Data plugins exist for mission data sources for NASA missions. Adaptors have been written by international and commercial users. Open MCT is open source. Open source enables collaborative development across organizations and also makes the product available outside of the space community, providing a potential source of usage and ideas to drive product design and development. The combination of open source with an Apache 2 license, and distribution on GitHub, has enabled an active community of users and contributors. The spectrum of users for Open MCT is, to our knowledge, unprecedented for mission software. In addition to our NASA users, we have, through open source, had users and inquiries on projects ranging from the Internet of Things, to radio hobbyists, to farming projects. We have an active community of contributors, enabling a flow of ideas inside and outside of the space community.

  6. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools, or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and license, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  7. Alone Together: A Socio-Technical Theory of Motivation, Coordination and Collaboration Technologies in Organizing for Free and Open Source Software Development

    ERIC Educational Resources Information Center

    Howison, James

    2009-01-01

    This dissertation presents evidence that the production of Free and Open Source Software (FLOSS) is far more alone than together; it is far more often individual work done "in company" than it is teamwork. When tasks appear too large for an individual they are more likely to be deferred until they are easier rather than be undertaken through…

  8. Performance Assessment of Network Intrusion-Alert Prediction

    DTIC Science & Technology

    2012-09-01

the threats. In this thesis, we use Snort to generate the intrusion detection alerts. 2. SNORT Snort is an open source network intrusion...standard for IPS. (Snort, 2012) We choose Snort because it is an open source product that is free to download and can be deployed cross-platform...Learning & prediction in relational time series: A survey. 21st Behavior Representation in Modeling & Simulation (BRIMS) Conference 2012, 93–100. Tan

  9. Open-source point-of-care electronic medical records for use in resource-limited settings: systematic review and questionnaire surveys

    PubMed Central

    Bru, Juan; Berger, Christopher A

    2012-01-01

Background: Point-of-care electronic medical records (EMRs) are a key tool to manage chronic illness. Several EMRs have been developed for use in treating HIV and tuberculosis, but their applicability to primary care, technical requirements and clinical functionalities are largely unknown. Objectives: This study aimed to address the needs of clinicians from resource-limited settings without reliable internet access who are considering adopting an open-source EMR. Study eligibility criteria: Open-source point-of-care EMRs suitable for use in areas without reliable internet access. Study appraisal and synthesis methods: The authors conducted a comprehensive search of all open-source EMRs suitable for sites without reliable internet access. The authors surveyed clinician users and technical implementers from a single site and technical developers of each software product. The authors evaluated availability, cost and technical requirements. Results: The hardware and software for all six systems is easily available, but they vary considerably in proprietary components, installation requirements and customisability. Limitations: This study relied solely on self-report from informants who developed and who actively use the included products. Conclusions and implications of key findings: Clinical functionalities vary greatly among the systems, and none of the systems yet meet minimum requirements for effective implementation in a primary care resource-limited setting. The safe prescribing of medications is a particular concern with current tools. The dearth of fully functional EMR systems indicates a need for a greater emphasis by global funding agencies to move beyond disease-specific EMR systems and develop a universal open-source health informatics platform. PMID:22763661

  10. Using OpenOffice as a Portable Interface to JAVA-Based Applications

    NASA Astrophysics Data System (ADS)

    Comeau, T.; Garrett, B.; Richon, J.; Romelfanger, F.

    2004-07-01

    STScI previously used Microsoft Word and Microsoft Access, a Sybase ODBC driver, and the Adobe Acrobat PDF writer, along with a substantial amount of Visual Basic, to generate a variety of documents for the internal Space Telescope Grants Administration System (STGMS). While investigating an upgrade to Microsoft Office XP, we began considering alternatives, ultimately selecting an open source product, OpenOffice.org. This reduces the total number of products required to operate the internal STGMS system, simplifies the build system, and opens the possibility of moving to a non-Windows platform. We describe the experience of moving from Microsoft Office to OpenOffice.org, and our other internal uses of OpenOffice.org in our development environment.

  11. 16 CFR 1211.14 - Instruction manual.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... opener. 4. Where possible, install door opener 7 feet or more above the floor. For products requiring an emergency release, mount the emergency release 6 feet above the floor. 5. Do not connect opener to source of... height of 5 feet so small children cannot reach it, and (c) away from all moving parts of the door. 7...

  12. Open source system OpenVPN in a function of Virtual Private Network

    NASA Astrophysics Data System (ADS)

    Skendzic, A.; Kovacic, B.

    2017-05-01

Using Virtual Private Networks (VPNs) can establish a high level of security in network communication. VPN technology enables secure networking over distributed or public network infrastructure. A VPN applies its own security and management rules inside networks, and can be set up over different communication channels, such as the Internet or a separate ISP communication infrastructure. A VPN creates a secure communication channel over a public network between two endpoints (computers). OpenVPN is an open source software product, licensed under the GNU General Public License (GPL), that can be used to establish VPN communication between two computers inside a business local network over public communication infrastructure. It uses dedicated security protocols with 256-bit encryption and is capable of traversing network address translators (NATs) and firewalls. It allows computers to authenticate each other using a pre-shared secret key, certificates, or a username and password. This work gives a review of VPN technology with a special accent on OpenVPN. The paper also discusses the comparative and financial benefits of using open source VPN software in a business environment.
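A minimal client-side configuration shows the pieces the abstract mentions, certificate-based mutual authentication and a 256-bit cipher, in practice. Hostname, port and file names below are placeholders for illustration, not a recommendation for any specific deployment.

```
# Minimal OpenVPN client config (placeholders, for illustration only)
client
dev tun
proto udp
remote vpn.example.com 1194    # placeholder server and port
nobind
persist-key
persist-tun
ca ca.crt                      # CA certificate
cert client.crt                # this client's certificate
key client.key                 # this client's private key
remote-cert-tls server         # verify the server certificate's role
cipher AES-256-GCM             # 256-bit encryption, as discussed above
verb 3
```

The matching server side would reference its own certificate and key signed by the same CA, which is what gives the mutual authentication described in the abstract.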

  13. Creating system engineering products with executable models in a model-based engineering environment

    NASA Astrophysics Data System (ADS)

    Karban, Robert; Dekens, Frank G.; Herzig, Sebastian; Elaasar, Maged; Jankevičius, Nerijus

    2016-08-01

    Applying systems engineering across the life-cycle results in a number of products built from interdependent sources of information using different kinds of system level analysis. This paper focuses on leveraging the Executable System Engineering Method (ESEM) [1] [2], which automates requirements verification (e.g. power and mass budget margins and duration analysis of operational modes) using executable SysML [3] models. The particular value proposition is to integrate requirements, and executable behavior and performance models for certain types of system level analysis. The models are created with modeling patterns that involve structural, behavioral and parametric diagrams, and are managed by an open source Model Based Engineering Environment (named OpenMBEE [4]). This paper demonstrates how the ESEM is applied in conjunction with OpenMBEE to create key engineering products (e.g. operational concept document) for the Alignment and Phasing System (APS) within the Thirty Meter Telescope (TMT) project [5], which is under development by the TMT International Observatory (TIO) [5].
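The kind of requirement check such executable models automate, for example a mass-budget margin, can be mimicked in a few lines of plain Python. This is an analogue for illustration only; the actual method uses executable SysML parametrics in OpenMBEE, and every number here is invented.

```python
# Illustrative mass-budget margin verification, analogous to the checks an
# executable SysML model automates. All allocations and masses are invented.
components = {"optics_bench": 110.0, "actuators": 32.0, "electronics": 18.0}  # kg
allocation = 200.0        # kg, system-level mass allocation (assumed)
required_margin = 0.15    # requirement: keep at least 15% margin (assumed)

total_mass = sum(components.values())
margin = (allocation - total_mass) / allocation
print(f"total = {total_mass:.1f} kg, margin = {margin:.1%}")
assert margin >= required_margin, "mass budget requirement violated"
```

Keeping such checks executable means every change to a component mass re-verifies the requirement automatically, which is the value proposition of integrating requirements with behavior and performance models.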

  14. C3I and Modelling and Simulation (M&S) Interoperability

    DTIC Science & Technology

    2004-03-01

    customised Open Source products. The technical implementation is based on the use of the eXtended Markup Language (XML) and Python. XML is developed...to structure, store and send information. The language is focused on the description of data. Python is a portable, interpreted, object-oriented...programming language. A wide variety of usable Open Source projects has been issued by the Python community. 3.1 Phase 1: Feasibility Studies Phase 1 was
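    The XML-plus-Python combination described in this record can be sketched with the standard library alone. The element names below (`trackReport`, `track`, `position`) are hypothetical examples of structured C3I data, not taken from the report.

```python
import xml.etree.ElementTree as ET

# Hypothetical track report; element and attribute names are illustrative only.
report = """
<trackReport>
  <track id="T-001">
    <type>surface</type>
    <position lat="54.32" lon="10.14"/>
  </track>
</trackReport>
"""

root = ET.fromstring(report)
for track in root.findall("track"):
    pos = track.find("position")
    print(track.get("id"), track.findtext("type"),
          pos.get("lat"), pos.get("lon"))
# prints: T-001 surface 54.32 10.14
```

    This is the pattern the record alludes to: XML describes and transports the data, while Python parses and processes it.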

  15. Negative ion production in large volume source with small deposition of cesium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacquot, C.; Pamela, J.; Riz, D.

    1996-03-01

    Experimental data on the enhancement of D⁻ (H⁻) negative ion production due to cesium injection into a large volume multiampere negative ion source (MANTIS) are described. The directed deposition of small cesium amounts (5–100 mg) from a compact, movable oven, placed into the central part of the MANTIS gas-discharge box, was used. A calorimetrically measured D⁻ beam with an intensity up to 1.6 A and an extracted current density up to 4.2 mA/cm² (beam energy 25 kV) was obtained. Exactly 30 mg of cesium provides at least one month of source operation (1000 pulses with a discharge pulse duration of 4 s). The effect of cesium on NI enhancement was immediately displayed after the distributed Cs deposition, but it needed some "conditioning" of cesium by tens of discharge pulses (or by several hours' "pause") in the case of a localized Cs deposition. No degradation of extraction-acceleration voltage holding within the tested range of cesium injection was observed. © 1996 American Institute of Physics.

  16. ION PRODUCING MECHANISM

    DOEpatents

    MacKenzie, K.R.

    1958-09-01

    An ion source is described for use in a calutron and more particularly deals with an improved filament arrangement for a calutron. According to the invention, the ion source block has a gas ionizing passage open along two adjoining sides of the block. A filament is disposed in overlying relation to one of the passage openings and has a greater width than the passage width, so that both the filament and opening lengths are parallel and extend in a transverse relation to the magnetic field. The other passage opening is parallel to the length of the magnetic field. This arrangement is effective in assisting in the production of a stable, long-lived arc for the general improvement of calutron operation.

  17. Open source 3D printers: an appropriate technology for building low cost optics labs for the developing communities

    NASA Astrophysics Data System (ADS)

    Gwamuri, J.; Pearce, Joshua M.

    2017-08-01

    The recent introduction of RepRap (self-replicating rapid prototyper) 3-D printers and the resultant open source technological improvements have resulted in affordable 3-D printing, enabling low-cost distributed manufacturing for individuals. This development, and others such as the rise of open source-appropriate technology (OSAT) and solar-powered 3-D printing, are moving 3-D printing from an industry-based technology to one that could be used in the developing world for sustainable development. In this paper, we explore some specific technological improvements and how distributed manufacturing with open-source 3-D printing can be used to provide open-source 3-D printable optics components for developing world communities through the ability to print less expensive and customized products. This paper presents an open-source, low-cost optical equipment library which enables relatively easily adapted customizable designs, with the potential of changing the way optics is taught in resource-constrained communities. The study shows that this method of scientific hardware development has the potential to enable a much broader audience to participate in optical experimentation, both as research and teaching platforms. Conclusions on the technical viability of 3-D printing to assist in development, and recommendations on how developing communities can fully exploit this technology to improve the learning of optics through hands-on methods, are outlined.

  18. Neural ensemble communities: open-source approaches to hardware for large-scale electrophysiology.

    PubMed

    Siegle, Joshua H; Hale, Gregory J; Newman, Jonathan P; Voigts, Jakob

    2015-06-01

    One often-overlooked factor when selecting a platform for large-scale electrophysiology is whether or not a particular data acquisition system is 'open' or 'closed': that is, whether or not the system's schematics and source code are available to end users. Open systems have a reputation for being difficult to acquire, poorly documented, and hard to maintain. With the arrival of more powerful and compact integrated circuits, rapid prototyping services, and web-based tools for collaborative development, these stereotypes must be reconsidered. We discuss some of the reasons why multichannel extracellular electrophysiology could benefit from open-source approaches and describe examples of successful community-driven tool development within this field. In order to promote the adoption of open-source hardware and to reduce the need for redundant development efforts, we advocate a move toward standardized interfaces that connect each element of the data processing pipeline. This will give researchers the flexibility to modify their tools when necessary, while allowing them to continue to benefit from the high-quality products and expertise provided by commercial vendors. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Evaluation of Teacher Perceptions and Potential of OpenOffice in a K-12 School District

    ERIC Educational Resources Information Center

    Vajda, James; Abbitt, Jason T.

    2011-01-01

    Through this mixed-method evaluation study the authors investigated a pilot implementation of an open-source productivity suite for teachers in a K-12 public school district. The authors evaluated OpenOffice version 3.0 using measures identified by the technology acceptance model as predictors of acceptance and use of technology systems. During a…

  20. Automatic tracking of red blood cells in micro channels using OpenCV

    NASA Astrophysics Data System (ADS)

    Rodrigues, Vânia; Rodrigues, Pedro J.; Pereira, Ana I.; Lima, Rui

    2013-10-01

    The present study aims to develop an automatic method able to track red blood cell (RBC) trajectories flowing through a microchannel using the Open Source Computer Vision library (OpenCV). The developed method is based on optical flow calculation assisted by the maximization of the template-matching product. The experimental results show a good functional performance of this method.
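    The template-matching step can be illustrated without OpenCV itself: the sketch below does a brute-force search for the position maximizing the sum of elementwise products between a template and each frame patch. The tiny grid and the bright "cell" are invented toy data, not the authors' imagery, and production code would use OpenCV's `cv2.matchTemplate` with normalized correlation instead.

```python
def match_template(frame, template):
    """Return (row, col) maximizing the sum of elementwise products
    between the template and a frame patch -- the 'template-matching
    product' the abstract refers to, in its simplest unnormalized form."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("-inf"), (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            score = sum(frame[r + i][c + j] * template[i][j]
                        for i in range(th) for j in range(tw))
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Toy grayscale frame with a bright 2x2 "cell" whose top-left corner
# is at row 1, column 2.
frame = [[0, 0, 0, 0, 0],
         [0, 0, 9, 9, 0],
         [0, 0, 9, 9, 0],
         [0, 0, 0, 0, 0]]
template = [[9, 9],
            [9, 9]]
print(match_template(frame, template))  # -> (1, 2)
```

    Repeating this search frame after frame, seeded by an optical-flow prediction of where the cell moved, yields the trajectory of each tracked cell.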

  1. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is a production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  2. The Core Flight System (cFS) Community: Providing Low Cost Solutions for Small Spacecraft

    NASA Technical Reports Server (NTRS)

    McComas, David; Wilmot, Jonathan; Cudmore, Alan

    2016-01-01

    In February 2015 the NASA Goddard Space Flight Center (GSFC) completed the open source release of the entire Core Flight Software (cFS) suite. After the open source release a multi-NASA center Configuration Control Board (CCB) was established that has managed multiple cFS product releases. The cFS was developed and is being maintained in compliance with the NASA Class B software development process requirements and the open source release includes all Class B artifacts. The cFS is currently running on three operational science spacecraft and is being used on multiple spacecraft and instrument development efforts. While the cFS itself is a viable flight software (FSW) solution, we have discovered that the cFS community is a continuous source of innovation and growth that provides products and tools that serve the entire FSW lifecycle and future mission needs. This paper summarizes the current state of the cFS community, the key FSW technologies being pursued, the development/verification tools and opportunities for the small satellite community to become engaged. The cFS is a proven high quality and cost-effective solution for small satellites with constrained budgets.

  3. Mother Lode: The Untapped Rare Earth Mineral Resources of Vietnam

    DTIC Science & Technology

    2013-11-01

    Library of Congress, Congressional Research Service. Rare Earth Elements: The Global Supply Chain, 4. 14 Tse, Pui-Kwan. China's Rare-Earth Industry...U.S. Geological Survey Open-File Report 2011–1042, 2. Figure 2. Global REO production, 1960-2011. Source: Tse, Pui-Kwan. China's Rare-Earth...3 compiled from three sources: Tse, Pui-Kwan. China's Rare-Earth Industry: U.S. Geological Survey Open-File Report 2011–1042, 4; Areddy, James T

  4. Getting Open Source Right for Big Data Analytics: Software Sharing, Governance, Collaboration and Most of All, Fun!

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.

    2013-12-01

    A wave of open source big data analytic infrastructure is currently shaping government, private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software e.g., the Apache Hadoop project and its ecosystem of related efforts including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez and Yarn, to name a few; the Berkeley AMPLab stack which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies, offerings from commercial companies building products around these tools e.g., Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities including low latency support/in-memory, versus record oriented file I/O, high availability, support for the Map Reduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products - they are all released under an open source license e.g., Apache2, MIT, BSD, GPL/LGPL, etc.; all thrive in various ecosystems, such as Apache, or Berkeley AMPLab; all are developed collaboratively, and all technologies provide plug in architecture models and methodologies for allowing others to contribute, and participate via various community models. This talk will cover the open source aspects and governance aspects of the aforementioned Big Data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will be by example, using several national deployments and Big Data initiatives stemming from the Administration including DARPA's XDATA program; NASA's CMAC program; NSF's EarthCube and geosciences BigData projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment and understanding.

  5. Application of polar orbiter products in weather forecasting using open source tools and open standards

    NASA Astrophysics Data System (ADS)

    Plieger, Maarten; de Vreede, Ernst

    2015-04-01

    EUMETSAT disseminates data for a number of polar satellites. At KNMI these data are not fully used for operational weather forecasting, mainly because of the irregular coverage and the lack of tools for handling these different types of data and products. For weather forecasting there is a lot of interest in the application of products from these polar orbiters. One of the key aspects is the high resolution of these products, which can complement the information provided by numerical weather forecasts. Another advantage over geostationary satellites is the high coverage at higher latitudes and the lack of parallax. Products like the VIIRS day-night band offer many possibilities for this application. This presentation will describe a project that aims to make available a number of products from polar satellites to the forecasting operation. The goal of the project is to enable easy and timely access to polar orbiter products and enable combined presentations of satellite imagery with model data. The system will be able to generate RGB composites ("false colour images") for operational use. The system will be built using open source components and open standards. Pytroll components are used for data handling, reprojection and derived product generation. For interactive presentation of imagery the browser-based ADAGUC WMS viewer component is used. Image generation is done by ADAGUC server components, which provide OGC WMS services. Polar satellite products are stored as true color RGBA data in the NetCDF file format; the satellite swaths are stored as regular grids with their own custom geographical projection. The ADAGUC WMS system is able to reproject, render and combine these data in a web browser interactively. Results and lessons learned will be presented at the conference.
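    The OGC WMS services mentioned above are plain HTTP requests, so a client can be sketched with the standard library. The parameters below are the standard WMS 1.3.0 GetMap parameters; the endpoint URL and layer name are hypothetical placeholders, not actual KNMI services.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height):
    """Build an OGC WMS 1.3.0 GetMap request of the kind an ADAGUC-style
    server answers. Endpoint and layer name are placeholders."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # WMS 1.3.0 + EPSG:4326 uses lat/lon axis order:
        # min_lat, min_lon, max_lat, max_lon
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/adaguc-server",
                     "viirs_day_night_band",
                     (45.0, -10.0, 70.0, 30.0), 1024, 768)
print(url)
```

    A browser viewer such as the ADAGUC component issues exactly these requests as the user pans and zooms, overlaying the returned transparent PNG tiles on model data.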

  6. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions.

    PubMed

    Shenoy, Shailesh M

    2016-07-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.

  7. A Survey of Usability Practices in Free/Libre/Open Source Software

    NASA Astrophysics Data System (ADS)

    Paul, Celeste Lyn

    A review of case studies about usability in eight Free/Libre/Open Source Software (FLOSS) projects showed that an important issue regarding a usability initiative in the project was the lack of user research. User research is a key component in the user-centered design (UCD) process and a necessary step for creating usable products. Reasons why FLOSS projects suffered from a lack of user research included poor or unclear project leadership, cultural differences between developer and designers, and a lack of usability engineers. By identifying these critical issues, the FLOSS usability community can begin addressing problems in the efficacy of usability activities and work towards creating more usable FLOSS products.

  18. Depositional environment and distribution of Late Cretaceous "source rocks" from Costa Rica to West Africa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erlich, R.N.; Sofer, Z.; Pratt, L.M.

    1993-02-01

    Late Cretaceous "source rocks" from Costa Rica, western and eastern Venezuela, and Trinidad were studied using organic and inorganic geochemistry, biostratigraphy, and sedimentology in order to determine their depositional environments. Bulk mineralogy and major element geochemistry for 304 samples were combined with Rock-Eval data and extract biomarker analysis to infer the types and distributions of the various Late Cretaceous productivity systems represented in the dataset. When data from this study are combined with published and proprietary data from offshore West Africa, Guyana/Suriname, and the central Caribbean, they show that these Late Cretaceous units can be correlated by their biogeochemical characteristics to establish their temporal and spatial relationships. Paleogeographic maps constructed for the early to late Cenomanian, Turonian, Coniacian to middle Santonian, and late Santonian to latest Campanian show that upwelling and excessive fluvial runoff were probably the dominant sources of nutrient supply to the coastal productivity systems. The late Santonian to Maastrichtian rocks examined in this study indicate that organic material was poorly preserved after deposition, even though biologic productivity remained constant or changed only slightly. A rapid influx of oxygenated bottom water may have occurred following the opening of a deep water connection between the North and South Atlantic oceans, and/or separation of India from Africa and the establishment of an Antarctic oceanic connection. This study suggests that the most important factors that controlled source rock quality in northern South America were productivity, preservation, degree of clastic dilution, and subsurface diagenesis.

  9. Developing open source, self-contained disease surveillance software applications for use in resource-limited settings

    PubMed Central

    2012-01-01

    Background Emerging public health threats often originate in resource-limited countries. In recognition of this fact, the World Health Organization issued revised International Health Regulations in 2005, which call for significantly increased reporting and response capabilities for all signatory nations. Electronic biosurveillance systems can improve the timeliness of public health data collection, aid in the early detection of and response to disease outbreaks, and enhance situational awareness. Methods As components of its Suite for Automated Global bioSurveillance (SAGES) program, The Johns Hopkins University Applied Physics Laboratory developed two open-source, electronic biosurveillance systems for use in resource-limited settings. OpenESSENCE provides web-based data entry, analysis, and reporting. ESSENCE Desktop Edition provides similar capabilities for settings without internet access. Both systems may be configured to collect data using locally available cell phone technologies. Results ESSENCE Desktop Edition has been deployed for two years in the Republic of the Philippines. Local health clinics have rapidly adopted the new technology to provide daily reporting, thus eliminating the two-to-three week data lag of the previous paper-based system. Conclusions OpenESSENCE and ESSENCE Desktop Edition are two open-source software products with the capability of significantly improving disease surveillance in a wide range of resource-limited settings. These products, and other emerging surveillance technologies, can assist resource-limited countries' compliance with the revised International Health Regulations. PMID:22950686
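    The early-detection capability these systems provide rests on aberration detection over daily case counts. As a hedged illustration (not the actual SAGES or ESSENCE algorithm), the sketch below flags a day whose count exceeds a rolling baseline mean by three standard deviations; the counts are invented toy data.

```python
from statistics import mean, stdev

def alert(history, today, z_threshold=3.0):
    """Flag 'today' if it exceeds baseline mean + z_threshold * sd.
    Illustrative aberration detection, not the actual SAGES method."""
    mu, sd = mean(history), stdev(history)
    return today > mu + z_threshold * max(sd, 1e-9)

baseline = [4, 6, 5, 7, 5, 6, 4]   # toy daily counts from one clinic
print(alert(baseline, 6))   # -> False (within normal range)
print(alert(baseline, 30))  # -> True  (possible outbreak signal)
```

    Daily reporting via cell phone, as described in the Results, is what makes such a baseline current enough for the alert to be timely.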

  10. Open-source LCA tool for estimating greenhouse gas emissions from crude oil production using field characteristics.

    PubMed

    El-Houjeiri, Hassan M; Brandt, Adam R; Duffy, James E

    2013-06-04

    Existing transportation fuel cycle emissions models are either general and calculate nonspecific values of greenhouse gas (GHG) emissions from crude oil production, or are not available for public review and auditing. We have developed the Oil Production Greenhouse Gas Emissions Estimator (OPGEE) to provide open-source, transparent, rigorous GHG assessments for use in scientific assessment, regulatory processes, and analysis of GHG mitigation options by producers. OPGEE uses petroleum engineering fundamentals to model emissions from oil and gas production operations. We introduce OPGEE and explain the methods and assumptions used in its construction. We run OPGEE on a small set of fictional oil fields and explore model sensitivity to selected input parameters. Results show that upstream emissions from petroleum production operations can vary from 3 gCO2/MJ to over 30 gCO2/MJ using realistic ranges of input parameters. Significant drivers of emissions variation are steam injection rates, water handling requirements, and rates of flaring of associated gas.
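    The reported sensitivity of upstream intensity to a few field parameters can be illustrated with a toy calculation. The coefficients below are invented placeholders chosen only to show the structure of such a model; they are not OPGEE's actual emission factors.

```python
def emissions_intensity(flaring_scm_per_bbl, steam_oil_ratio, water_oil_ratio):
    """Toy upstream GHG intensity (gCO2-eq per MJ of crude) driven by the
    three levers named in the abstract: flaring, steam injection, and
    water handling. All coefficients are illustrative, not OPGEE values."""
    MJ_PER_BBL = 6100.0                        # approx. energy in a barrel
    flaring = flaring_scm_per_bbl * 2000.0     # gCO2 per scm flared (toy)
    steam = steam_oil_ratio * 25000.0          # gCO2 per bbl steam (toy)
    water = water_oil_ratio * 1500.0           # gCO2 per bbl water (toy)
    return (flaring + steam + water) / MJ_PER_BBL

low = emissions_intensity(2.0, 0.0, 1.0)    # conventional field
high = emissions_intensity(30.0, 3.0, 8.0)  # heavy, flaring-intensive field
print(round(low, 1), round(high, 1))  # -> 0.9 24.1
```

    Even this crude sketch reproduces the qualitative finding: an order-of-magnitude spread in gCO2/MJ between fields, driven largely by flaring and steam injection rates.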

  11. Energy Spectral Behaviors of Communication Networks of Open-Source Communities

    PubMed Central

    Yang, Jianmei; Yang, Huijie; Liao, Hao; Wang, Jiangtao; Zeng, Jinqun

    2015-01-01

    Large-scale online collaborative production activities in open-source communities must be accompanied by large-scale communication activities. Nowadays, the production activities of open-source communities, and especially their communication activities, attract more and more attention. Taking the CodePlex C# community as an example, this paper constructs complex network models of 12 periods of the community's communication structures based on real data; it then discusses the basic concepts of quantum mapping of complex networks, and points out that the purpose of the mapping is to study the structures of complex networks following the way quantum mechanics studies the structures of large molecules; finally, following this idea, it analyzes and compares the fractal features of the spectra in different quantum mappings of the networks, and concludes that there are multiple self-similarity and criticality in the communication structures of the community. In addition, this paper discusses the insights and application conditions of different quantum mappings in revealing the characteristics of the structures. The proposed quantum mapping method can also be applied to the structural studies of other large-scale organizations. PMID:26047331

  12. 78 FR 45565 - Notice Pursuant to the National Cooperative Research and Production Act of 1993 -- tranSMART...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ... activities are to enable effective sharing, integration, standardization, and analysis of heterogeneous data from collaborative translational research by mobilizing the tranSMART open- source and open-data...: (a) Establish and sustain tranSMART as the preferred data sharing and analytics platform for...

  13. OnEarth: An Open Source Solution for Efficiently Serving High-Resolution Mapped Image Products

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Plesea, L.; Hall, J. R.; Roberts, J. T.; Cechini, M. F.; Schmaltz, J. E.; Alarcon, C.; Huang, T.; McGann, J. M.; Chang, G.; Boller, R. A.; Ilavajhala, S.; Murphy, K. J.; Bingham, A. W.

    2013-12-01

    This presentation introduces OnEarth, a server side software package originally developed at the Jet Propulsion Laboratory (JPL), that facilitates network-based, minimum-latency geolocated image access independent of image size or spatial resolution. The key component in this package is the Meta Raster Format (MRF), a specialized raster file extension to the Geospatial Data Abstraction Library (GDAL) consisting of an internal indexed pyramid of image tiles. Imagery to be served is converted to the MRF format and made accessible online via an expandable set of server modules handling requests in several common protocols, including the Open Geospatial Consortium (OGC) compliant Web Map Tile Service (WMTS) as well as Tiled WMS and Keyhole Markup Language (KML). OnEarth has recently transitioned to open source status and is maintained and actively developed as part of GIBS (Global Imagery Browse Services), a collaborative project between JPL and Goddard Space Flight Center (GSFC). The primary function of GIBS is to enhance and streamline the data discovery process and to support near real-time (NRT) applications via the expeditious ingestion and serving of full-resolution imagery representing science products from across the NASA Earth Science spectrum. Open source software solutions are leveraged where possible in order to utilize existing available technologies, reduce development time, and enlist wider community participation. We will discuss some of the factors and decision points in transitioning OnEarth to a suitable open source paradigm, including repository and licensing agreement decision points, institutional hurdles, and perceived benefits. We will also provide examples illustrating how OnEarth is integrated within GIBS and other applications.
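    Tiled protocols like WMTS address imagery by zoom level and tile row/column, which is what lets a server answer any request by reading one indexed tile instead of the whole image. The sketch below shows a generic global-geodetic tile addressing scheme of that kind; it is an illustration of the idea, not the MRF on-disk index format itself.

```python
def tile_index(lon, lat, zoom):
    """Map lon/lat (degrees) to (col, row) in a global geodetic tile
    pyramid: at zoom z each tile spans 180/2**z degrees, giving
    2**(z+1) columns and 2**z rows, with row 0 at the north edge.
    Generic WMTS-style addressing, not MRF's internal index."""
    span = 180.0 / (2 ** zoom)
    col = int((lon + 180.0) // span)
    row = int((90.0 - lat) // span)
    return col, row

print(tile_index(0.0, 0.0, 0))        # -> (1, 0): eastern hemisphere tile
print(tile_index(-118.17, 34.2, 2))   # -> (1, 1): tile covering Pasadena
```

    A tiled server resolves each incoming WMTS request to such a (zoom, col, row) triple, looks the tile up in the pyramid index, and streams the pre-generated tile back, which is why latency stays flat regardless of the full image's size.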

  14. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  15. Improving Software Sustainability: Lessons Learned from Profiles in Science.

    PubMed

    Gallagher, Marie E

    2013-01-01

    The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. 
Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment. Sometimes these changes happen on short notice, so we continually monitor our library's software for signs of endangerment. We have attempted to replace proprietary software with suitable in-house or open source software. When the replacement involves a standalone piece of software with a nearly equivalent version, such as replacing a commercial HTTP server with an open source HTTP server, the replacement is straightforward. Recently we replaced software that functioned not only as our search engine but also as the backbone of the architecture of our Web site. In this paper, we describe the lessons learned and the pros and cons of replacing this software with open source software.

  16. Biomass production of multipopulation microalgae in open air pond for biofuel potential.

    PubMed

    Selvakumar, P; Umadevi, K

    2016-04-01

    Biodiesel gains attention as it is made from renewable resources and has considerable environmental benefits. The present investigation has focused on large scale cultivation of multipopulation microalgae in open air pond using natural sea water without any additional nutritive supplements for low cost biomass production as a possible source of biofuel in large scale. Open air algal pond attained average chlorophyll concentration of 11.01 µg/L with the maximum of 43.65 µg/L as well as a higher lipid concentration of 18% (w/w) with lipid content 9.3 mg/L on the 10th day of the culture; and maximum biomass of 0.36 g/L on the 7th day of the culture. Composition analysis of fatty acid methyl ester (FAME) was performed by gas chromatography and mass spectrometry (GCMS). Multipopulation of algal biomass had 18% of total lipid content with 55% of total saturated fatty acids (SFA), 35.3% of monounsaturated fatty acids (MUFA) and 9.7% of polyunsaturated fatty acids (PUFA), revealing a potential source of biofuel production at low cost.

  17. A Platform for Innovation and Standards Evaluation: a Case Study from the OpenMRS Open-Source Radiology Information System.

    PubMed

    Gichoya, Judy W; Kohli, Marc; Ivange, Larry; Schmidt, Teri S; Purkayastha, Saptarshi

    2018-05-10

    Open-source development can provide a platform for innovation by seeking feedback from community members as well as providing tools and infrastructure to test new standards. Vendors of proprietary systems may delay adoption of new standards until there are sufficient incentives such as legal mandates or financial incentives to encourage/mandate adoption. Moreover, open-source systems in healthcare have been widely adopted in low- and middle-income countries and can be used to bridge gaps that exist in global health radiology. Since 2011, the authors, along with a community of open-source contributors, have worked on developing an open-source radiology information system (RIS) across two communities-OpenMRS and LibreHealth. The main purpose of the RIS is to implement core radiology workflows, on which others can build and test new radiology standards. This work has resulted in three major releases of the system, with current architectural changes driven by changing technology, development of new standards in health and imaging informatics, and changing user needs. At their core, both these communities are focused on building general-purpose EHR systems, but based on user contributions from the fringes, we have been able to create an innovative system that has been used by hospitals and clinics in four different countries. We provide an overview of the history of the LibreHealth RIS, the architecture of the system, overview of standards integration, describe challenges of developing an open-source product, and future directions. Our goal is to attract more participation and involvement to further develop the LibreHealth RIS into an Enterprise Imaging System that can be used in other clinical imaging including pathology and dermatology.

  18. Open source marketing: Camel cigarette brand marketing in the "Web 2.0" world.

    PubMed

    Freeman, B; Chapman, S

    2009-06-01

    The international trend towards comprehensive bans on tobacco advertising has seen the tobacco industry become increasingly innovative in its approach to marketing. Further fuelling this innovation is the rapid evolution and accessibility of web-based technology. The internet, as a relatively unregulated marketing environment, provides many opportunities for tobacco companies to pursue their promotional ambitions. In this paper, "open source marketing" is considered as a vehicle that has been appropriated by the tobacco industry, through a case study of efforts to design the packaging for the Camel Signature Blends range of cigarettes. Four sources are used to explore this case study including a marketing literature search, a web-based content search via the Google search engine, interviews with advertising trade informants and an analysis of the Camel brand website. RJ Reynolds (RJR) has proven to be particularly innovative in designing cigarette packaging. RJR engaged with thousands of consumers through their Camel brand website to design four new cigarette flavours and packages. While the Camel Signature Blends packaging designs were subsequently modified for the retail market due to problems arising with their cartoon-like imagery, important lessons arise on how the internet blurs the line between marketing and market research. Open source marketing has the potential to exploit advertising ban loopholes and stretch legal definitions in order to generate positive word of mouth about tobacco products. There are also lessons in the open source marketing movement for more effective tobacco control measures including interactive social marketing campaigns and requiring plain packaging of tobacco products.

  19. Cascading influence of inorganic nitrogen sources on DOM production, composition, lability and microbial community structure in the open ocean.

    PubMed

    Goldberg, S J; Nelson, C E; Viviani, D A; Shulse, C N; Church, M J

    2017-09-01

    Nitrogen frequently limits oceanic photosynthesis and the availability of inorganic nitrogen sources in the surface oceans is shifting with global change. We evaluated the potential for abrupt increases in inorganic N sources to induce cascading effects on dissolved organic matter (DOM) and microbial communities in the surface ocean. We collected water from 5 m depth in the central North Pacific and amended duplicate 20 liter polycarbonate carboys with nitrate or ammonium, tracking planktonic carbon fixation, DOM production, DOM composition and microbial community structure responses over 1 week relative to controls. Both nitrogen sources stimulated bulk phytoplankton, bacterial and DOM production and enriched Synechococcus and Flavobacteriaceae; ammonium enriched for oligotrophic Actinobacteria OM1 and Gammaproteobacteria KI89A clades while nitrate enriched Gammaproteobacteria SAR86, SAR92 and OM60 clades. DOM resulting from both N enrichments was more labile and stimulated growth of copiotrophic Gammaproteobacteria (Alteromonadaceae and Oceanospirillaceae) and Alphaproteobacteria (Rhodobacteraceae and Hyphomonadaceae) in weeklong dark incubations relative to controls. Our study illustrates how nitrogen pulses may have direct and cascading effects on DOM composition and microbial community dynamics in the open ocean. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.

  20. openECA Platform and Analytics Alpha Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.

  1. openECA Platform and Analytics Beta Demonstration Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.

  2. A Customizable and Expandable Electroencephalography (EEG) Data Collection System

    DTIC Science & Technology

    2016-03-01

    devices, including Emotiv Systems and Advanced Brain Monitoring, as well as open source alternatives such as OpenBCI. These products generally...

  3. Testing of Cerex Open Path Ultraviolet Differential Optical Absorption Spectroscopy Systems for Fenceline Monitoring Applications

    EPA Science Inventory

    Industrial facilities, energy production, and refining operations can be significant sources of gas-phase air pollutants. Some industrial emissions originate from fugitive sources (leaks) or process malfunctions and can be mitigated if identified. In recent amendments to the Nati...

  4. A high-resolution open biomass burning emission inventory based on statistical data and MODIS observations in mainland China

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Fan, M.; Huang, Z.; Zheng, J.; Chen, L.

    2017-12-01

    Open biomass burning, which has adverse effects on air quality and human health, is an important source of gases and particulate matter (PM) in China. Current emission estimates for open biomass burning are generally based on a single source (either statistical data or satellite-derived data) and thus carry large uncertainties due to data limitations. In this study, to quantify open biomass burning in 2015, we established a new estimation method for open biomass burning activity levels that combines bottom-up statistical data with top-down MODIS observations, considering three sub-category sources that use different activity data. For open crop residue burning, the "best estimate" of activity data was obtained by averaging the statistical data from China statistical yearbooks and satellite observations from the MODIS burned area product MCD64A1, weighted by their uncertainties. For forest and grassland fires, activity levels were represented by a combination of statistical data and the MODIS active fire product MCD14ML. Using fire radiative power (FRP), considered a better indicator of active fire level, as the spatial allocation surrogate, coarse gridded emissions were reallocated onto 3 km × 3 km grids to produce a high-resolution emission inventory. Our results showed that emissions of CO, NOx, SO2, NH3, VOCs, PM2.5, PM10, BC and OC in mainland China were 6607, 427, 84, 79, 1262, 1198, 1222, 159 and 686 Gg/yr, respectively. Among all provinces of China, Henan, Shandong and Heilongjiang were the top three contributors to the total emissions. The resulting high-resolution open biomass burning emission inventory could support air quality modeling and policy-making for pollution control.
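    The uncertainty-weighted averaging of the two activity estimates described above can be sketched as standard inverse-variance weighting. The numbers and one-sigma uncertainties below are hypothetical illustrations, not values from the study:

```python
def combine_estimates(x1, u1, x2, u2):
    """Inverse-variance weighted mean of two estimates of the same
    quantity, each with a one-sigma uncertainty u."""
    w1, w2 = 1.0 / u1 ** 2, 1.0 / u2 ** 2
    best = (w1 * x1 + w2 * x2) / (w1 + w2)
    u_best = (1.0 / (w1 + w2)) ** 0.5  # smaller than either input uncertainty
    return best, u_best

# Hypothetical crop-residue activity (Mt dry matter burned):
# statistical-yearbook estimate vs. MODIS MCD64A1-derived estimate.
best, u_best = combine_estimates(120.0, 30.0, 90.0, 15.0)
# The combined value lands closer to the better-constrained MODIS estimate.
```

    The same weighting extends to more than two sources by summing additional weights.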

  5. Re-utilization of Industrial CO2 for Algae Production Using a Phase Change Material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, Brian

    This is the final report of a 36-month Phase II cooperative agreement. Under this project, Touchstone Research Laboratory (Touchstone) investigated the merits of incorporating a Phase Change Material (PCM) into an open-pond algae production system that can capture and re-use the CO2 from a coal-fired flue gas source located in Wooster, OH. The primary objective of the project was to design, construct, and operate a series of open algae ponds that accept a slipstream of flue gas from a coal-fired source and convert a significant portion of the CO2 to liquid biofuels, electricity, and specialty products, while demonstrating the merits of the PCM technology. Construction of the pilot facility and shakedown of the facility in Wooster, OH, was completed during the first two years, and the focus of the last year was on operations and the cultivation of algae. During this Phase II effort, a large-scale algae concentration unit from OpenAlgae was installed and utilized to continuously harvest algae from indoor raceways. An Algae Lysing Unit and Oil Recovery Unit were also received and installed. Initial parameters for lysing Nannochloropsis were tested. Conditions were established that showed the lysing operation was effective at killing the algae cells. Continuous harvesting activities yielded over 200 kg algae dry weight for Ponds 1, 2 and 4. Studies were conducted to determine the effect of anaerobic digestion effluent as a nutrient source and the resulting lipid productivity of the algae. Lipid content and total fatty acids were unaffected by culture system and nutrient source, indicating that open raceway ponds fed diluted anaerobic digestion effluent can obtain similar lipid productivities to open raceway ponds using commercial nutrients. Data were also collected with respect to the performance of the PCM material on the pilot-scale raceway ponds. Parameters such as evaporative water loss, temperature differences, and growth/productivity were tracked.
The pond with the PCM material was consistently 2 to 5°C warmer than the control pond. This difference did not seem to increase significantly over time. During phase transitions for the PCM, the magnitude of the difference between the daily minimum and maximum temperatures decreased, resulting in smaller daily temperature fluctuations. A thin layer of PCM material reduced overall water loss by 74% and consistently provided algae densities that were 80% greater than those of the control pond.

  6. Neural ensemble communities: Open-source approaches to hardware for large-scale electrophysiology

    PubMed Central

    Siegle, Joshua H.; Hale, Gregory J.; Newman, Jonathan P.; Voigts, Jakob

    2014-01-01

    One often-overlooked factor when selecting a platform for large-scale electrophysiology is whether or not a particular data acquisition system is “open” or “closed”: that is, whether or not the system’s schematics and source code are available to end users. Open systems have a reputation for being difficult to acquire, poorly documented, and hard to maintain. With the arrival of more powerful and compact integrated circuits, rapid prototyping services, and web-based tools for collaborative development, these stereotypes must be reconsidered. We discuss some of the reasons why multichannel extracellular electrophysiology could benefit from open-source approaches and describe examples of successful community-driven tool development within this field. In order to promote the adoption of open-source hardware and to reduce the need for redundant development efforts, we advocate a move toward standardized interfaces that connect each element of the data processing pipeline. This will give researchers the flexibility to modify their tools when necessary, while allowing them to continue to benefit from the high-quality products and expertise provided by commercial vendors. PMID:25528614

  7. The open sea as the main source of methylmercury in the water column of the Gulf of Lions (Northwestern Mediterranean margin)

    NASA Astrophysics Data System (ADS)

    Cossa, Daniel; Durrieu de Madron, Xavier; Schäfer, Jörg; Lanceleur, Laurent; Guédron, Stéphane; Buscail, Roselyne; Thomas, Bastien; Castelle, Sabine; Naudin, Jean-Jacques

    2017-02-01

    Despite the ecological and economic importance of coastal areas, the fluxes of the neurotoxic, bioaccumulative monomethylmercury (MMHg) within the ocean margins and its exchanges with the open sea remain unassessed. The aim of this paper is to address the questions of the abundance, distribution, production and exchanges of methylated mercury species (MeHgT), including MMHg and dimethylmercury (DMHg), in the waters, atmosphere and sediments of the Northwestern Mediterranean margin including the Rhône River delta, the continental shelf and its slope (Gulf of Lions) and the adjacent open sea (North Gyre). Concentrations of MeHgT ranged from <0.02 to 0.48 pmol L-1 with highest values associated with the oxygen-deficient zone of the open sea. The methylated mercury to total mercury proportion (MeHgT/HgT) increased from 2% to 4% in the Rhône River to up to 30% (averaging 18%) in the North Gyre waters, whereas, within the shelf waters, MeHgT/HgT proportions were the lowest (1-3%). We calculate that the open sea is the major source of MeHgT for the shelf waters, with an annual flux estimated at 0.68 ± 0.12 kmol a-1 (i.e., equivalent to 12% of the HgT flux). This MeHgT influx is more than 80 times the direct atmospheric deposition or the in situ net production, more than 40 times the estimated "maximum potential" annual efflux from shelf sediment, and more than 7 times that of the continental sources. In the open sea, ratios of MMHg/DMHg in waters were always <1 and minimum in the oxygen deficient zones of the water column, where MeHg concentrations are maximum. This observation supports the idea that MMHg could be a degradation product of DMHg produced from inorganic divalent Hg.

  8. Leveraging Open Standards and Technologies to Enhance Community Access to Earth Science Lidar Data

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.

    2011-12-01

    Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar data present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. Because lidar is a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), the open source libLAS and LASlib libraries provide OpenTopography's data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL).
The OpenTopography team has also developed its own algorithm for high-performance gridding of lidar point cloud data, Points2Grid, and has released the code as an open source project. An emerging conversation in which the lidar community and OpenTopography are actively engaged is the need for open, community-supported standards and metadata for both full waveform and terrestrial (waveform and discrete return) lidar data. Further, given the immature nature of many lidar data archives and limited online access to public domain data, there is an opportunity to develop interoperable data catalogs based on an open standard such as the OGC CSW specification to facilitate discovery of and access to Earth science oriented lidar data.
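    The local-binning idea behind gridding a point cloud into a raster can be sketched in a few lines. This is a toy illustration of the general technique, not the actual Points2Grid algorithm, which also supports search radii, multiple per-cell statistics, and nodata handling:

```python
from collections import defaultdict

def grid_mean_z(points, cell_size, origin=(0.0, 0.0)):
    """Bin (x, y, z) lidar returns into square cells and report the mean
    elevation per occupied cell, keyed by (col, row)."""
    acc = defaultdict(lambda: [0.0, 0])  # cell -> [sum of z, point count]
    ox, oy = origin
    for x, y, z in points:
        key = (int((x - ox) // cell_size), int((y - oy) // cell_size))
        acc[key][0] += z
        acc[key][1] += 1
    return {k: s / n for k, (s, n) in acc.items()}

# Three returns: two fall in cell (0, 0), one in cell (1, 0).
pts = [(0.2, 0.3, 10.0), (0.8, 0.6, 12.0), (1.5, 0.5, 20.0)]
dem = grid_mean_z(pts, cell_size=1.0)  # {(0, 0): 11.0, (1, 0): 20.0}
```

    Production gridders replace the single pass over points with spatially indexed queries so that billions of returns stream through without holding the cloud in memory.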

  9. Open source approaches to health information systems in Kenya.

    PubMed

    Drury, Peter; Dahlman, Bruce

    2005-01-01

    This paper focuses on the experience to date of an installation of a Free Open Source Software (FOSS) product, Care2X, at a church hospital in Kenya. The FOSS movement has been maturing rapidly. In developed countries, its benefits relative to proprietary software have been extensively discussed and ways of quantifying the total costs of the development have been developed. Nevertheless, empirical data on the use, development, and impact of FOSS, particularly in the developing world, are still quite limited, although the possibilities of FOSS are becoming increasingly attractive.

  10. Open source EMR software: profiling, insights and hands-on analysis.

    PubMed

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    The use of open source software in health informatics is increasingly advocated by authors in the literature. Although there is no clear evidence of the superiority of the current open source applications in the healthcare field, the number of available open source applications online is growing and they are gaining greater prominence. This repertoire of open source options is of great value for any future planner interested in adopting an electronic medical/health record system, whether selecting an existing application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects. The objective of this study is to provide more comprehensive guidance from an implementer perspective toward the available alternatives of open source healthcare software, particularly in the field of electronic medical/health records. The design of this study is twofold. In the first part, we profile the published literature on a sample of existing and active open source software in the healthcare area. The purpose of this part is to provide a summary of the available guides and studies relative to the sampled systems, and to identify any gaps in the published literature with respect to our research questions. In the second part, we investigate those alternative systems relative to a set of metrics, by actually installing the software and reporting a hands-on experience of the installation process, usability, as well as other factors. The literature covers many aspects of open source software implementation and utilization in healthcare practice.
Roughly, those aspects could be distilled into a basic taxonomy, making the literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit the potential implementers of electronic health/medical records systems. The number of examined systems and the measures by which to compare them vary across studies, but still rewarding insights start to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field are still far behind the highly acknowledged open source products in other domains, e.g. operating systems market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  11. Web Solutions Inspire Cloud Computing Software

    NASA Technical Reports Server (NTRS)

    2013-01-01

    An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.

  12. Python Open source Waveform ExtractoR (POWER): an open source, Python package to monitor and post-process numerical relativity simulations

    NASA Astrophysics Data System (ADS)

    Johnson, Daniel; Huerta, E. A.; Haas, Roland

    2018-01-01

    Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds, and which are driven by strong gravitational interactions. Numerical relativity has played a key role to firmly establish gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.

  13. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short, medium and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate much greater efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne.
GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and nongovernmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water use in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source, geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.

  14. Hanging with the Right Crowd: Crowdsourcing as a New Business Practice for Innovation, Productivity, Knowledge Capture, and Marketing

    ERIC Educational Resources Information Center

    Erickson, Lisa B.

    2013-01-01

    In today's connected world, the reach of the Internet and collaborative social media tools have opened up new opportunities for individuals, regardless of their location, to share their knowledge, expertise, and creativity with others. These tools have also opened up opportunities for organizations to connect with new sources of innovation to…

  15. Open-Source GIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vatsavai, Raju; Burk, Thomas E; Lime, Steve

    2012-01-01

    The components making up an Open Source GIS are explained in this chapter. A map server (Sect. 30.1) can broadly be defined as a software platform for dynamically generating spatially referenced digital map products. The University of Minnesota MapServer (UMN Map Server) is one such system. Its basic features are visualization, overlay, and query. Section 30.2 names and explains many of the geospatial open source libraries, such as GDAL and OGR. The other libraries are FDO, JTS, GEOS, JCS, MetaCRS, and GPSBabel. The application examples include derived GIS-software and data format conversions. Quantum GIS, its origin, and its applications are explained in detail in Sect. 30.3. The features include a rich GUI, attribute tables, vector symbols, labeling, editing functions, projections, georeferencing, GPS support, analysis, and Web Map Server functionality. Future developments will address mobile applications, 3-D, and multithreading. The origins of PostgreSQL are outlined and PostGIS is discussed in detail in Sect. 30.4. It extends PostgreSQL by implementing the Simple Feature standard. Section 30.5 details the most important open source licenses such as the GPL, the LGPL, the MIT License, and the BSD License, as well as the role of the Creative Commons.

  16. Greenhouse gas and ammonia emissions from an open-freestall dairy in southern Idaho.

    PubMed

    Leytem, April B; Dungan, Robert S; Bjorneberg, David L; Koehn, Anita C

    2013-01-01

    Concentrated dairy operations emit trace gases such as ammonia (NH3), methane (CH4), and nitrous oxide (N2O) to the atmosphere. The implementation of air quality regulations in livestock-producing states increases the need for accurate on-farm determination of emission rates. Our objective was to determine the emission rates of NH3, CH4, and N2O from the open-freestall and wastewater pond source areas on a commercial dairy in southern Idaho using a flush system with anaerobic digestion. Gas concentrations and wind statistics were measured and used with an inverse dispersion model to calculate emission rates. Average emissions per cow per day from the open-freestall source area were 0.08 kg NH3, 0.41 kg CH4, and 0.02 kg N2O. Average emissions from the wastewater ponds (g m-2 d-1) were 6.8 NH3, 22 CH4, and 0.2 N2O. The combined emissions on a per cow per day basis from the open-freestall and wastewater pond areas averaged 0.20 kg NH3 and 0.75 kg CH4. Combined N2O emissions were not calculated due to limited available data. The wastewater ponds were the greatest source of total farm NH3 emissions (67%) in spring and summer. The emissions of CH4 were approximately equal from the two source areas in spring and summer. During the late fall and winter months, the open-freestall area constituted the greatest source of NH3 and CH4 emissions. Data from this study can be used to develop trace gas emission factors for open-freestall dairies in southern Idaho and other open-freestall production systems in similar climatic regions. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
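    The inverse-dispersion step reduces to a simple ratio once a dispersion model has predicted the concentration produced per unit emission rate. The relation below is the generic form of that calculation; the numbers are hypothetical, not values from the study:

```python
def inverse_dispersion_rate(c_downwind, c_background, c_per_q):
    """Emission rate Q inferred from a downwind concentration rise.

    c_downwind, c_background : measured concentrations (ug m^-3)
    c_per_q : model-simulated concentration per unit emission rate,
              (ug m^-3) per (g s^-1), e.g. from a backward Lagrangian
              stochastic dispersion run driven by the wind statistics
    """
    return (c_downwind - c_background) / c_per_q

# Hypothetical NH3 example: 45 ug/m3 downwind, 5 ug/m3 background,
# model predicts 0.8 ug/m3 per g/s emitted -> Q = 50 g/s.
q = inverse_dispersion_rate(45.0, 5.0, 0.8)
```

    Averaging many such intervals, filtered for suitable wind conditions, yields the per-source-area emission rates reported in studies of this kind.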

  17. Database for Rapid Dereplication of Known Natural Products Using Data from MS and Fast NMR Experiments.

    PubMed

    Zani, Carlos L; Carroll, Anthony R

    2017-06-23

    The discovery of novel and/or new bioactive natural products from biota sources is often confounded by the reisolation of known natural products. Dereplication strategies that involve the analysis of NMR and MS spectroscopic data to infer structural features present in purified natural products, in combination with database searches of these substructures, provide an efficient method to rapidly identify known natural products. Unfortunately, this strategy has been hampered by the lack of publicly available and comprehensive natural product databases and open source cheminformatics tools. A new platform, DEREP-NP, has been developed to help solve this problem. DEREP-NP uses the open source cheminformatics program DataWarrior to generate a database containing counts of 65 structural fragments present in 229 358 natural product structures derived from plants, animals, and microorganisms, published before 2013 and freely available in the nonproprietary Universal Natural Products Database (UNPD). By counting the number of times one or more of these structural features occurs in an unknown compound, as deduced from the analysis of its NMR (1H, HSQC, and/or HMBC) and/or MS data, matching structures carrying the same numeric combination of searched structural features can be retrieved from the database. That a matching structure is the same compound can then be verified through comparison with literature spectroscopic data. This methodology can be applied to both purified natural products and fractions containing a small number of individual compounds that are often generated as screening libraries. The utility of DEREP-NP has been verified through the analysis of spectra derived from compounds (and fractions containing two or three compounds) isolated from plant, marine invertebrate, and fungal sources. DEREP-NP is freely available at https://github.com/clzani/DEREP-NP and will help to streamline the natural product discovery process.
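    The core lookup described above, matching fragment counts deduced from NMR/MS against counts precomputed for known structures, can be sketched with an exact-match index. The feature names and compound entries below are illustrative stand-ins, not DEREP-NP's actual 65 fragments or database contents:

```python
from collections import defaultdict

# Illustrative fragment-count vectors: (aromatic CH, olefinic CH, carbonyl).
KNOWN = {
    "flavonoid-like A": (10, 1, 1),
    "terpenoid-like B": (0, 2, 1),
    "flavonoid-like C": (10, 1, 1),
}

# Index known structures by their count vector for constant-time lookup.
INDEX = defaultdict(list)
for name, counts in KNOWN.items():
    INDEX[counts].append(name)

def dereplicate(observed):
    """Known products whose fragment counts match those deduced for an
    unknown compound; an empty list hints the compound may be new."""
    return sorted(INDEX[tuple(observed)])

# Candidates are then verified against literature spectroscopic data.
hits = dereplicate((10, 1, 1))
```

    A real dereplication tool would tolerate uncertainty in the deduced counts (e.g. by allowing a count to be a lower bound), but the exact-match index captures the essential idea.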

  18. Open Access Resources for Genome Wide Association Studies (GWAS) in rice (Oryza sativa) illustrate the power of population-specific mapping

    USDA-ARS?s Scientific Manuscript database

    Increasing food production is essential to meet the demands of a growing human population, with its rising income levels and nutritional expectations. New sources of genetic variation are key to enhancing the productivity, sustainability and resilience of crop varieties and agricultural systems that...

  19. High solubility pathway for the carbon dioxide free production of iron.

    PubMed

    Licht, Stuart; Wang, Baohui

    2010-10-07

    We report a fundamental change in the understanding of iron oxide thermochemistry, opening a facile, new CO(2)-free route to iron production. The resultant process can eliminate a major global source of greenhouse gas emission, producing the staple iron in molten media at high rate and low electrolysis energy.

  20. Large-scale biodiesel production using flue gas from coal-fired power plants with Nannochloropsis microalgal biomass in open raceway ponds.

    PubMed

    Zhu, Baohua; Sun, Faqiang; Yang, Miao; Lu, Lin; Yang, Guanpin; Pan, Kehou

    2014-12-01

    The potential use of microalgal biomass as a biofuel source has raised broad interest. Highly effective and economically feasible biomass generating techniques are essential to realize such potential. Flue gas from coal-fired power plants may serve as an inexpensive carbon source for microalgal culture, and it may also facilitate improvement of the environment once the gas is fixed in biomass. In this study, three strains of the genus Nannochloropsis (4-38, KA2 and 75B1) survived this type of culture and bloomed using flue gas from coal-fired power plants in 8000-L open raceway ponds. Lower temperatures and solar irradiation reduced the biomass yield and lipid productivities of these strains. Strain 4-38 performed better than the other two as it contained higher amounts of triacylglycerols and fatty acids, which are used for biodiesel production. Further optimization of the application of flue gas to microalgal culture should be undertaken. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. OpenFOAM: Open source CFD in research and industry

    NASA Astrophysics Data System (ADS)

    Jasak, Hrvoje

    2009-12-01

    The current focus of development in industrial Computational Fluid Dynamics (CFD) is the integration of CFD into computer-aided product development, geometrical optimisation, robust design and similar. On the other hand, CFD research aims to extend the boundaries of practical engineering use in "non-traditional" areas. The requirements of computational flexibility and code integration are contradictory: a change of coding paradigm, with object orientation, library components and equation mimicking, is proposed as a way forward. This paper describes OpenFOAM, a C++ object-oriented library for Computational Continuum Mechanics (CCM) developed by the author. Efficient and flexible implementation of complex physical models is achieved by mimicking the form of partial differential equations in software, with code functionality provided in library form. The Open Source deployment and development model allows the user to achieve the desired versatility in physical modelling without sacrificing complex geometry support and execution efficiency.

  2. The Simple Concurrent Online Processing System (SCOPS) - An open-source interface for remotely sensed data processing

    NASA Astrophysics Data System (ADS)

    Warren, M. A.; Goult, S.; Clewley, D.

    2018-06-01

    Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to influence government decision making and solve a number of research and application driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process data is varied and can be highly technical and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain that allows users accessing the tool over a web interface to submit jobs and process data remotely. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free- and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network using a simple to use web interface.

  3. 3D Technology Selection for a Virtual Learning Environment by Blending ISO 9126 Standard and AHP

    ERIC Educational Resources Information Center

    Cetin, Aydin; Guler, Inan

    2011-01-01

    Web3D presents many opportunities for learners in a virtual world or virtual environment over the web. This is a great opportunity for open-distance education institutions to benefit from Web3D technologies to create courses with interactive 3D materials. There are many open source and commercial products offering 3D technologies over the web…

  4. Massive stereo-based DTM production for Mars on cloud computers

    NASA Astrophysics Data System (ADS)

    Tao, Y.; Muller, J.-P.; Sidiropoulos, P.; Xiong, Si-Ting; Putri, A. R. D.; Walter, S. H. G.; Veitch-Michaelis, J.; Yershov, V.

    2018-05-01

    Digital Terrain Model (DTM) creation is essential to improving our understanding of the formation processes of the Martian surface. Although there have been previous demonstrations of open-source or commercial planetary 3D reconstruction software, planetary scientists are still struggling with creating good quality DTMs that meet their science needs, especially when there is a requirement to produce a large number of high quality DTMs using "free" software. In this paper, we describe a new open source system to overcome many of these obstacles by demonstrating results in the context of issues found from experience with several planetary DTM pipelines. We introduce a new fully automated multi-resolution DTM processing chain for NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) and High Resolution Imaging Science Experiment (HiRISE) stereo processing, called the Co-registration Ames Stereo Pipeline (ASP) Gotcha Optimised (CASP-GO), based on the open source NASA ASP. CASP-GO employs tie-point based multi-resolution image co-registration, and Gotcha sub-pixel refinement and densification. The CASP-GO pipeline is used to produce planet-wide CTX and HiRISE DTMs that guarantee global geo-referencing compliance with respect to the High Resolution Stereo Camera (HRSC), and thence to the Mars Orbiter Laser Altimeter (MOLA), providing refined stereo matching completeness and accuracy. All software and good quality products introduced in this paper are being made open-source to the planetary science community through collaboration with NASA Ames, the United States Geological Survey (USGS) and the Jet Propulsion Laboratory (JPL), Advanced Multi-Mission Operations System (AMMOS) Planetary Data System (PDS) Pipeline Service (APPS-PDS4), as well as browseable and visualisable through the iMars web-based Geographic Information System (webGIS).

  5. From Open Geographical Data to Tangible Maps: Improving the Accessibility of Maps for Visually Impaired People

    NASA Astrophysics Data System (ADS)

    Ducasse, J.; Macé, M.; Jouffrais, C.

    2015-08-01

    Visual maps must be transcribed into (interactive) raised-line maps to be accessible for visually impaired people. However, these tactile maps suffer from several shortcomings: they are long and expensive to produce, they cannot display a large amount of information, and they are not dynamically modifiable. A number of methods have been developed to automate the production of raised-line maps, but there is not yet any tactile map editor on the market. Tangible interactions proved to be an efficient way to help a visually impaired user manipulate spatial representations. Contrary to raised-line maps, tangible maps can be autonomously constructed and edited. In this paper, we present the scenarios and the main expected contributions of the AccessiMap project, which is based on the availability of many sources of open spatial data: 1/ facilitating the production of interactive tactile maps with the development of an open-source web-based editor; 2/ investigating the use of tangible interfaces for the autonomous construction and exploration of a map by a visually impaired user.

  6. The hobbyist phenomenon in physical security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michaud, E. C.

    Pro-Ams (professional amateurs) are groups of people who work on a problem as amateurs or unpaid persons in a given field at professional levels of competence. Astronomy is a good example of Pro-Am activity. At Galaxy Zoo, Pro-Ams evaluate data generated by professional observatories and are able to evaluate the millions of galaxies that have been observed but not classified, and report their findings at professional levels for fun. To allow the classification of these galaxies at scale, the website has been engineered so that the public can view and classify galaxies even if they are not professional astronomers. In this endeavor, it has been found that amateurs can easily outperform automated vision systems. Today in the world of physical security, Pro-Ams are playing an ever-increasing role. Traditionally, locksmiths, corporations, and government organizations have been largely responsible for developing standards, uncovering vulnerabilities, and devising best security practices. Increasingly, however, nonprofit sporting organizations and clubs are doing this. They can be found all over the world, from Europe to the US and now South East Asia. Examples include TOOOL (The Open Organization of Lockpickers), the Longhorn Lockpicking Club, and Sportsfreunde der Sperrtechnik - Deutschland e.V., though there are many others. Members of these groups have been getting together weekly to discuss many elements of security, with some groups specializing in specific areas of security. When members are asked why they participate in these hobbyist groups, they usually reply (with gusto) that they do it for fun, and that they view defeating locks and other security devices as an interesting and entertaining puzzle.
A lot of what happens at these clubs would not be possible if it weren't for 'Super Abundance', the ability to easily acquire (at little or no cost) the products, security tools, technologies, and intellectual resources traditionally limited to corporations, government organizations, or wealthy individuals. With this new access comes new discoveries. For example, hobbyist sport lockpicking groups discovered - and publicized - a number of new vulnerabilities between 2004 and 2009 that resulted in the majority of high-security lock manufacturers having to make changes and improvements to their products. A decade ago, amateur physical security discoveries were rare, at least those discussed publicly. In the interim, Internet sites such as lockpicking.org, lockpicking101.com and others have provided an online meeting place for people to trade tips, find friends with similar interests, and develop tools. The open, public discussion of software vulnerabilities, in contrast, has been going on for a long time. These two industries, physical security and software, have very different upgrade mechanisms. With software, a patch can typically be deployed quickly to fix a serious vulnerability, whereas a hardware fix for a physical security device or system can take upwards of months to implement in the field, especially if (as is often the case) hardware integrators are involved. Even when responding to publicly announced security vulnerabilities, manufacturers of physical security devices such as locks, intrusion detectors, or access control devices rarely view hobbyists as a positive resource. This is most unfortunate. In the field of software, it is common to speak of Open Source versus Closed Source. An Open Source software company may choose to distribute their software with a particular license, and give it away openly, with full details and all the lines of source code made available. Linux is a very popular example of this. 
A Closed Source company, in contrast, chooses not to reveal its source code and will license its software products in a restrictive manner. Slowly, the idea of Open Source is now coming to the world of physical security. In the case of locks, it provides an alternative to the traditional Closed Source world of locksmiths. Now locks are physical objects, and can therefore be disassembled. As such, they have always been Open Source in a limited sense. Secrecy, in fact, is very difficult to maintain for a lock that is widely distributed. Having direct access to the lock design provides the hobbyist with a very open environment for finding security flaws, even if the lock manufacturer attempts to follow a Closed Source model. It is clear that the field of physical security is going the digital route, with companies such as Medeco, Mul-T-Lock, and Abloy manufacturing electromechanical locks. Various companies have already begun to add microcontrollers, cryptographic chip sets, solid-state sensors, and a number of other high-tech improvements to their product lineup in an effort to thwart people from defeating their security products.

  7. AIR EMISSIONS FROM SCRAP TIRE COMBUSTION

    EPA Science Inventory

    The report discusses air emissions from two types of scrap tire combustion: uncontrolled and controlled. Uncontrolled sources are open tire fires, which produce many unhealthful products of incomplete combustion and release them directly into the atmosphere. Controlled combustion...

  8. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    PubMed

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit the quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
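
    The heavy/light quantitation mentioned above reduces to a per-peptide ratio of integrated transition intensities. A minimal sketch, with invented transition data and a hypothetical `heavy_light_ratio` helper (not MRMer's actual API):

    ```python
    # Sketch: heavy/light quantitation from integrated MRM transition areas.
    # Peptide sequence and intensity values are made-up illustrative data.

    def heavy_light_ratio(transitions):
        """transitions: {(peptide, label): integrated_intensity}.

        Returns {peptide: heavy/light} for peptides with both labels present.
        """
        by_peptide = {}
        for (peptide, label), area in transitions.items():
            by_peptide.setdefault(peptide, {})[label] = area
        return {
            p: areas["heavy"] / areas["light"]
            for p, areas in by_peptide.items()
            if "heavy" in areas and "light" in areas
        }

    data = {
        ("PEPTIDEK", "light"): 2.0e5,  # endogenous peptide, integrated area
        ("PEPTIDEK", "heavy"): 1.0e5,  # spiked isotope-labeled standard
    }
    print(heavy_light_ratio(data))  # {'PEPTIDEK': 0.5}
    ```

    Because the heavy standard is spiked at a known amount, the ratio converts directly into an absolute abundance for the endogenous (light) peptide.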

  9. Large area multiarc ion beam source "MAIS"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engelko, V.; Giese, H.; Schalk, S.

    1996-12-31

    A pulsed large area intense ion beam source is described, in which the ion emitting plasma is built up by an array of individual discharge units homogeneously distributed over the surface of a common discharge electrode. A particularly advantageous feature of the source is that only one common energy supply is necessary for plasma generation and the subsequent acceleration of the ions. This simplifies the source design and provides inherent synchronization of plasma production and ion extraction. The homogeneity of the plasma density was found to be superior to that of plasma sources using plasma expanders. Originally conceived for the production of proton beams, the source can easily be modified for the production of beams composed of carbon and metal ions or mixed ion species. Results of investigations of the source performance for the production of a proton beam are presented. The maximum beam current achieved to date is of the order of 100 A, with a particle kinetic energy of 15-30 keV and a pulse length in the range of 10 μs.

  10. The Human Exposure Model (HEM): A Tool to Support Rapid ...

    EPA Pesticide Factsheets

    The US EPA is developing an open and publicly available software program called the Human Exposure Model (HEM) to provide near-field exposure information for Life Cycle Impact Assessments (LCIAs). Historically, LCIAs have often omitted impacts from near-field sources of exposure. The use of consumer products often results in near-field exposures (exposures that occur directly from the use of a product) that are larger than environmentally mediated exposures (i.e. far-field sources)1,2. Failure to consider near-field exposures could result in biases in LCIA-based determinations of the relative sustainability of consumer products. HEM is designed to provide this information. Characterizing near-field sources of chemical exposure presents a challenge to LCIA practitioners. Unlike far-field sources, where multimedia mass balance models have been used to determine human exposure, near-field sources require product-specific models of human exposure and considerable information on product use and product composition. Such information is difficult and time-consuming to gather and curate. The HEM software will characterize the distribution of doses and product intake fractions2 across populations of product users and bystanders, allowing for differentiation by various demographic characteristics. The tool incorporates a newly developed database of the composition of more than 17,000 products, data on physical and chemical properties for more than 2,000 chemicals, and mo

  11. An informatics model for guiding assembly of telemicrobiology workstations for malaria collaborative diagnostics using commodity products and open-source software.

    PubMed

    Suhanic, West; Crandall, Ian; Pennefather, Peter

    2009-07-17

    Deficits in clinical microbiology infrastructure exacerbate global infectious disease burdens. This paper examines how commodity computation, communication, and measurement products combined with open-source analysis and communication applications can be incorporated into laboratory medicine microbiology protocols. Those commodity components are all now sourceable globally. An informatics model is presented for guiding the use of low-cost commodity components and free software in the assembly of clinically useful and usable telemicrobiology workstations. The model incorporates two general principles: 1) collaborative diagnostics, where free and open communication and networking applications are used to link distributed collaborators for reciprocal assistance in organizing and interpreting digital diagnostic data; and 2) commodity engineering, which leverages globally available consumer electronics and open-source informatics applications, to build generic open systems that measure needed information in ways substantially equivalent to more complex proprietary systems. Routine microscopic examination of Giemsa and fluorescently stained blood smears for diagnosing malaria is used as an example to validate the model. The model is used as a constraint-based guide for the design, assembly, and testing of a functioning, open, and commoditized telemicroscopy system that supports distributed acquisition, exploration, analysis, interpretation, and reporting of digital microscopy images of stained malarial blood smears while also supporting remote diagnostic tracking, quality assessment and diagnostic process development. The open telemicroscopy workstation design and use-process described here can address clinical microbiology infrastructure deficits in an economically sound and sustainable manner. It can boost capacity to deal with comprehensive measurement of disease and care outcomes in individuals and groups in a distributed and collaborative fashion. 
The workstation enables local control over the creation and use of diagnostic data, while allowing for remote collaborative support of diagnostic data interpretation and tracking. It can enable global pooling of malaria disease information and the development of open, participatory, and adaptable laboratory medicine practices. The informatic model highlights how the larger issue of access to generic commoditized measurement, information processing, and communication technology in both high- and low-income countries can enable diagnostic services that are much less expensive, but substantially equivalent to those currently in use in high-income countries.

  12. VOLTTRON: An open-source platform for distributed sensing and control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.

    VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves in four primary roles: •A reference platform for researchers to quickly develop control applications for transactive energy. •A reference platform with flexible data store support for energy analytics applications, either in academia or in commercial enterprise. •A platform from which commercial enterprises can develop products without license issues and easily integrate into their product lines. •An accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy's Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.

  13. Biotechnological production of gluconic acid: future implications.

    PubMed

    Singh, Om V; Kumar, Raj

    2007-06-01

    Gluconic acid (GA) is a multifunctional carbonic acid regarded as a bulk chemical in the food, feed, beverage, textile, pharmaceutical, and construction industries. The favored production process is submerged fermentation by Aspergillus niger utilizing glucose as the major carbohydrate source, which is accompanied by a product yield of 98%. However, use of GA and its derivatives is currently restricted because of high prices: about US$ 1.20-8.50/kg. Advancements in biotechnology such as screening of microorganisms, immobilization techniques, and modifications of the fermentation process for continuous fermentation, including genetic engineering programmes, could lead to cost-effective production of GA. Among alternative carbohydrate sources, sugarcane molasses, grape must (which showed the highest GA yield, 95.8%), and banana must may assist in reducing the overall cost of GA production. These methodologies would open new markets and increase applications of GA.

  14. Bioactive natural products from novel microbial sources.

    PubMed

    Challinor, Victoria L; Bode, Helge B

    2015-09-01

    Despite the importance of microbial natural products for human health, only a few bacterial genera have been mined for the new natural products needed to overcome the urgent threat of antibiotic resistance. This is surprising, given that genome sequencing projects have revealed that the capability to produce natural products is not a rare feature among bacteria. Even the bacteria occurring in the human microbiome produce potent antibiotics, and thus potentially are an untapped resource for novel compounds, potentially with new activities. This review highlights examples of bacteria that should be considered new sources of natural products, including anaerobes, pathogens, and symbionts of humans, insects, and nematodes. Exploitation of these producer strains, combined with advances in modern natural product research methodology, has the potential to open the way for a new golden age of microbial therapeutics. © 2015 New York Academy of Sciences.

  15. An Interface Transformation Strategy for AF-IPPS

    DTIC Science & Technology

    2012-12-01

    Representational State Transfer (REST) and Java Enterprise Edition (Java EE) to implement a reusable “translation service.” For SOAP and REST protocols, XML and...of best-of-breed open source software. The product baseline is summarized in the following table: Product Function Description Java Language...Compiler & Runtime JBoss Application Server Applications, Messaging, Translation Java EE Application Server Ruby on Rails Applications Ruby Web

  16. Design and construction of a first-generation high-throughput integrated molecular biology platform for production of optimized synthetic genes and improved industrial strains

    USDA-ARS?s Scientific Manuscript database

    The molecular biological techniques for plasmid-based assembly and cloning of synthetic assembled gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. These techniques involve the production of full-length cDNA libraries as a source of plasmid-bas...

  17. Implementation of a departmental picture archiving and communication system: a productivity and cost analysis.

    PubMed

    Macyszyn, Luke; Lega, Brad; Bohman, Leif-Erik; Latefi, Ahmad; Smith, Michelle J; Malhotra, Neil R; Welch, William; Grady, Sean M

    2013-09-01

    Digital radiology enhances productivity and results in long-term cost savings. However, the viewing, storage, and sharing of outside imaging studies on compact discs at ambulatory offices and hospitals pose a number of unique challenges to a surgeon's efficiency and clinical workflow. To improve the efficiency and clinical workflow of an academic neurosurgical practice when evaluating patients with outside radiological studies. Open-source software and commercial hardware were used to design and implement a departmental picture archiving and communications system (PACS). The implementation of a departmental PACS system significantly improved productivity and enhanced collaboration in a variety of clinical settings. Using published data on the rate of information technology problems associated with outside studies on compact discs, this system produced a cost savings ranging from $6,250 to $33,600 and from $43,200 to $72,000 for 2 cohorts, urgent transfer and spine clinic patients, respectively, therefore justifying the costs of the system in less than a year. The implementation of a departmental PACS system using open-source software is straightforward and cost-effective and results in significant gains in surgeon productivity when evaluating patients with outside imaging studies.

  18. Compliance of secondary production and eco-exergy as indicators of benthic macroinvertebrates assemblages' response to canopy cover conditions in Neotropical headwater streams.

    PubMed

    Linares, Marden Seabra; Callisto, Marcos; Marques, João Carlos

    2018-02-01

    Riparian vegetation cover influences benthic assemblages structure and functioning in headwater streams, as it regulates light availability and autochthonous primary production in these ecosystems. Secondary production, diversity, and exergy-based indicators were applied to capture how riparian cover influences the structure and functioning of benthic macroinvertebrate assemblages in tropical headwater streams. Four hypotheses were tested: (1) open canopy will determine the occurrence of higher diversity in benthic macroinvertebrate assemblages; (2) streams with open canopy will exhibit more complex benthic macroinvertebrate communities (in terms of information embedded in the organisms' biomass); (3) in streams with open canopy benthic macroinvertebrate assemblages will be more efficient in using the available resources to build structure, which will be reflected by higher eco-exergy values; (4) benthic assemblages in streams with open canopy will exhibit more secondary productivity. We selected eight non-impacted headwater streams, four shaded and four with open canopy, all located in the Neotropical savannah (Cerrado) of southeastern Brazil. Open canopy streams consistently exhibited significantly higher eco-exergy and instant secondary production values, indicating that these streams may support more complex and productive benthic macroinvertebrate assemblages. Nevertheless, diversity indices and specific eco-exergy were not significantly different in shaded and open canopy streams. Since all the studied streams were selected for being considered as non-impacted, this suggests that the potential represented by more available food resources was not used to build a more complex dissipative structure.
These results illustrate the role and importance of the canopy cover characteristics on the structure and functioning of benthic macroinvertebrate assemblages in tropical headwater streams, while autochthonous production appears to play a crucial role as food source for benthic macroinvertebrates. This study also highlights the possible application of thermodynamic based indicators as tools to guide environmental managers in developing and implementing policies in the neotropical savannah. Copyright © 2017 Elsevier B.V. All rights reserved.
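
    Eco-exergy, as used in this record, is conventionally computed as a β-weighted sum of biomasses, Ex = 18.7 kJ/g × Σᵢ βᵢcᵢ, with specific eco-exergy being that total divided by total biomass. A minimal sketch; the taxa and β weighting factors below are illustrative placeholders, not the values used in this study:

    ```python
    # Sketch: eco-exergy of a benthic community as a beta-weighted biomass sum.
    # Taxa and beta weighting factors are illustrative examples only.

    DETRITUS_EXERGY_KJ_PER_G = 18.7  # exergy content of detritus-equivalent biomass

    def eco_exergy(biomass_g_per_m2, beta):
        """Ex = 18.7 * sum(beta_i * c_i), in kJ/m^2."""
        return DETRITUS_EXERGY_KJ_PER_G * sum(
            beta[taxon] * c for taxon, c in biomass_g_per_m2.items()
        )

    def specific_eco_exergy(biomass_g_per_m2, beta):
        """Eco-exergy per unit of total biomass (kJ/g): structural complexity."""
        return eco_exergy(biomass_g_per_m2, beta) / sum(biomass_g_per_m2.values())

    community = {"detritus": 10.0, "insect larvae": 2.0}   # g/m^2, invented
    beta = {"detritus": 1.0, "insect larvae": 167.0}       # illustrative weights

    print(eco_exergy(community, beta))          # kJ/m^2
    print(specific_eco_exergy(community, beta)) # kJ/g
    ```

    The distinction the abstract draws falls out of these two quantities: open-canopy streams had higher total eco-exergy (more weighted biomass) but not higher specific eco-exergy (no more complexity per gram).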

  19. Methanol shutdowns cause anxiety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, N.

    1996-10-23

    European methanol players face an anxious few weeks as unscheduled outages combine with planned turnarounds to create an increasingly tight market. Global markets are also described as tightening, with production problems widely reported in North America. Several European producers were in the middle of shutdown periods when problems at Condea's 400,000-m.t./year unit at Wesseling, Germany reportedly caused production to run at only 50% of capacity. In addition, the methanol plant at the Leuna refinery is said to be operating at only 60% of capacity, and one producer has had to extend a turnaround period. River levels in Germany are also low, putting pressure on shipments from Rotterdam. "This is a very difficult situation and we're living hand to mouth," says one producer. Producer sources report bids from consumers up to DM280/m.t. T2 fob Rotterdam, but they are unable to obtain extra product. Derivatives makers may also face problems: one methyl tert-butyl ether producer predicts prices "may hit the roof" once feedstock sourcing problems hit home.

  20. Comparison of petroleum generation kinetics by isothermal hydrous and nonisothermal open-system pyrolysis

    USGS Publications Warehouse

    Lewan, M.D.; Ruble, T.E.

    2002-01-01

    This study compares kinetic parameters determined by open-system pyrolysis and hydrous pyrolysis using aliquots of source rocks containing different kerogen types. Kinetic parameters derived from these two pyrolysis methods not only differ in the conditions employed and products generated, but also in the derivation of the kinetic parameters (i.e., isothermal linear regression and non-isothermal nonlinear regression). Results of this comparative study show that there is no correlation between kinetic parameters derived from hydrous pyrolysis and open-system pyrolysis. Hydrous-pyrolysis kinetic parameters determine narrow oil windows that occur over a wide range of temperatures and depths depending in part on the organic-sulfur content of the original kerogen. Conversely, open-system kinetic parameters determine broad oil windows that show no significant differences with kerogen types or their organic-sulfur contents. Comparisons of the kinetic parameters in a hypothetical thermal-burial history (2.5 °C/my) show open-system kinetic parameters significantly underestimate the extent and timing of oil generation for Type-IIS kerogen and significantly overestimate the extent and timing of petroleum formation for Type-I kerogen compared to hydrous pyrolysis kinetic parameters. These hypothetical differences determined by the kinetic parameters are supported by natural thermal-burial histories for the Naokelekan source rock (Type-IIS kerogen) in the Zagros basin of Iraq and for the Green River Formation (Type-I kerogen) in the Uinta basin of Utah. Differences in extent and timing of oil generation determined by open-system pyrolysis and hydrous pyrolysis can be attributed to the former not adequately simulating natural oil generation conditions, products, and mechanisms.
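
    Both derivation styles ultimately yield first-order Arrhenius parameters, k = A·exp(−Ea/RT), from which the extent of kerogen conversion follows. The sketch below shows the isothermal case; the pre-exponential factor, activation energy, and conditions are illustrative placeholders, not the kinetic parameters fitted in this study.

    ```python
    # Sketch: first-order isothermal conversion from Arrhenius parameters.
    # A, Ea, T, and t values below are illustrative, not fitted values.
    import math

    R = 1.987e-3  # gas constant, kcal/(mol*K)

    def isothermal_conversion(A_per_s, Ea_kcal_mol, T_kelvin, t_seconds):
        """Fraction of reactant converted after time t at constant T.

        First-order kinetics: x = 1 - exp(-k*t), with k = A*exp(-Ea/(R*T)).
        """
        k = A_per_s * math.exp(-Ea_kcal_mol / (R * T_kelvin))
        return 1.0 - math.exp(-k * t_seconds)

    # Hydrous-pyrolysis-like conditions: 350 C (623.15 K) held for 72 h,
    # with placeholder A = 1e14 /s and Ea = 50 kcal/mol.
    x_hot = isothermal_conversion(1e14, 50.0, 623.15, 72 * 3600)

    # The same kerogen at 150 C barely reacts on the same timescale,
    # illustrating why fitted Ea/A pairs control the predicted oil window.
    x_cool = isothermal_conversion(1e14, 50.0, 423.15, 72 * 3600)
    ```

    The study's point is that the (A, Ea) pairs regressed from the two pyrolysis methods differ enough that conversion-versus-temperature curves like these place the oil window at substantially different depths and times.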

  1. EMISIONES AL AIRE DE LA COMBUSTION DE LLANTAS USADAS (SPANISH VERSION)

    EPA Science Inventory

    The report discusses air emissions from two types of scrap tire combustion: uncontrolled and controlled. Uncontrolled sources are open tire fires, which produce many unhealthful products of incomplete combustion and release them directly into the atmosphere. Controlled combustion...

  2. Mobile open-source plant-canopy monitoring system

    USDA-ARS?s Scientific Manuscript database

    Many agricultural applications, including improved crop production, precision agriculture, and phenotyping, rely on detailed field and crop information to detect and react to spatial variabilities. Mobile farm vehicles, such as tractors and sprayers, have the potential to operate as mobile sensing ...

  3. Quality Test of Flexible Flat Cable (FFC) With Short Open Test Using Law Ohm Approach through Embedded Fuzzy Logic Based On Open Source Arduino Data Logger

    NASA Astrophysics Data System (ADS)

    Rohmanu, Ajar; Everhard, Yan

    2017-04-01

    Technological development, especially in the field of electronics, is very fast. One development in electronics hardware is the Flexible Flat Cable (FFC), which serves as a connecting medium between the main board and other hardware parts. Production of FFCs includes a process of testing and measuring FFC quality. Currently, this testing and measurement is still done manually, with an operator observing a Light Emitting Diode (LED), which causes many problems. This study builds a computational FFC quality test using an open-source embedded system. The method is a Short Open Test measurement using a 4-wire (Kelvin) Ohm's-law approach, with fuzzy logic as the decision maker for the measurement results, based on an open-source Arduino data logger. The system uses an INA219 current sensor to read voltage values, from which the resistance of the FFC is obtained. To validate the system, black-box testing was performed together with accuracy and precision testing using the standard-deviation method. In tests on three sample models, the standard deviations obtained were 1.921 for the first model, 4.567 for the second, and 6.300 for the third, while the standard error of the mean (SEM) was 0.304 for the first model, 0.736 for the second, and 0.996 for the third. Testing also yielded average tolerances on the measured resistance values of -3.50% for the first model, 4.45% for the second, and 5.18% for the third, with standardized resistance measurement raising productivity to 118.33%. These results are expected to improve quality and productivity in the FFC testing process.
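
    The core measurement is plain Ohm's law on a Kelvin (4-wire) connection, with the spread of repeated readings summarized by the standard deviation and standard error of the mean. A minimal sketch of that arithmetic; the readings below are made-up illustrative values, not data from the paper's INA219 setup.

    ```python
    import math

    def four_wire_resistance(v_sense, i_force):
        """Kelvin (4-wire) measurement: resistance from the voltage sensed
        across the conductor alone, excluding test-lead resistance (Ohm's law)."""
        return v_sense / i_force

    def std_dev(samples):
        """Sample standard deviation (n - 1 denominator)."""
        mean = sum(samples) / len(samples)
        var = sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)
        return math.sqrt(var)

    def std_error_of_mean(samples):
        """SEM = SD / sqrt(n)."""
        return std_dev(samples) / math.sqrt(len(samples))

    # Hypothetical repeated readings (sensed volts, forced amps) for one
    # FFC conductor -- illustrative only:
    readings = [(0.0101, 0.100), (0.0103, 0.100), (0.0099, 0.100), (0.0102, 0.100)]
    resistances = [four_wire_resistance(v, i) for v, i in readings]
    print(f"R mean = {sum(resistances)/len(resistances):.5f} ohm, "
          f"SD = {std_dev(resistances):.5f}, SEM = {std_error_of_mean(resistances):.5f}")
    ```

    The fuzzy-logic stage in the paper then classifies such resistance statistics into pass/fail decisions; that decision layer is not reproduced here.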

  4. Perfluorinated Compounds in Greenhouse and Open Agricultural Producing Areas of Three Provinces of China: Levels, Sources and Risk Assessment

    PubMed Central

    Zhang, Yanwei; Tan, Dongfei; Geng, Yue; Wang, Lu; Peng, Yi; He, Zeying; Xu, Yaping; Liu, Xiaowei

    2016-01-01

    Field investigations of perfluoroalkyl acid (PFAA) levels in various environmental matrixes have been reported, but PFAA level data are still lacking for agricultural environments, especially agricultural producing areas. We therefore collected soil, irrigation water and agricultural product samples from agricultural producing areas in the provinces of Liaoning, Shandong and Sichuan in China. Background contamination from the instruments was removed and C4–C18 PFAAs were detected by LC-MS/MS. The concentrations of PFAAs in the top and deep layers of soil were compared, and the levels of PFAAs in different agricultural environments (greenhouse and open agriculture) were analyzed. We found that PFAA levels by province followed the order Shandong > Liaoning > Sichuan. PFAA levels decreased from top to deep soil and from open to greenhouse agriculture, and perfluorobutanoic acid (PFBA) was considered a marker for source analysis. Bean vegetables contributed strongly to the overall PFAA load in vegetables. A significant correlation was found between PFAA levels in irrigation water and in agricultural products. The EDI (estimated daily intake) from vegetables should be of concern in China. PMID:27973400

  5. 40 CFR Table 3 to Subpart Wwww of... - Organic HAP Emissions Limits for Existing Open Molding Sources, New Open Molding Sources Emitting...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and... CATEGORIES National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and...

  6. Outdoor cultivation of microalgae for carotenoid production: current state and perspectives.

    PubMed

    Del Campo, José A; García-González, Mercedes; Guerrero, Miguel G

    2007-04-01

    Microalgae are a major natural source for a vast array of valuable compounds, including a diversity of pigments, for which these photosynthetic microorganisms represent an almost exclusive biological resource. Yellow, orange, and red carotenoids have an industrial use in food products and cosmetics as vitamin supplements and health food products and as feed additives for poultry, livestock, fish, and crustaceans. The growing worldwide market value of carotenoids is projected to reach over US$1,000 million by the end of the decade. The nutraceutical boom has also integrated carotenoids mainly on the claim of their proven antioxidant properties. Recently established benefits in human health open new uses for some carotenoids, especially lutein, an effective agent for the prevention and treatment of a variety of degenerative diseases. Consumers' demand for natural products favors development of pigments from biological sources, thus increasing opportunities for microalgae. The biotechnology of microalgae has gained considerable progress and relevance in recent decades, with carotenoid production representing one of its most successful domains. In this paper, we review the most relevant features of microalgal biotechnology related to the production of different carotenoids outdoors, with a main focus on beta-carotene from Dunaliella, astaxanthin from Haematococcus, and lutein from chlorophycean strains. We compare the current state of the corresponding production technologies, based on either open-pond systems or closed photobioreactors. The potential of scientific and technological advances for improvements in yield and reduction in production costs for carotenoids from microalgae is also discussed.

  7. Acid Rain and the Environment: An Ethical Perspective.

    DTIC Science & Technology

    1985-01-01

    reactive products are deposited from the air in locations remote from the major source of the pollution.1 This is the opening statement in a report...22.54 million tons/year of SO2, and only slightly less NOx. Natural mechanisms are able to neutralize a portion of the annual acid production, and...possibly the entire natural production, but they do not appear to be able to neutralize the tremendous additional amounts of acid generated by human

  8. A generic open-source software framework supporting scenario simulations in bioterrorist crises.

    PubMed

    Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie

    2013-09-01

    Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led internationally to increased research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (eg, rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.

  9. Open-Source Sequence Clustering Methods Improve the State of the Art.

    PubMed

    Kopylova, Evguenia; Navas-Molina, Jose A; Mercier, Céline; Xu, Zhenjiang Zech; Mahé, Frédéric; He, Yan; Zhou, Hong-Wei; Rognes, Torbjørn; Caporaso, J Gregory; Knight, Rob

    2016-01-01

    Sequence clustering is a common early step in amplicon-based microbial community analysis, when raw sequencing reads are clustered into operational taxonomic units (OTUs) to reduce the run time of subsequent analysis steps. Here, we evaluated the performance of recently released state-of-the-art open-source clustering software products, namely, OTUCLUST, Swarm, SUMACLUST, and SortMeRNA, against current principal options (UCLUST and USEARCH) in QIIME, hierarchical clustering methods in mothur, and USEARCH's most recent clustering algorithm, UPARSE. All the latest open-source tools showed promising results, reporting up to 60% fewer spurious OTUs than UCLUST, indicating that the underlying clustering algorithm can vastly reduce the number of these derived OTUs. Furthermore, we observed that stringent quality filtering, such as is done in UPARSE, can cause a significant underestimation of species abundance and diversity, leading to incorrect biological results. Swarm, SUMACLUST, and SortMeRNA have been included in the QIIME 1.9.0 release. IMPORTANCE Massive collections of next-generation sequencing data call for fast, accurate, and easily accessible bioinformatics algorithms to perform sequence clustering. A comprehensive benchmark is presented, including open-source tools and the popular USEARCH suite. Simulated, mock, and environmental communities were used to analyze sensitivity, selectivity, species diversity (alpha and beta), and taxonomic composition. The results demonstrate that recent clustering algorithms can significantly improve accuracy and preserve estimated diversity without the application of aggressive filtering. Moreover, these tools are all open source, apply multiple levels of multithreading, and scale to the demands of modern next-generation sequencing data, which is essential for the analysis of massive multidisciplinary studies such as the Earth Microbiome Project (EMP) (J. A. Gilbert, J. K. Jansson, and R. Knight, BMC Biol 12:69, 2014, http://dx.doi.org/10.1186/s12915-014-0069-1).
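
    The tools benchmarked above mostly follow a greedy centroid-based clustering scheme: each read joins the first existing centroid that meets the identity threshold, or founds a new OTU. A toy sketch of the idea, using simple positional identity on equal-length strings in place of the pairwise alignment the real tools perform:

    ```python
    def identity(a, b):
        """Fraction of matching positions for equal-length sequences
        (a toy stand-in for real pairwise alignment)."""
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def greedy_cluster(reads, threshold=0.97):
        """Greedy centroid-based OTU picking in the spirit of UCLUST/SUMACLUST:
        process reads in order, join the first centroid meeting the identity
        threshold, otherwise open a new cluster with this read as centroid."""
        centroids, clusters = [], []
        for read in reads:
            for idx, c in enumerate(centroids):
                if identity(read, c) >= threshold:
                    clusters[idx].append(read)
                    break
            else:
                centroids.append(read)
                clusters.append([read])
        return centroids, clusters

    # Three synthetic 120-bp reads: the first two differ at 3 of 120
    # positions (97.5% identity), the third is unrelated.
    reads = ["ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGT" * 3,
             "ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGA" * 3,
             "TTTTTTTTTTGGGGGGGGGGCCCCCCCCCCAAAAAAAAAA" * 3]
    centroids, clusters = greedy_cluster(reads)
    print(len(centroids))  # the two near-identical reads share one OTU
    ```

    Because assignment depends on processing order, real tools typically sort reads (e.g., by abundance or length) before clustering, which is one of the design choices the benchmarked programs differ on.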

  10. Turbulent aerosol fluxes over the Arctic Ocean: 2. Wind-driven sources from the sea

    NASA Astrophysics Data System (ADS)

    Nilsson, E. D.; Rannik, Ü.; Swietlicki, E.; Leck, C.; Aalto, P. P.; Zhou, J.; Norman, M.

    2001-12-01

    An eddy-covariance flux system was successfully applied over open sea, leads and ice floes during the Arctic Ocean Expedition in July-August 1996. Wind-driven upward aerosol number fluxes were observed over open sea and leads in the pack ice. These particles must originate from droplets ejected into the air by the bursting of small air bubbles at the water surface. The source flux F (in 10^6 m^-2 s^-1) had a strong dependency on wind speed, log(F) = 0.20U - 1.71 and log(F) = 0.11U - 1.93, over the open sea and leads, respectively (where U is the local wind speed at about 10 m height). Over the open sea the wind-driven aerosol source flux consisted of a film drop mode centered at ˜100 nm diameter and a jet drop mode centered at ˜1 μm diameter. Over the leads in the pack ice, a jet drop mode at ˜2 μm diameter dominated. The jet drop mode consisted of sea salt, but oxalate indicated an organic contribution, and bacteria and other biogenic particles were identified by single-particle analysis. Particles with diameters less than ˜100 nm appear to have contributed to the flux, but their chemical composition is unknown. Whitecaps were probably the bubble source at open sea and on the leads at high wind speed, but a different bubble source is needed in the leads owing to their small fetch; melting of ice in the leads is probably the best candidate. The flux over the open sea was of such a magnitude that it could give a significant contribution to the cloud condensation nuclei (CCN) population. Although the flux from the leads was roughly an order of magnitude smaller and the leads cover only a small fraction of the pack ice, the local source may still be important for the CCN population in Arctic fogs. The primary marine aerosol source will increase both with increased wind speed and with decreased ice fraction and extent. The local CCN production may therefore increase and influence cloud or fog albedo and lifetime in response to greenhouse warming in the Arctic Ocean region.
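
    Taking the logarithm in the fitted relations as base 10, the reported regressions can be evaluated directly; the coefficients below are the ones quoted in the abstract.

    ```python
    def aerosol_number_flux(u10, a, b):
        """Wind-driven aerosol number flux F (10^6 m^-2 s^-1) from a
        log-linear fit log10(F) = a*U + b, with U the ~10 m wind speed (m/s)."""
        return 10 ** (a * u10 + b)

    # Regression coefficients reported in the abstract:
    open_sea = dict(a=0.20, b=-1.71)
    leads    = dict(a=0.11, b=-1.93)

    for u in (5.0, 10.0):
        f_sea = aerosol_number_flux(u, **open_sea)
        f_lead = aerosol_number_flux(u, **leads)
        print(f"U = {u:4.1f} m/s: open sea {f_sea:.3f}, leads {f_lead:.3f} (10^6 m^-2 s^-1)")
    ```

    At 10 m/s this puts the open-sea flux roughly an order of magnitude above the flux from leads, matching the abstract's comparison.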

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, C.S.

    Kenaf's story is now being told in the fields of South Texas and Southern Louisiana as new fiber processing operations respond to the public's demand for more environmentally sound sources of fiber and farmers' desperate pleas for additional production options. Despite the title, this paper focuses primarily on the "demand" pull from the marketplace that brings the new crop production/processing system together. Kenaf, an annual hibiscus crop, has been cultivated for several centuries in Asia and Africa, mostly as a substitute for jute fiber in the world's cordage industry. The crop was first seriously considered in the Americas when jute supplies from Asia were cut off by the War in the Pacific. In the 1960s the US Department of Agriculture selected kenaf as the most promising annual crop source of fiber for the pulp and paper industry. Industry took a look but it wasn't their priority and the initial USDA effort ceased in the late 1970s. However, almost at the same time some newspaper publishers, who had been following the USDA work, intervened to keep things going. Kenaf International was formed in 1981 as a system-oriented company determined to finally put things together on a commercial basis. The company focused on both ends (market and production), hoping to fill in the middle as it went forward. The primary objective at first was to introduce kenaf as an annually renewable fiber source for newsprint manufacturers. That eventually proved to be a very big bite for a small organization to chew, and Kenaf International (and its associates) soon "discovered" other aspects of kenaf's potential as it pursued its goals. This is where we join The Kenaf Story "in progress."

  12. Increasing the impact of medical image computing using community-based open-access hackathons: The NA-MIC and 3D Slicer experience.

    PubMed

    Kapur, Tina; Pieper, Steve; Fedorov, Andriy; Fillion-Robin, J-C; Halle, Michael; O'Donnell, Lauren; Lasso, Andras; Ungi, Tamas; Pinter, Csaba; Finet, Julien; Pujol, Sonia; Jagadeesan, Jayender; Tokuda, Junichi; Norton, Isaiah; Estepar, Raul San Jose; Gering, David; Aerts, Hugo J W L; Jakab, Marianna; Hata, Nobuhiko; Ibanez, Luiz; Blezek, Daniel; Miller, Jim; Aylward, Stephen; Grimson, W Eric L; Fichtinger, Gabor; Wells, William M; Lorensen, William E; Schroeder, Will; Kikinis, Ron

    2016-10-01

    The National Alliance for Medical Image Computing (NA-MIC) was launched in 2004 with the goal of investigating and developing an open source software infrastructure for the extraction of information and knowledge from medical images using computational methods. Several leading research and engineering groups participated in this effort that was funded by the US National Institutes of Health through a variety of infrastructure grants. This effort transformed 3D Slicer from an internal, Boston-based, academic research software application into a professionally maintained, robust, open source platform with an international leadership and developer and user communities. Critical improvements to the widely used underlying open source libraries and tools-VTK, ITK, CMake, CDash, DCMTK-were an additional consequence of this effort. This project has contributed to close to a thousand peer-reviewed publications and a growing portfolio of US and international funded efforts expanding the use of these tools in new medical computing applications every year. In this editorial, we discuss what we believe are gaps in the way medical image computing is pursued today; how a well-executed research platform can enable discovery, innovation and reproducible science ("Open Science"); and how our quest to build such a software platform has evolved into a productive and rewarding social engineering exercise in building an open-access community with a shared vision. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Agricultural and Management Practices and Bacterial Contamination in Greenhouse versus Open Field Lettuce Production

    PubMed Central

    Holvoet, Kevin; Sampers, Imca; Seynnaeve, Marleen; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2014-01-01

    The aim of this study was to gain insight into potential differences in risk factors for microbial contamination in greenhouse versus open field lettuce production. Information was collected on sources, testing, and monitoring and, if applicable, treatment of irrigation and harvest rinsing water. These data were combined with results of analysis of the levels of Escherichia coli as a fecal indicator organism and the presence of enteric bacterial pathogens on both lettuce crops and environmental samples. Enterohemorrhagic Escherichia coli (EHEC) PCR signals (vt1 or vt2 positive and eae positive), Campylobacter spp., and Salmonella spp. isolates were more often obtained from irrigation water sampled from open field farms (21/45, 46.7%) than from greenhouse production (9/75, 12.0%). Open field production was shown to be more prone to fecal contamination, as the number of lettuce and irrigation water samples with elevated E. coli was significantly higher. Farmers comply with generic guidelines on good agricultural practices available at the national level, but monitoring of the microbial quality and, if applicable, the appropriateness of treatment of water used for irrigation or at harvest is limited. These results indicate the need for further elaboration of specific guidelines and control measures for leafy greens with regard to microbial hazards. PMID:25546272

  14. Agricultural and management practices and bacterial contamination in greenhouse versus open field lettuce production.

    PubMed

    Holvoet, Kevin; Sampers, Imca; Seynnaeve, Marleen; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2014-12-23

    The aim of this study was to gain insight into potential differences in risk factors for microbial contamination in greenhouse versus open field lettuce production. Information was collected on sources, testing, and monitoring and, if applicable, treatment of irrigation and harvest rinsing water. These data were combined with results of analysis of the levels of Escherichia coli as a fecal indicator organism and the presence of enteric bacterial pathogens on both lettuce crops and environmental samples. Enterohemorrhagic Escherichia coli (EHEC) PCR signals (vt1 or vt2 positive and eae positive), Campylobacter spp., and Salmonella spp. isolates were more often obtained from irrigation water sampled from open field farms (21/45, 46.7%) than from greenhouse production (9/75, 12.0%). Open field production was shown to be more prone to fecal contamination, as the number of lettuce and irrigation water samples with elevated E. coli was significantly higher. Farmers comply with generic guidelines on good agricultural practices available at the national level, but monitoring of the microbial quality and, if applicable, the appropriateness of treatment of water used for irrigation or at harvest is limited. These results indicate the need for further elaboration of specific guidelines and control measures for leafy greens with regard to microbial hazards.

  15. Stochastic production phase design for an open pit mining complex with multiple processing streams

    NASA Astrophysics Data System (ADS)

    Asad, Mohammad Waqar Ali; Dimitrakopoulos, Roussos; van Eldert, Jeroen

    2014-08-01

    In a mining complex, the mine is a source of supply of valuable material (ore) to a number of processes that convert the raw ore to a saleable product or a metal concentrate for production of the refined metal. In this context, expected variation in metal content throughout the extent of the orebody defines the inherent uncertainty in the supply of ore, which impacts the subsequent ore and metal production targets. Traditional optimization methods for designing production phases and ultimate pit limit of an open pit mine not only ignore the uncertainty in metal content, but, in addition, commonly assume that the mine delivers ore to a single processing facility. A stochastic network flow approach is proposed that jointly integrates uncertainty in supply of ore and multiple ore destinations into the development of production phase design and ultimate pit limit. An application at a copper mine demonstrates the intricacies of the new approach. The case study shows a 14% higher discounted cash flow when compared to the traditional approach.
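
    The 14% figure above is a comparison of discounted cash flows, so the mechanism is worth making concrete: a phase design that captures value earlier scores higher even when undiscounted totals are similar. A sketch with hypothetical annual cash flows, not figures from the case study:

    ```python
    def npv(cash_flows, rate):
        """Discounted cash flow: NPV = sum over years t of CF_t / (1 + r)^t,
        with t starting at 1 (end-of-year convention)."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

    # Illustrative (hypothetical) annual cash flows for two phase designs;
    # the stochastic design front-loads value capture:
    traditional = [100, 100, 100, 100, 100]
    stochastic  = [120, 115, 105, 95, 90]

    r = 0.10
    npv_trad = npv(traditional, r)
    npv_stoch = npv(stochastic, r)
    print(f"uplift from earlier value capture: {100 * (npv_stoch / npv_trad - 1):.1f}%")
    ```

    Both schedules here total roughly the same undiscounted cash, yet discounting rewards the front-loaded design; the paper's joint treatment of grade uncertainty and multiple process destinations is what lets the optimizer find such schedules.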

  16. WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets

    NASA Astrophysics Data System (ADS)

    Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.

    2014-08-01

    Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept in which metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analyzed, and finally visualised through open service interfaces to produce user-driven analysis products for targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damage, assessment of volumetric changes in an open pit mine, and 3D city model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses are possible with the proposed approach and current technology; for instance, a landowner could easily assess the number of fallen trees within his property borders after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.

  17. Building integrated business environments: analysing open-source ESB

    NASA Astrophysics Data System (ADS)

    Martínez-Carreras, M. A.; García Jimenez, F. J.; Gómez Skarmeta, A. F.

    2015-05-01

    Integration and interoperability are two concepts that have gained significant prominence in the business field, providing tools which enable enterprise application integration (EAI). In this sense, the enterprise service bus (ESB) has played a crucial role as the underpinning technology for creating integrated environments in which companies may connect all their legacy applications. However, the potential of these technologies remains unknown and some important features are not used to develop suitable business environments. The aim of this paper is to describe and detail the elements for building the next generation of integrated business environments (IBE) and to analyse the features of ESBs as the core of this infrastructure. For this purpose, we evaluate how well-known open-source ESB products fulfil these needs. Moreover, we introduce a scenario in which the collaborative system 'Alfresco' is integrated into the business infrastructure. Finally, we provide a comparison of the different open-source ESBs available for IBE requirements. According to this study, Fuse ESB provides the best results, considering features such as support for a wide variety of standards and specifications, documentation and implementation, security, advanced business trends, ease of integration and performance.

  18. Robust, open-source removal of systematics in Kepler data

    NASA Astrophysics Data System (ADS)

    Aigrain, S.; Parviainen, H.; Roberts, S.; Reece, S.; Evans, T.

    2017-10-01

    We present ARC2 (Astrophysically Robust Correction 2), an open-source Python-based systematics-correction pipeline for the Kepler prime mission long-cadence light curves. The ARC2 pipeline identifies and corrects any isolated discontinuities in the light curves and then removes trends common to many light curves. These trends are modelled using the publicly available co-trending basis vectors, within an (approximate) Bayesian framework with 'shrinkage' priors to minimize the risk of overfitting and the injection of any additional noise into the corrected light curves, while keeping any astrophysical signals intact. We show that the ARC2 pipeline's performance matches that of the standard Kepler PDC-MAP data products using standard noise metrics, and demonstrate its ability to preserve astrophysical signals using injection tests with simulated stellar rotation and planetary transit signals. Although it is not identical, the ARC2 pipeline can thus be used as an open-source alternative to PDC-MAP, whenever the ability to model the impact of the systematics removal process on other kinds of signal is important.
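
    The 'shrinkage' idea is a MAP (ridge-style) fit of trend weights: a Gaussian prior on each basis-vector coefficient pulls it toward zero, so only well-supported trend amplitude is subtracted from the light curve. A one-basis-vector toy sketch; the numbers are illustrative, not Kepler co-trending basis vectors:

    ```python
    def ridge_weight(basis, flux, shrink):
        """Single-basis-vector 'shrinkage' fit: the MAP weight under a Gaussian
        prior is w = (b . y) / (b . b + lambda). A larger shrink pulls w toward
        zero, limiting how much trend signal is removed from the light curve."""
        num = sum(b * y for b, y in zip(basis, flux))
        den = sum(b * b for b in basis) + shrink
        return num / den

    # Toy light curve = constant star + common instrumental trend:
    trend = [0.0, 0.5, 1.0, 1.5, 2.0]
    flux  = [1.0 + 0.8 * t for t in trend]

    w_ols = ridge_weight(trend, flux, shrink=0.0)   # plain least squares
    w_map = ridge_weight(trend, flux, shrink=5.0)   # shrinkage prior applied
    corrected = [y - w_map * b for y, b in zip(flux, trend)]
    print(w_ols, w_map)
    ```

    The shrunk weight is always smaller in magnitude than the least-squares weight, which is exactly the overfitting protection the abstract describes; ARC2 extends this to many basis vectors within an approximate Bayesian framework.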

  19. Design and implementation of an open source indexing solution for a large set of radiological reports and images.

    PubMed

    Voet, T; Devolder, P; Pynoo, B; Vercruysse, J; Duyck, P

    2007-11-01

    This paper shares the insights we gained while designing, building, and running an indexing solution for a large set of radiological reports and images in a production environment for more than 3 years. Several technical challenges were encountered and solved in the course of this project. One hundred four million words in 1.8 million radiological reports from 1989 to the present were indexed and became instantaneously searchable in a user-friendly fashion; the median query duration is only 31 ms. Currently, our highly tuned index holds 332,088 unique words in four languages. The indexing system is feature-rich and language-independent and allows for making complex queries. For research and training purposes it certainly is a valuable and convenient addition to our radiology informatics toolbox. Extended use of open-source technology dramatically reduced both implementation time and cost. All software we developed related to the indexing project has been made available to the open-source community covered by an unrestricted Berkeley Software Distribution-style license.

  20. Bioethanol Production from Waste Potatoes as a Sustainable Waste-to-energy Resource via Enzymatic Hydrolysis

    NASA Astrophysics Data System (ADS)

    Memon, A. A.; Shah, F. A.; Kumar, N.

    2017-07-01

    Ever-increasing demand for energy and the corresponding looming depletion of fossil fuels have made it a burning need of the time to pursue alternative energy resources before traditional energy sources are completely exhausted. Scientists are continuously working on sustainable energy production as an alternative source of energy to meet present and future requirements. This research deals with conversion of starch to a fermentable carbon source (sugars) through liquefaction with alpha-amylase and fermentation with yeast. The results show that significant bioethanol production was achieved using a temperature of 30 °C, pH 6, and an incubation time of 84 h. About 90 ml of bioethanol was produced from a potato intake of 800 g. Pakistan, being an agricultural country, is rich in the potato crop, and this research opens new vistas to arrest the energy shortage in this part of the world.
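
    As a plausibility check on the reported yield, a stoichiometric ceiling can be computed from the starch → glucose → ethanol chemistry. The potato starch fraction used below (~17% of fresh weight) is an assumed typical value, not a measurement from the paper:

    ```python
    def theoretical_ethanol_g(starch_g):
        """Stoichiometric upper bound: one anhydroglucose unit of starch
        (162 g/mol) hydrolyses to glucose (180 g/mol), which ferments to
        2 ethanol (2 x 46 g/mol) + 2 CO2."""
        glucose_g = starch_g * 180.0 / 162.0
        return glucose_g * (2 * 46.0) / 180.0

    # Hypothetical starch content for fresh potatoes (assumed ~17% by weight);
    # the paper reports 90 ml of ethanol from 800 g of potatoes.
    potatoes_g = 800.0
    starch_g = potatoes_g * 0.17
    max_ethanol_g = theoretical_ethanol_g(starch_g)
    max_ethanol_ml = max_ethanol_g / 0.789   # ethanol density, g/ml
    print(f"theoretical maximum: {max_ethanol_ml:.0f} ml")
    ```

    Under these assumptions the ceiling comes out near 100 ml for 800 g of potatoes, so the reported 90 ml is in a plausible range (assuming the measured volume was not 100% pure ethanol).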

  1. Special Population Planner 4: An Open Source Release.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuiper, J.; Metz, W.; Tanzman, E.

    2008-01-01

    Emergencies like Hurricane Katrina and the recent California wildfires underscore the critical need to meet the complex challenge of planning for individuals with special needs and for institutionalized special populations. People with special needs and special populations often have difficulty responding to emergencies or taking protective actions, and emergency responders may be unaware of their existence and situations during a crisis. Special Population Planner (SPP) is an ArcGIS-based emergency planning system released as an open source product. SPP provides for easy production of maps, reports, and analyses to develop and revise emergency response plans. It includes tools to manage a voluntary registry of data for people with special needs, integrated links to plans and documents, tools for response planning and analysis, preformatted reports and maps, and data on locations of special populations, facility and resource characteristics, and contacts. The system can be readily adapted for new settings without programming and is broadly applicable. Full documentation and a demonstration database are included in the release.

  2. The Emergence of Open-Source Software in China

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    The open-source software movement is gaining increasing momentum in China. Of the limited number of open-source software products in China, "Red Flag Linux" stands out most strikingly, commanding a 30 percent share of the Chinese software market. Unlike the spontaneity of the open-source movement in North America, open-source software development in…

  3. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  4. MultiElec: A MATLAB Based Application for MEA Data Analysis.

    PubMed

    Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R

    2015-01-01

    We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec offers a user-friendly graphical user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes, and includes functions for activation-time determination and the production of activation-time heat maps with isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and to analyse incomplete data sets. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.
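
    In the simplest case, a local conduction velocity of the kind mentioned above reduces to inter-electrode distance divided by the activation-time difference between a pair of electrodes. A minimal sketch of that idea (an illustration only, not MultiElec's actual implementation, which fits velocity vectors across the whole array):

```python
import math

def conduction_velocity(p1, p2, t1, t2):
    """Conduction velocity between two electrodes:
    inter-electrode distance divided by the activation-time difference."""
    dist = math.hypot(p2[0] - p1[0], p2[1] - p1[1])  # e.g. micrometres
    dt = t2 - t1                                      # e.g. milliseconds
    return dist / dt

# Electrodes 200 um apart, activated 1 ms apart -> 200 um/ms (= 0.2 m/s)
v = conduction_velocity((0.0, 0.0), (200.0, 0.0), 5.0, 6.0)
print(v)  # 200.0
```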

  5. Open Access, Open Source and Digital Libraries: A Current Trend in University Libraries around the World

    ERIC Educational Resources Information Center

    Krishnamurthy, M.

    2008-01-01

    Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…

  6. Nurturing reliable and robust open-source scientific software

    NASA Astrophysics Data System (ADS)

    Uieda, L.; Wessel, P.

    2017-12-01

    Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article. A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo (zenodo.org). However, citations to these sources are not always recognized when computing citation metrics. In summary, the widespread development of reliable and robust open-source software relies on the creation of formal training programs in software development best practices and the recognition of software as a valid form of scholarship.

  7. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution for programming emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.

  8. Applying Clausewitz and Systems Thinking to Design

    DTIC Science & Technology

    2012-06-01

    …a more open system, like a social-political system, does not respond to some stimulus, say, a stock market fluctuation, in a predictable pattern. …demanded maximum efficiency from workers, acquired resources for production, and either captured or developed demand for the product in the market… "Towards a System of Systems Concepts," Management Science, Vol. 17, No. 11, July 1971, pp. 661-671, Business Source Complete, EBSCOhost.

  9. New Open-Source Version of FLORIS Released | News | NREL

    Science.gov Websites

    New Open-Source Version of FLORIS Released. January 26, 2018. National Renewable Energy Laboratory (NREL) researchers recently released an updated open-source version of FLORIS, simplified and documented. Because of the living, open-source nature of the newly updated utility, NREL…

  10. Impacts and Viability of Open Source Software on Earth Science Metadata Clearing House and Service Registry Applications

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Cechini, M. F.; Mitchell, A.

    2011-12-01

    Earth Science applications typically deal with large amounts of data and high throughput rates, if not also high transaction rates. While Open Source is frequently used for smaller scientific applications, large scale, highly available systems frequently fall back to "enterprise" class solutions like Oracle RAC or commercial grade JEE Application Servers. NASA's Earth Observing System Data and Information System (EOSDIS) provides end-to-end capabilities for managing NASA's Earth science data from multiple sources - satellites, aircraft, field measurements, and various other programs. A core capability of EOSDIS, the Earth Observing System (EOS) Clearinghouse (ECHO), is a highly available search and order clearinghouse of over 100 million pieces of science data that has evolved from its early R&D days to a fully operational system. Over the course of this maturity ECHO has largely transitioned from commercial frameworks, databases, and operating systems to Open Source solutions...and in some cases, back. In this talk we discuss the progression of our technological solutions and our lessons learned in the areas of: high-performance, large-scale searching solutions; geospatial search capabilities and dealing with multiple coordinate systems; search and storage of variable-format source (science) data; highly available deployment solutions; and scalable (elastic) solutions to visual searching and image handling. Throughout the evolution of the ECHO system we have had to evaluate solutions with respect to performance, cost, developer productivity, reliability, and maintainability in the context of supporting global science users. Open Source solutions have played a significant role in our architecture and development but several critical commercial components remain (or have been reinserted) to meet our operational demands.

  11. Engineering Fluorometabolite Production: Fluorinase Expression in Salinispora tropica Yields Fluorosalinosporamide†

    PubMed Central

    Eustáquio, Alessandra S.; O'Hagan, David; Moore, Bradley S.

    2010-01-01

    Organofluorine compounds play an important role in medicinal chemistry where they are responsible for up to 15% of the pharmaceutical products on the market. While natural products are valuable sources of new chemical entities, natural fluorinated molecules are extremely rare and the pharmaceutical industry has not benefited from a microbial source of this class of compounds. Streptomyces cattleya is an unusual bacterium in that it elaborates fluoroacetate and the amino acid 4-fluorothreonine. The discovery in 2002 of the fluorination enzyme FlA responsible for C-F bond formation in S. cattleya, and its subsequent characterization, opened up for the first time the prospect of genetically engineering fluorometabolite production from fluoride ion in host organisms. As a proof of principle, we report here the induced production of fluorosalinosporamide by replacing the chlorinase gene salL from Salinispora tropica with the fluorinase gene flA. PMID:20085308

  12. AN OPEN-SOURCE NEUTRINO RADIATION HYDRODYNAMICS CODE FOR CORE-COLLAPSE SUPERNOVAE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Connor, Evan, E-mail: evanoconnor@ncsu.edu; CITA, Canadian Institute for Theoretical Astrophysics, Toronto, M5S 3H8

    2015-08-15

    We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino–matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.

  13. Genotype x environment interactions in eggplant for fruit phenolic acid content

    USDA-ARS?s Scientific Manuscript database

    Eggplant fruit are a rich source of phenolic acids that contribute to fruit nutritive value and influence culinary quality. We evaluated the influence of production environment on eggplant fruit phenolic acid content. Ten Solanum melongena accessions including five F1 hybrid cultivars, three open-...

  14. Open Source GIS Connectors to NASA GES DISC Satellite Data

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Pham, Long; Yang, Wenli

    2014-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) houses a suite of high spatiotemporal resolution GIS data including satellite-derived and modeled precipitation, air quality, and land surface parameter data. The data are valuable to various GIS research and applications at regional, continental, and global scales. On the other hand, many GIS users, especially those from the ArcGIS community, have difficulties in obtaining, importing, and using our data due to factors such as the variety of data products, the complexity of satellite remote sensing data, and the data encoding formats. We introduce a simple open source ArcGIS data connector that significantly simplifies the access and use of GES DISC data in ArcGIS.

  15. The Organizational Impact of Open Educational Resources

    NASA Astrophysics Data System (ADS)

    Sclater, Niall

    The open educational resource (OER) movement has been growing rapidly since 2001, stimulated by funding from benefactors such as the Hewlett Foundation and UNESCO, and providing educational content freely to institutions and learners across the world. Individuals and organizations are motivated by a variety of drivers to produce OERs, both altruistic and self-interested. There are parallels with the open source movement, where authors and others combine their efforts to provide a product which they and others can use freely and adapt to their own purposes. There are many different ways in which OER initiatives are organized and an infinite range of possibilities for how the OERs themselves are constituted. If institutions are to develop sustainable OER initiatives, they need to build successful change management initiatives, developing models for the production and quality assurance of OERs, licensing them through appropriate mechanisms such as the Creative Commons, and considering how the resources will be discovered and used by learners.

  16. Integration of photovoltaic and concentrated solar thermal technologies for H2 production by the hybrid sulfur cycle

    NASA Astrophysics Data System (ADS)

    Liberatore, Raffaele; Ferrara, Mariarosaria; Lanchi, Michela; Turchetti, Luca

    2017-06-01

    It is widely agreed that hydrogen used as an energy carrier and/or storage medium may contribute significantly to the reduction of emissions, especially if produced from renewable energy sources. The Hybrid Sulfur (HyS) cycle is considered one of the most promising processes to produce hydrogen through water splitting. The FP7 project SOL2HY2 (Solar to Hydrogen Hybrid Cycles) investigates innovative material and process solutions for the use of solar heat and power in the HyS process. A significant part of the SOL2HY2 project is devoted to the analysis and optimization of the integration of the solar and chemical (hydrogen production) plants. In this context, this work investigates the possibility of integrating different solar technologies, namely photovoltaics, solar central receivers, and solar troughs, to optimize their use in the HyS cycle for green hydrogen production, in both the open and closed process configurations. The analysis carried out accounts for different combinations of geographical location and plant sizing criteria. The use of a sulfur burner, which can serve both as thermal backup and as an SO2 source for the open cycle, is also considered.

  17. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.; Rutan, D. A.

    2016-12-01

    The CERES project continues to provide the scientific community a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. Now in its 16th year, the CERES project's products are mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. In order to serve all our users, we developed a web-based Ordering and Visualization Tool (OVT). Using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others, the OVT team developed a series of specialized functions to be used in the process of CERES Data Quality Control (QC). These include 1- and 2-D histograms, anomaly and deseasonalization analyses, temporal and spatial averaging, side-by-side parameter comparison, and others that made the process of QC far easier and faster, but more importantly far more portable. We are now in the process of integrating ground-site observed surface fluxes to further help the CERES project QC the CERES computed surface fluxes. These features will give users the opportunity to perform their own comparisons of the CERES computed surface fluxes and observed ground-site fluxes. An overview of the CERES OVT basic functions using open source software, as well as future steps in expanding its capabilities, will be presented at the meeting.
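
    Deseasonalization of the kind offered by the OVT is, in its simplest form, subtraction of the mean seasonal cycle from a monthly time series. A minimal sketch of that operation (an illustrative assumption, not the OVT's actual code):

```python
def deseasonalize(series, period=12):
    """Subtract the mean seasonal cycle: anomaly[i] = series[i] minus the
    climatological mean of all values sharing i's position in the cycle."""
    clim = [sum(series[m::period]) / len(series[m::period]) for m in range(period)]
    return [x - clim[i % period] for i, x in enumerate(series)]

# A pure repeating seasonal cycle deseasonalizes to all-zero anomalies.
cycle = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0]
anoms = deseasonalize(cycle * 3)
print(max(abs(a) for a in anoms))  # 0.0
```

A trend or anomaly superimposed on the cycle survives the subtraction, which is what makes the residual useful for QC and trend inspection.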

  18. Common Approach to Geoprocessing of Uav Data across Application Domains

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Reichardt, M.; Taylor, T.

    2015-08-01

    UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable, but the diversity of UAVs as platforms, along with the diversity of available sensors, presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need to use existing standards and to extend them. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source, and open standards.

  19. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.

  20. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828
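
    The abstracts above do not specify the CSV protocol format, so as a purely hypothetical illustration of the protocol-as-CSV idea, a loader might map rows of (time, control loop, setpoint) into a time-ordered schedule. Every column name and value below is invented for illustration:

```python
import csv
import io

# Hypothetical three-column protocol: minutes from start, loop name, setpoint.
PROTOCOL = """time_min,parameter,setpoint
0,temperature,37.0
0,agitation,200
120,temperature,30.0
"""

def load_protocol(text):
    """Parse a protocol CSV into (time_min, parameter, setpoint) steps,
    sorted so the controller can apply them in time order."""
    rows = csv.DictReader(io.StringIO(text))
    steps = [(float(r["time_min"]), r["parameter"], float(r["setpoint"])) for r in rows]
    return sorted(steps)

steps = load_protocol(PROTOCOL)
print(steps[0])  # (0.0, 'agitation', 200.0)
```

Conditional logic (e.g. "lower temperature once optical density passes a threshold") does not fit a flat schedule like this, which is presumably why the authors added a Python-based execution model alongside the CSV path.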

  1. ProteinTracker: an application for managing protein production and purification

    PubMed Central

    2012-01-01

    Background Laboratories that produce protein reagents for research and development face the challenge of deciding whether to track batch-related data using simple file based storage mechanisms (e.g. spreadsheets and notebooks), or commit the time and effort to install, configure and maintain a more complex laboratory information management system (LIMS). Managing reagent data stored in files is challenging because files are often copied, moved, and reformatted. Furthermore, there is no simple way to query the data if/when questions arise. Commercial LIMS often include additional modules that may be paid for but not actually used, and often require software expertise to truly customize them for a given environment. Findings This web-application allows small to medium-sized protein production groups to track data related to plasmid DNA, conditioned media samples (supes), cell lines used for expression, and purified protein information, including method of purification and quality control results. In addition, a request system was added that includes a means of prioritizing requests to help manage the high demand of protein production resources at most organizations. ProteinTracker makes extensive use of existing open-source libraries and is designed to track essential data related to the production and purification of proteins. Conclusions ProteinTracker is an open-source web-based application that provides organizations with the ability to track key data involved in the production and purification of proteins and may be modified to meet the specific needs of an organization. The source code and database setup script can be downloaded from http://sourceforge.net/projects/proteintracker. This site also contains installation instructions and a user guide. A demonstration version of the application can be viewed at http://www.proteintracker.org. PMID:22574679

  2. Microstructural probing of ferritic/martensitic steels using internal transmutation-based positron source

    NASA Astrophysics Data System (ADS)

    Krsjak, Vladimir; Dai, Yong

    2015-10-01

    This paper presents the use of an internal 44Ti/44Sc radioisotope source for direct microstructural characterization of ferritic/martensitic (f/m) steels after irradiation in targets of spallation neutron sources. Gamma spectroscopy measurements show a production of ∼1 MBq of 44Ti per gram of f/m steel irradiated to 1 dpa (displacements per atom) in the mixed proton-neutron spectrum at the Swiss spallation neutron source (SINQ). In the decay chain 44Ti → 44Sc → 44Ca, positrons are produced together with prompt gamma rays, which enables the application of different positron annihilation spectroscopy (PAS) analyses, including lifetime and Doppler broadening spectroscopy. Due to the high production yield, long half-life and relatively high energy of positrons of 44Ti, this methodology opens up new potential for simple, effective and inexpensive characterization of radiation-induced defects in f/m steels irradiated in a spallation target.
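
    The ∼1 MBq per gram figure can be related to the amount of 44Ti present via the activity relation A = λN. A back-of-the-envelope check (the half-life of roughly 59 years used below is an assumption for illustration, not a value taken from the paper):

```python
import math

T_HALF_S = 59.0 * 365.25 * 86400.0      # assumed 44Ti half-life, seconds
LAMBDA = math.log(2) / T_HALF_S         # decay constant, 1/s

activity_bq = 1.0e6                     # ~1 MBq of 44Ti per gram of steel
n_atoms = activity_bq / LAMBDA          # A = lambda * N  =>  N = A / lambda
mass_g = n_atoms / 6.022e23 * 44.0      # grams of 44Ti (molar mass ~44 g/mol)

print(f"{n_atoms:.2e} atoms, {mass_g:.1e} g")  # ~2.7e15 atoms, ~0.2 micrograms
```

Sub-microgram quantities of the isotope thus suffice as an internal positron source, consistent with the "simple, effective and inexpensive" claim.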

  3. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.

    2017-12-01

    The NASA CERES project continues to provide the scientific community a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data is used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES Data Quality Control (QC), such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, side-by-side parameter comparison, and others that made the process of QC far easier and faster, but more importantly far more portable. With the integration of ground-site observed surface fluxes we further help the CERES project QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using open source software, as well as future steps in expanding its capabilities, will be presented at the meeting.

  4. Open for Business

    ERIC Educational Resources Information Center

    Voyles, Bennett

    2007-01-01

    People know about the Sakai Project (open source course management system); they may even know about Kuali (open source financials). So, what is the next wave in open source software? This article discusses business intelligence (BI) systems. Though open source BI may still be only a rumor in most campus IT departments, some brave early adopters…

  5. Factors Influencing F/OSS Cloud Computing Software Product Success: A Quantitative Study

    ERIC Educational Resources Information Center

    Letort, D. Brian

    2012-01-01

    Cloud Computing introduces a new business operational model that allows an organization to shift information technology consumption from traditional capital expenditure to operational expenditure. This shift introduces challenges from both the adoption and creation vantage. This study evaluates factors that influence Free/Open Source Software…

  6. A Blended Learning Scenario to Enhance Learners' Oral Production Skills

    ERIC Educational Resources Information Center

    Kim, Hee-Kyung

    2015-01-01

    This paper examines the effectiveness of a mobile assisted blended learning scenario for pronunciation in Korean language. In particular, we analyze how asynchronous oral communication between learners of Korean and native speakers via "kakaotalk" (an open source mobile phone application) may be beneficial to the learner in terms of…

  7. 40 CFR 63.5710 - How do I demonstrate compliance using emissions averaging?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutants for Boat Manufacturing Standards for Open... resin used in the past 12 months, kilograms per megagram. MR= Mass of production resin used in the past...

  8. Commonality and Variability Analysis for Xenon Family of Separation Virtual Machine Monitors (CVAX)

    DTIC Science & Technology

    2017-07-18

    The technical approach is a systematic application of Software Product Line Engineering (SPLE). A systematic application requires describing the family and…by the evolving open-source Xen hypervisor.

  9. OpenFDA: an innovative platform providing access to a wealth of FDA's publicly available data.

    PubMed

    Kass-Hout, Taha A; Xu, Zhiheng; Mohebbi, Matthew; Nelsen, Hans; Baker, Adam; Levine, Jonathan; Johanson, Elaine; Bright, Roselie A

    2016-05-01

    The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Using cutting-edge technologies deployed on FDA's new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges. Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event. With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  10. OpenFDA: an innovative platform providing access to a wealth of FDA’s publicly available data

    PubMed Central

    Kass-Hout, Taha A; Mohebbi, Matthew; Nelsen, Hans; Baker, Adam; Levine, Jonathan; Johanson, Elaine; Bright, Roselie A

    2016-01-01

    Objective The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Materials and Methods Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges. Results Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event. Conclusion With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products. PMID:26644398
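
    The openFDA endpoints described above are plain HTTPS APIs. A sketch of how a client might build a query against the drug adverse-event endpoint; the endpoint path and the search/limit parameters follow openFDA's published query syntax, but treat the exact search field used here as an assumption:

```python
from urllib.parse import urlencode

BASE = "https://api.fda.gov/drug/event.json"  # drug adverse-event endpoint

def build_query(search, limit=1):
    """Assemble an openFDA query URL; urlencode handles quoting of the
    colons and quotation marks in the search expression."""
    return f"{BASE}?{urlencode({'search': search, 'limit': limit})}"

url = build_query('patient.drug.openfda.generic_name:"aspirin"', limit=5)
print(url)
```

A real client would then fetch the URL (e.g. with urllib.request) and read the JSON body, which typically pairs a meta block with a results list.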

  11. Reinventing the role of consumer research in today's open innovation ecosystem.

    PubMed

    Moskowitz, Howard R; Saguy, I Sam

    2013-01-01

    Consumer research (CR) has played a key role in the food and beverage industry. Emerging from laboratory product-tests, it has evolved into a corporate testing service that measures consumer reactions to products/concepts using a wide range of analyses/metrics. We propose that CR transform itself in light of accelerated knowledge expansion, mounting global and local economic pressure on corporations, and changing consumer needs. The transformation moves from its traditional testing role into creating profoundly new knowledge of the product and understanding of the corporation's current and future customers. CR's tasks will involve: contributing to and expanding science, applying open innovation principles, and driving consumer-centric innovation. We identify seven paradigm shifts that will change CR, namely: a different way of working--from testing to open sourcing; from good corporate citizen to change leader; an open new product development (NPD) process; new management roles/cultures; new cooperation among universities and industry, with new education curricula; from a battle over control to a sustainable "Sharing-is-Winning" model (SiW); and the central role of design. This integrative, innovative CR requires the implementation of three recommendations: start the change process now and fine-tune along the way; create a new marketing/CR department; and educate and professionalize. These recommendations provide the blueprint for jump-starting the process and call for immediate action to deal with the severity of the crises facing the CR profession.

  12. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactors' safety assessments, and the estimates available at this time are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  13. Openly Published Environmental Sensing (OPEnS) | Advancing Open-Source Research, Instrumentation, and Dissemination

    NASA Astrophysics Data System (ADS)

    Udell, C.; Selker, J. S.

    2017-12-01

    The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.

  14. Marine aerosol source regions to Prince of Wales Icefield, Ellesmere Island, and influence from the tropical Pacific, 1979-2001

    NASA Astrophysics Data System (ADS)

    Criscitiello, Alison S.; Marshall, Shawn J.; Evans, Matthew J.; Kinnard, Christophe; Norman, Ann-Lise; Sharp, Martin J.

    2016-08-01

    Using a coastal ice core collected from Prince of Wales (POW) Icefield on Ellesmere Island, we investigate source regions of sea ice-modulated chemical species (methanesulfonic acid (MSA) and chloride (Cl-)) to POW Icefield and the influence of large-scale atmospheric variability on the transport of these marine aerosols (1979-2001). Our key findings are (1) MSA in the POW Icefield core is derived primarily from productivity in the sea ice zone of Baffin Bay and the Labrador Sea, with influence from waters within the North Water (NOW) polynya, (2) sea ice formation processes within the NOW polynya may be a significant source of sea-salt aerosols to the POW core site, in addition to offshore open water source regions primarily in Hudson Bay, and (3) the tropical Pacific influences the source and transport of marine aerosols to POW Icefield through its remote control on regional winds and sea ice variability. Regression analyses during times of MSA deposition reveal sea level pressure (SLP) anomalies favorable for opening of the NOW polynya and subsequent oceanic dimethyl sulfide production. Regression analyses during times of Cl- deposition reveal SLP anomalies that indicate a broader oceanic region of sea-salt sources to the core site. These results are supported by Scanning Multichannel Microwave Radiometer- and Special Sensor Microwave/Imager-based sea ice reconstructions and air mass transport density analyses and suggest that the marine biogenic record may capture local polynya variability, while sea-salt transport to the site from larger offshore source regions in Baffin Bay is likely. Regression analyses show a link to tropical dynamics via an atmospheric Rossby wave.

  15. Ergot: from witchcraft to biotechnology.

    PubMed

    Haarmann, Thomas; Rolke, Yvonne; Giesbert, Sabine; Tudzynski, Paul

    2009-07-01

    The ergot diseases of grasses, caused by members of the genus Claviceps, have had a severe impact on human history and agriculture, causing devastating epidemics. However, ergot alkaloids, the toxic components of Claviceps sclerotia, have been used intensively (and misused) as pharmaceutical drugs, and efficient biotechnological processes have been developed for their in vitro production. Molecular genetics has provided detailed insight into the genetic basis of ergot alkaloid biosynthesis and opened up perspectives for the design of new alkaloids and the improvement of production strains; it has also revealed the refined infection strategy of this biotrophic pathogen, opening up the way for better control. Nevertheless, Claviceps remains an important pathogen worldwide, and a source for potential new drugs for central nervous system diseases.

  16. Open-source software: not quite endsville.

    PubMed

    Stahl, Matthew T

    2005-02-01

    Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.

  17. Developing an Open Source Option for NASA Software

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Parks, John W. (Technical Monitor)

    2003-01-01

    We present arguments in favor of developing an Open Source option for NASA software; in particular we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.

  18. Open-Source Data and the Study of Homicide.

    PubMed

    Parkin, William S; Gruenewald, Jeff

    2015-07-20

    To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to be used as a valid and reliable data source in testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open sources could recreate the population of homicides and variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data, and for every variable measured, the open sources captured as much, or more, of the information presented in the official data. In addition, variables not available in official data, but potentially useful for testing theory, were identified in open sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.
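    The comparison the study describes can be sketched as simple coverage and completeness checks. This is a hypothetical illustration only: the record structure, field names, and values below are invented, not taken from the study's data.

```python
# Invented example records: 'id' links cases across the two sources,
# and None marks a value missing from that source.
official = [
    {"id": 1, "weapon": "firearm", "victim_age": 34, "motive": None},
    {"id": 2, "weapon": "knife",   "victim_age": 27, "motive": None},
]
open_source = [
    {"id": 1, "weapon": "firearm", "victim_age": 34, "motive": "robbery"},
    {"id": 2, "weapon": "knife",   "victim_age": 27, "motive": "dispute"},
]

def coverage(official_recs, open_recs):
    """Share of official cases also identified in open sources."""
    official_ids = {r["id"] for r in official_recs}
    open_ids = {r["id"] for r in open_recs}
    return len(official_ids & open_ids) / len(official_ids)

def completeness(records, field):
    """Share of records with a non-missing value for a field."""
    return sum(r[field] is not None for r in records) / len(records)

print(coverage(official, open_source))       # 1.0: every official case matched
print(completeness(official, "motive"))      # 0.0: motive absent officially
print(completeness(open_source, "motive"))   # 1.0: motive present in open sources
```

    The "motive" field here stands in for the study's finding that open sources can supply theoretically useful variables absent from official data.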

  19. Engineering Escherichia coli K12 MG1655 to use starch

    PubMed Central

    2014-01-01

    Background To attain a sustainable bioeconomy, fuel or valuable product production must use biomass as substrate. Starch is one of the most abundant biomass resources and is present as waste or as a food and agroindustry by-product. Unfortunately, Escherichia coli, one of the most widely used microorganisms in biotechnological processes, cannot use starch as a carbon source. Results We engineered an E. coli strain capable of using starch as a substrate. The genetic design employed the native capability of the bacterium to use maltodextrins as a carbon source plus expression and secretion of its endogenous α-amylase, AmyA, in an adapted background. Biomass production improved using 35% dissolved oxygen and pH 7.2 in a controlled bioreactor. Conclusion The engineered E. coli strain can use starch from the milieu, opening the possibility of optimizing the process to use agroindustrial wastes to produce biofuels and other valuable chemicals. PMID:24886307

  20. Greenhouse gas emissions from Australian open-cut coal mines: contribution from spontaneous combustion and low-temperature oxidation.

    PubMed

    Day, Stuart J; Carras, John N; Fry, Robyn; Williams, David J

    2010-07-01

    Spontaneous combustion and low-temperature oxidation of waste coal and other carbonaceous material at open-cut coal mines are potentially significant sources of greenhouse gas emissions. However, the magnitude of these emissions is largely unknown. In this study, emissions from spontaneous combustion and low-temperature oxidation were estimated for six Australian open-cut coal mines with annual coal production ranging from 1.7 to more than 16 Mt. Greenhouse emissions from all other sources at these mines were also estimated and compared to those from spontaneous combustion and low-temperature oxidation. In all cases, fugitive emission of methane was the largest source of greenhouse gas; however, in some mines, spontaneous combustion accounted for almost a third of all emissions. For one mine, it was estimated that emissions from spontaneous combustion were around 250,000 t CO(2)-e per annum. The contribution from low-temperature oxidation was generally less than about 1% of the total for all six mines. Estimating areas of spoil affected by spontaneous combustion by ground-based surveys was prone to under-report the area. Airborne infrared imaging appears to be a more reliable method.

  1. phylo-node: A molecular phylogenetic toolkit using Node.js.

    PubMed

    O'Halloran, Damien M

    2017-01-01

    Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis. To address this problem, I have developed phylo-node, a stable and scalable Node.js toolkit that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server-dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  2. Open-source products for a lighting experiment device.

    PubMed

    Gildea, Kevin M; Milburn, Nelda

    2014-12-01

    The capabilities of open-source software and microcontrollers were used to construct a device for controlled lighting experiments. The device was designed to ascertain whether individuals with certain color vision deficiencies were able to discriminate between the red and white lights in fielded systems on the basis of luminous intensity. The device provided the ability to control the timing and duration of light-emitting diode (LED) and incandescent light stimulus presentations, to present the experimental sequence and verbal instructions automatically, to adjust LED and incandescent luminous intensity, and to display LED and incandescent lights with various spectral emissions. The lighting device could easily be adapted for experiments involving flashing or timed presentations of colored lights, or the components could be expanded to study areas such as threshold light perception and visual alerting systems.

  3. New Carbon Source From Microbial Degradation of Pre-Production Resin Pellets from the North Pacific Gyre

    NASA Astrophysics Data System (ADS)

    Neal, A.; Mielke, R.; Stam, C. N.; Gonsior, M.; Tsapin, A. I.; Lee, G.; Leftwich, B.; Narayan, R.; Coleman, H.; Argyropoulos, N.; Sheavly, S. B.; Gorby, Y. A.

    2011-12-01

    Numerous pollutants that impact oceanic health are transported through the world's oceans. Diffuse sources include land-based runoff, atmospheric depositions, shipping industry wastes, and others. Synthetic polymer marine debris is a multi-faceted problem that includes interactions with environmental toxins, carbon cycling systems, ocean surface chemistry, fine minerals deposition, and nano-particles. The impact that synthetic polymer-microbe interactions have on carbon input into the open ocean is poorly understood. Here we demonstrate that both biotic and abiotic processes contribute to degradation of pre-production resin pellets (PRPs) in open ocean environments, and present new methodologies to determine carbon loss from this synthetic polymer debris. Our data show that material degradation of environmental polyethylene PRPs can potentially deposit 13 mg/g to 65 mg/g of carbon per PRP into the marine environment. Environmental pre-production resin pellets were collected on the S/V Kaisei cruise in 2009, which covered over 3,000 nautical miles and sampled over 102,000 m3 of the first 15 cm of the water column in the Subtropical Convergence Zone of the North Pacific Gyre. Environmental PRP degradation, and the role microbial communities play in it, was evaluated using a combination of Fourier transform infrared spectroscopy, environmental scanning electron microscopy, scanning transmission electron microscopy, X-ray microtomography, and ArcGIS mapping. More research is needed to understand the environmental impact of this new carbon source arising from synthetic polymers as they degrade in oceanic environments.

  4. ToxicDocs (www.ToxicDocs.org): from history buried in stacks of paper to open, searchable archives online.

    PubMed

    Rosner, David; Markowitz, Gerald; Chowkwanyun, Merlin

    2018-02-01

    As a result of a legal mechanism called discovery, the authors accumulated millions of internal corporate and trade association documents related to the introduction of new products and chemicals into workplaces and commerce. What did these private entities discuss among themselves and with their experts? The plethora of documents, both a blessing and a curse, opened new sources and interesting questions about corporate and regulatory histories. But they also posed an almost insurmountable challenge to historians. Thus emerged ToxicDocs, possible only with a technological innovation known as "Big Data." That refers to the sheer volume of new digital data and to the computational power to analyze them. Users will be able to identify what firms knew (or did not know) about the dangers of toxic substances in their products-and when. The database opens many areas to inquiry including environmental studies, business history, government regulation, and public policy. ToxicDocs will remain a resource free and open to all, anywhere in the world.

  5. Stakeholder co-development of farm level nutrient management software

    NASA Astrophysics Data System (ADS)

    Buckley, Cathal; Mechan, Sarah; Macken-Walsh, Aine; Heanue, Kevin

    2013-04-01

    Over the last number of decades, intensification in the use of nitrogen (N) and phosphorus (P) in agricultural production has led to excessive accumulations of these nutrients in soils, groundwaters and surface water bodies (Sutton et al., 2011). According to the European Environment Agency (2012), despite some progress, diffuse pollution from agriculture is still significant in more than 40% of Europe's water bodies in rivers and coastal waters, and in one third of the water bodies in lakes and transitional waters. Recently it was estimated that approximately 29% of monitored river channel length is polluted to some degree across the Republic of Ireland, with agricultural sources suspected in 47 per cent of cases (EPA, 2012). Farm level management practices to reduce nutrient transfers from agricultural land to watercourses can be divided into source reduction and source interception approaches (Ribaudo et al., 2001). Source interception approaches involve capturing nutrients post mobilisation through policy instruments such as riparian buffer zones or wetlands. Conversely, the source reduction approach is preventative in nature and promotes strict management of nutrients at farm and field level to reduce the risk of mobilisation in the first instance. This has the potential to deliver a double dividend of reduced nutrient loss to the wider ecosystem while maximising economic return to agricultural production at the field and farm levels. Adoption and use of nutrient management plans among farmers is far from the norm. This research engages key farmer and extension stakeholders to explore how current nutrient management planning software and outputs should be developed to make them more user friendly and usable in a practical way. An open innovation technology co-development approach was adopted to investigate what is demanded by the end users - farm advisors and farmers.
Open innovation is a knowledge management strategy that uses the input of stakeholders to improve internal innovation processes. Open innovation incorporates processes such as 'user-led' (farmer and advisor) innovation and the 'co-development' (by technologists and users) of a technology. This strategy is increasingly used by a variety of organisations across sectors to try to ensure that the use of their outputs (products/services/technologies) is optimised by their target customers/clients, by incorporating user insights into the development of outputs. This research uses the open innovation co-development framework through farmer and farm advisor focus group sessions to inform the development of a desirable software package for nutrient management planners (farm advisors) and desirable output formats for the end user (farmers). References: Sutton, M., Oenema, O., Erisman, J. W., Leip, A., Grinsven, H. & Winiwarter, W. 2011. Too much of a good thing. Nature, 472, 159-161. European Environment Agency, 2012. European waters — assessment of status and pressures. Environmental Protection Agency, 2012. Ireland's Environment: An Assessment 2012. Ribaudo, M.O., Heimlich, R., Claassen, R., Peters, M., 2001. Least-cost management of nonpoint source pollution: source reduction versus interception strategies for controlling nitrogen loss in the Mississippi Basin. Ecological Economics, 37, 183-197.

  6. Investigating Advances in the Acquisition of Secure Systems Based on Open Architecture, Open Source Software, and Software Product Lines

    DTIC Science & Technology

    2012-01-27

    example is found in games converted to serve a purpose other than entertainment , such as the development and use of games for science, technology, and...These play-session histories can then be further modded via video editing or remixing with other media (e.g., adding music ) to better enable cinematic...available OSS (e.g., the Linux Kernel on the Sony PS3 game console2) that game system hackers seek to undo. Finally, games are one of the most commonly

  7. A perspective on the proliferation risks of plutonium mines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyman, E.S.

    1996-05-01

    The program of geologic disposal of spent fuel and other plutonium-containing materials is increasingly becoming the target of criticism by individuals who argue that in the future, repositories may become low-cost sources of fissile material for nuclear weapons. This paper attempts to outline a consistent framework for analyzing the proliferation risks of these so-called "plutonium mines" and putting them into perspective. First, it is emphasized that the attractiveness of plutonium in a repository as a source of weapons material depends on its accessibility relative to other sources of fissile material. Then, the notion of a "material production standard" (MPS) is proposed: namely, that the proliferation risks posed by geologic disposal will be acceptable if one can demonstrate, under a number of reasonable scenarios, that the recovery of plutonium from a repository is likely to be as difficult as new production of fissile material. A preliminary analysis suggests that the range of circumstances under which current mined repository concepts would fail to meet this standard is fairly narrow. Nevertheless, a broad application of the MPS may impose severe restrictions on repository design. In this context, the relationship of repository design parameters to ease of recovery is discussed.

  8. Is Open Science the Future of Drug Development?

    PubMed

    Shaw, Daniel L

    2017-03-01

    Traditional drug development models are widely perceived as opaque and inefficient, with the cost of research and development continuing to rise even as production of new drugs stays constant. Searching for strategies to improve the drug discovery process, the biomedical research field has begun to embrace open strategies. The resulting changes are starting to reshape the industry. Open science-an umbrella term for diverse strategies that seek external input and public engagement-has become an essential tool for researchers, who are increasingly turning to collaboration, crowdsourcing, data sharing, and open sourcing to tackle some of the most pressing problems in medicine. Notable examples of such open drug development include initiatives formed around malaria and tropical disease. Open practices have found their way into the drug discovery process, from target identification and compound screening to clinical trials. This perspective argues that while open science poses some risks-which include the management of collaboration and the protection of proprietary data-these strategies are, in many cases, the more efficient and ethical way to conduct biomedical research.

  9. The Human Exposure Model (HEM): A Tool to Support Rapid Assessment of Human Health Impacts from Near-Field Consumer Product Exposures

    EPA Science Inventory

    The US EPA is developing an open and publicly available software program called the Human Exposure Model (HEM) to provide near-field exposure information for Life Cycle Impact Assessments (LCIAs). Historically, LCIAs have often omitted impacts from near-field sources of exposur...

  10. Places to Go: Sakai|http://www.sakaiproject.org/

    ERIC Educational Resources Information Center

    Downes, Stephen

    2006-01-01

    Stephen Downes continues his examination of open source learning management systems (LMSs) with a visit to Sakai's Web site. While Sakai's Web site is not particularly easy to navigate, it provides access to a large community and constellation of related online learning products and initiatives. Visitors can visit discussion forums to ask…

  11. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

    Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model, as well as the effect of patents.

  12. Affordable Open-Source Data Loggers for Distributed Measurements of Sap-Flux, Stem Growth, Relative Humidity, Temperature, and Soil Water Content

    NASA Astrophysics Data System (ADS)

    Anderson, T.; Jencso, K. G.; Hoylman, Z. H.; Hu, J.

    2015-12-01

    Characterizing the mechanisms that lead to differences in forest ecosystem productivity across complex terrain remains a challenge. This difficulty can be partially attributed to the cost of installing networks of proprietary data loggers that monitor differences in the biophysical factors contributing to tree growth. Here, we describe the development and initial application of a network of open source data loggers. These data loggers are based on the Arduino platform, but were refined into a custom printed circuit board (PCB). This reduced the cost and complexity of the data loggers, which made them cheap to reproduce and reliable enough to withstand the harsh environmental conditions experienced in Ecohydrology studies. We demonstrate the utility of these loggers for high frequency, spatially-distributed measurements of sap-flux, stem growth, relative humidity, temperature, and soil water content across 36 landscape positions in the Lubrecht Experimental Forest, MT, USA. This new data logging technology made it possible to develop a spatially distributed monitoring network within the constraints of our research budget and may provide new insights into factors affecting forest productivity across complex terrain.
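    As a hypothetical illustration of the kind of post-processing such a distributed network produces, the sketch below aggregates raw readings into per-site daily means. The CSV layout (site, date, variable, value) is an assumption for the example, not the actual output format of the loggers described above.

```python
import csv
import io
from collections import defaultdict

# Invented raw output from several loggers; real deployments would read files.
raw = """site,date,variable,value
A1,2015-07-01,soil_vwc,0.21
A1,2015-07-01,soil_vwc,0.23
B2,2015-07-01,soil_vwc,0.18
"""

# Accumulate (sum, count) per (site, date, variable) so means can be
# computed in one pass over arbitrarily many readings.
sums = defaultdict(lambda: [0.0, 0])
for row in csv.DictReader(io.StringIO(raw)):
    key = (row["site"], row["date"], row["variable"])
    sums[key][0] += float(row["value"])
    sums[key][1] += 1

daily_means = {k: s / n for k, (s, n) in sums.items()}
print(daily_means[("A1", "2015-07-01", "soil_vwc")])
```

    The single-pass sum/count accumulator keeps memory proportional to the number of (site, day, variable) groups rather than the number of readings, which matters for high-frequency loggers.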

  13. Case study of open-source enterprise resource planning implementation in a small business

    NASA Astrophysics Data System (ADS)

    Olson, David L.; Staley, Jesse

    2012-02-01

    Enterprise resource planning (ERP) systems have been recognised as offering great benefit to some organisations, although they are expensive and problematic to implement. The cost and risk make well-developed proprietorial systems unaffordable to small businesses. Open-source software (OSS) has become a viable means of producing ERP system products. The question this paper addresses is the feasibility of OSS ERP systems for small businesses. A case is reported involving two efforts to implement freely distributed ERP software products in a small US make-to-order engineering firm. The case emphasises the potential of freely distributed ERP systems, as well as some of the hurdles involved in their implementation. The paper briefly reviews highlights of OSS ERP systems, with the primary focus on reporting the case experiences for efforts to implement ERPLite software and xTuple software. While both systems worked from a technical perspective, both failed due to economic factors. While these economic conditions led to imperfect results, the case demonstrates the feasibility of OSS ERP for small businesses. Both experiences are evaluated in terms of risk dimension.

  14. Climate Change for Agriculture, Forest Cover and 3d Urban Models

    NASA Astrophysics Data System (ADS)

    Kapoor, M.; Bassir, D.

    2014-11-01

    This research demonstrates the important role of remote sensing in identifying the different parameters behind agricultural crop change, forest cover change, and urban 3D models. Standalone software was developed to view and analyse the different factors affecting changes in crop production. Open-source libraries from the Open Source Geospatial Foundation were used to develop the shapefile viewer. The software can be used to retrieve attribute information and to scale, zoom in/out, and pan shapefiles. Environmental changes due to pollution and population growth are increasing urbanisation and decreasing forest cover on the earth. Satellite imagery from Landsat 5 (1984) to Landsat 8 TIRS (2014), the Landsat Data Continuity Mission (LDCM), and NDVI are used to analyse the different parameters affecting changes in agricultural crop production and forest cover. For the development of good-quality NDVI and forest cover maps, it is advisable to use data collected with the same processing methods for the complete region. Management practices have been developed from the analysed data for the betterment of the crops and conservation of the forest cover.
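    NDVI, the vegetation index used in analyses like the one above, is a simple per-pixel band ratio: NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to +1. A minimal sketch with illustrative reflectance values (pure Python, no GIS libraries; real workflows would read the bands from Landsat imagery):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed per pixel:
    NDVI = (NIR - Red) / (NIR + Red)."""
    return [(n - r) / (n + r) for n, r in zip(nir, red)]

# Illustrative reflectance values for four pixels:
nir = [0.60, 0.55, 0.30, 0.10]   # near-infrared band
red = [0.10, 0.15, 0.20, 0.10]   # red band

values = ndvi(nir, red)
print(values)  # values near +1 indicate dense vegetation; near 0, bare soil/water
```

    Comparing NDVI maps computed this way from imagery of different dates is the usual basis for the crop and forest change detection the record describes.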

  15. Open cyberGIS software for geospatial research and education in the big data era

    NASA Astrophysics Data System (ADS)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

    CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new-generation GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) that serve the research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  16. Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.

    PubMed

    Benson, Tim

    2016-07-04

    Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose of this paper is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers, and how it can work as a business model in the health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.

  17. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
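    The Mission Support System described above is built on the OGC Web Map Service standard, in which a map image is requested through a URL with standardised query parameters. A minimal sketch of building such a GetMap request with only the Python standard library (the server URL and layer name are hypothetical placeholders, not the actual Mission Support System endpoint):

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, width=800, height=600):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min_y, min_x, max_y, max_x) in EPSG:4326 axis order.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical forecast layer over Europe:
url = getmap_url("http://example.org/wms", "forecast.temperature",
                 bbox=(40.0, -10.0, 60.0, 20.0))
print(url)
```

    Because every WMS server answers the same request grammar, a flight-planning client can overlay forecast layers from any standards-compliant source without custom per-server code.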

  18. Open Source Software Development

    DTIC Science & Technology

    2011-01-01

    Software, 2002, 149(1), 3-17. 3. DiBona, C., Cooper, D., and Stone, M. (Eds.), Open Sources 2.0, 2005, O’Reilly Media, Sebastopol, CA. Also see C. DiBona, S. Ockman, and M. Stone (Eds.), Open Sources: Voices from the Open Source Revolution, 1999, O’Reilly Media, Sebastopol, CA. 4. Ducheneaut, N

  19. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  20. Open-source hardware for medical devices.

    PubMed

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burwood, R.; Fortems, G.; Mycke, B.

    Deposited under lacustrine conditions during the rift-phase opening of the southern Atlantic, the lower Congo Bucomazi Formation is a highly productive source rock sequence. Reaching considerable thickness (1.8 km), a heterogeneous organofacies reflects both rapid accumulation and changing conditions during Early Cretaceous Barremian sedimentation. As a component of organofacies, low-resolution studies showed kerogen kinetic parameters (Ea/A) varied widely according to the gross paleoenvironmental conditions prevailing during deposition. As a general trend, refractory (type I, higher Ea) kerogens of the "basin fill" Organic Rich Zone (ORZ) give way to more labile (type II, lower Ea) assemblages in the up-section "sheet drape" sediments. At higher resolution, considerable fine structure in Ea fluctuation, presumably reflecting micropaleoenvironmental control, becomes evident. Using Ea values assembled for the Bucomazi type section, subsidence modeling for a Ponta Vermelha depocenter section showed a wide disparity in behavior. Being more representative of the sheet-drape episode, type II assemblages matured earlier, at lesser overburdens, and provided the initial hydrocarbon charge. For the ORZ assemblages, the dominant type I component was of retarded maturation, only becoming productive at commensurately greater overburdens. Cumulatively, these events merge to provide an extended period of hydrocarbon generation, with implications for production of aggregate oils of varied emplacement histories. Significantly, the net effect of the observed Ea contrast results in the less prolific (but more labile) uppermost Bucomazi assuming a more important charging role than the ORZ of superior source richness. The latter can only realize its full potential under the greatest overburdens attainable in the most subsident depocenters.

  2. Choosing Open Source ERP Systems: What Reasons Are There For Doing So?

    NASA Astrophysics Data System (ADS)

    Johansson, Björn; Sudzina, Frantisek

    Enterprise resource planning (ERP) systems attract considerable attention, as does open source software. The question is whether, and if so when, open source ERP systems will take off. The paper describes the status of open source ERP systems. Based on a literature review of ERP system selection criteria in Web of Science articles, it discusses reported reasons for choosing open source or proprietary ERP systems. Last but not least, the article presents some conclusions that could act as input for future research. The paper aims at building a foundation for the basic question: What are the reasons for an organization to adopt open source ERP systems?

  3. AquaCrop-OS: A tool for resilient management of land and water resources in agriculture

    NASA Astrophysics Data System (ADS)

    Foster, Timothy; Brozovic, Nicholas; Butler, Adrian P.; Neale, Christopher M. U.; Raes, Dirk; Steduto, Pasquale; Fereres, Elias; Hsiao, Theodore C.

    2017-04-01

    Water managers, researchers, and other decision makers worldwide are faced with the challenge of increasing food production under population growth, drought, and rising water scarcity. Crop simulation models are valuable tools in this effort, and, importantly, provide a means of rapidly quantifying crop yield response to water, climate, and field management practices. Here, we introduce a new open-source crop modelling tool called AquaCrop-OS (Foster et al., 2017), which extends the functionality of the globally used FAO AquaCrop model. Through case studies focused on groundwater-fed irrigation in the High Plains and Central Valley of California in the United States, we demonstrate how AquaCrop-OS can be used to understand the local biophysical, behavioural, and institutional drivers of water risks in agricultural production. Furthermore, we also illustrate how AquaCrop-OS can be combined effectively with hydrologic and economic models to support drought risk mitigation and decision-making around water resource management at a range of spatial and temporal scales, and highlight future plans for model development and training. T. Foster, et al. (2017) AquaCrop-OS: An open source version of FAO's crop water productivity model. Agricultural Water Management. 181: 18-22. http://dx.doi.org/10.1016/j.agwat.2016.11.015.

  4. Effect of enhanced UV-B radiation on pollen quantity, quality, and seed yield in Brassica rapa (Brassicaceae)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demchik, S.M.; Day, T.A.

    Three experiments examined the influence of ultraviolet-B radiation (UV-B; 280-320 nm) exposure on reproduction in Brassica rapa (Brassicaceae). Plants were grown in a greenhouse under three biologically effective UV-B levels that simulated either an ambient stratospheric ozone level (control), or 16% ("low enhanced") or 32% ("high enhanced") ozone depletion levels at Morgantown, WV, USA in mid-March. In the first experiment, pollen production and viability per flower were reduced by ~50% under both enhanced UV-B levels relative to ambient controls. While plants under high-enhanced UV-B produced over 40% more flowers than plants under the two lower UV-B treatments, whole-plant production of viable pollen was reduced under low-enhanced UV-B to 34% of ambient controls. In the second experiment, the influence of source-plant UV-B exposure on in vitro pollen germination and viability was examined. Germination of pollen from plants under both enhanced UV-B treatments was reduced from 65 to 18%. Viability of the pollen from plants grown under both enhanced UV-B treatments was reduced to a much lesser extent: only from ~43 to 22%. In the third experiment, pollen from the UV-B treatments was used to fertilize plants growing under ambient UV-B levels, and subsequent seed production and germination were assessed. Seed abortion rates were higher in plants pollinated with pollen from the enhanced UV-B treatments than from ambient UV-B. Despite this, seed yield (number and mass) per plant was similar, regardless of the UV-B exposure of their pollen source. Our findings demonstrate that enhanced UV-B levels associated with springtime ozone depletion events have the capacity to substantially reduce viable pollen production, and could ultimately reduce reproductive success of B. rapa. 37 refs., 4 figs., 2 tabs.

  5. Chemistry Based on Renewable Raw Materials: Perspectives for a Sugar Cane-Based Biorefinery

    PubMed Central

    Villela Filho, Murillo; Araujo, Carlos; Bonfá, Alfredo; Porto, Weber

    2011-01-01

    Carbohydrates are nowadays a very competitive feedstock for the chemical industry because their availability is compatible with world-scale chemical production and their price, based on the carbon content, is comparable to that of petrochemicals. At the same time, demand is rising for biobased products. Brazilian sugar cane is a competitive feedstock source that is opening the door to a wide range of bio-based products. This essay begins with the importance of the feedstock for the chemical industry and discusses developments in sugar cane processing that lead to low cost feedstocks. Thus, sugar cane enables a new chemical industry, as it delivers a competitive raw material and a source of energy. As a result, sugar mills are being transformed into sustainable biorefineries that fully exploit the potential of sugar cane. PMID:21637329

  6. Chemistry based on renewable raw materials: perspectives for a sugar cane-based biorefinery.

    PubMed

    Villela Filho, Murillo; Araujo, Carlos; Bonfá, Alfredo; Porto, Weber

    2011-01-01

    Carbohydrates are nowadays a very competitive feedstock for the chemical industry because their availability is compatible with world-scale chemical production and their price, based on the carbon content, is comparable to that of petrochemicals. At the same time, demand is rising for biobased products. Brazilian sugar cane is a competitive feedstock source that is opening the door to a wide range of bio-based products. This essay begins with the importance of the feedstock for the chemical industry and discusses developments in sugar cane processing that lead to low cost feedstocks. Thus, sugar cane enables a new chemical industry, as it delivers a competitive raw material and a source of energy. As a result, sugar mills are being transformed into sustainable biorefineries that fully exploit the potential of sugar cane.

  7. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    ERIC Educational Resources Information Center

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  8. Implementing Open Source Platform for Education Quality Enhancement in Primary Education: Indonesia Experience

    ERIC Educational Resources Information Center

    Kisworo, Marsudi Wahyu

    2016-01-01

    Information and Communication Technology (ICT)-supported learning using free and open source platform draws little attention as open source initiatives were focused in secondary or tertiary educations. This study investigates possibilities of ICT-supported learning using open source platform for primary educations. The data of this study is taken…

  9. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source code and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this research, the paper then addresses the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can efficiently and conveniently deploy a cloud for a university computer room, with stable performance and good functional value.

  10. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  11. The 2017 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year’s theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest. PMID:29118973

  12. The 2017 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year's theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest.

  13. KaBOB: ontology-based semantic integration of biomedical databases.

    PubMed

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrating it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats. 
KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for formal reasoning over a wealth of integrated biomedical data.
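    One of the integration processes named above, aggregating sets of identifiers that denote the same biomedical concept across data sources, can be sketched with a union-find structure. This is a conceptual illustration in plain Python, not KaBOB's actual code, and the gene identifiers below are hypothetical cross-references:

```python
class IdentifierPool:
    """Union-find over identifiers: record asserted cross-references
    ('these two IDs denote the same concept') and answer equivalence queries."""

    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

    def same_concept(self, a, b):
        return self.find(a) == self.find(b)

pool = IdentifierPool()
# Hypothetical mappings: one gene known under three naming schemes.
pool.union("HGNC:0001", "UniProt:P00001")
pool.union("UniProt:P00001", "NCBIGene:0001")
print(pool.same_concept("HGNC:0001", "NCBIGene:0001"))  # True, transitively
print(pool.same_concept("HGNC:0001", "HGNC:0002"))      # False, never linked
```

    Queries phrased against the merged concept then reach records from every source that contributed an identifier to the set, which is the behaviour the abstract describes for querying in terms of biomedical concepts rather than source-specific schemas.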

  14. A GeoNode-Based Multiscale Platform For Management, Visualization And Integration Of DInSAR Data With Different Geospatial Information Sources

    NASA Astrophysics Data System (ADS)

    Buonanno, Sabatino; Fusco, Adele; Zeni, Giovanni; Manunta, Michele; Lanari, Riccardo

    2017-04-01

    This work describes the implementation of an efficient system for managing, viewing, analyzing and updating remotely sensed data, with special reference to Differential Interferometric Synthetic Aperture Radar (DInSAR) data. DInSAR products measure Earth surface deformation in both space and time, producing deformation maps and time series [1,2]. The use of these data in research or operational contexts requires tools that handle temporal and spatial variability with high efficiency. To this end we present an implementation based on a Spatial Data Infrastructure (SDI) for data integration, management and interchange using standard protocols [3]. SDI tools provide access to static datasets that operate only with spatial variability. In this paper we use the open source project GeoNode as a framework to extend SDI functionality to ingest DInSAR deformation maps and deformation time series very efficiently. GeoNode makes it possible to realize a comprehensive and distributed infrastructure, following the standards of the Open Geospatial Consortium (OGC), for remote sensing data management, analysis and integration [4,5]. We explain the methodology used to manage the data complexity and data integration using the open source project GeoNode. The solution presented in this work for the ingestion of DInSAR products is a very promising starting point for future development of an OGC-compliant implementation of a semi-automatic remote sensing data processing chain. [1] Berardino, P., Fornaro, G., Lanari, R., & Sansosti, E. (2002). A new Algorithm for Surface Deformation Monitoring based on Small Baseline Differential SAR Interferograms. IEEE Transactions on Geoscience and Remote Sensing, 40, 11, pp. 2375-2383. [2] Lanari, R., F. Casu, M. Manzo, G. Zeni, P. Berardino, M. Manunta and A. Pepe (2007), An overview of the Small Baseline Subset Algorithm: a DInSAR Technique for Surface Deformation Analysis, Pure Appl. Geophys., 164, doi: 10.1007/s00024-007-0192-9. [3] Nebert, D.D. (ed). 2000. Developing Spatial Data Infrastructures: The SDI Cookbook. [4] GeoNode (www.geonode.org). [5] Kolodziej, K. (ed). 2004. OGC OpenGIS Web Map Server Cookbook. Open Geospatial Consortium, 1.0.2 edition.

  15. The 2015 Bioinformatics Open Source Conference (BOSC 2015).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica

    2016-02-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.

  16. Open Source, Openness, and Higher Education

    ERIC Educational Resources Information Center

    Wiley, David

    2006-01-01

    In this article David Wiley provides an overview of how the general expansion of open source software has affected the world of education in particular. In doing so, Wiley not only addresses the development of open source software applications for teachers and administrators, he also discusses how the fundamental philosophy of the open source…

  17. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…

  18. Open Babel: An open chemical toolbox

    PubMed Central

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
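    The design that lets a toolkit like Open Babel interconvert over 110 formats is routing every conversion through one internal molecule representation: each format then needs only a reader and a writer, rather than a pairwise converter for every format pair. A conceptual sketch of that hub-and-spoke pattern in plain Python (this is not Open Babel's actual API; the two toy "formats" below are invented for illustration):

```python
# Registry mapping each format name to its reader (format -> internal
# representation) and writer (internal representation -> format).
readers, writers = {}, {}

def register(fmt, reader, writer):
    readers[fmt], writers[fmt] = reader, writer

def convert(data, src, dst):
    """Convert data between any two registered formats via the hub."""
    internal = readers[src](data)
    return writers[dst](internal)

# Two toy formats for a list of atom symbols, purely illustrative:
register("csv", lambda s: s.split(","), lambda m: ",".join(m))
register("lines", lambda s: s.split("\n"), lambda m: "\n".join(m))

print(convert("C,C,O", "csv", "lines"))
```

    With N formats the hub design needs 2N reader/writer pairs instead of N*(N-1) direct converters, which is why adding one new format immediately makes it interconvertible with all the others.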

  19. NATURAL PRODUCTS: A CONTINUING SOURCE OF NOVEL DRUG LEADS

    PubMed Central

    Cragg, Gordon M.; Newman, David J.

    2013-01-01

    1. Background Nature has been a source of medicinal products for millennia, with many useful drugs developed from plant sources. Following discovery of the penicillins, drug discovery from microbial sources occurred, and diving techniques in the 1970s opened the seas. Combinatorial chemistry (late 1980s) shifted the focus of drug discovery efforts from Nature to the laboratory bench. 2. Scope of Review This review traces natural products drug discovery, outlining important drugs from natural sources that revolutionized treatment of serious diseases. It is clear Nature will continue to be a major source of new structural leads, and effective drug development depends on multidisciplinary collaborations. 3. Major Conclusions The explosion of genetic information led not only to novel screens, but the genetic techniques permitted the implementation of combinatorial biosynthetic technology and genome mining. The knowledge gained has allowed unknown molecules to be identified. These novel bioactive structures can be optimized by using combinatorial chemistry, generating new drug candidates for many diseases. 4. General Significance The advent of genetic techniques that permitted the isolation/expression of biosynthetic cassettes from microbes may well be the new frontier for natural products lead discovery. It is now apparent that biodiversity may be much greater in those organisms. The numbers of potential species involved in the microbial world are many orders of magnitude greater than those of plants and multi-celled animals. Coupling these numbers to the number of currently unexpressed biosynthetic clusters now identified (>10 per species), the potential of microbial diversity remains essentially untapped. PMID:23428572

  20. Recent progress in Open Data production and consumption - examples from a Governmental institute (SMHI) and a collaborative EU research project (SWITCH-ON)

    NASA Astrophysics Data System (ADS)

    Arheimer, Berit; Falkenroth, Esa

    2014-05-01

    The Swedish Meteorological and Hydrological Institute (SMHI) has a long tradition of both producing and consuming open data on a national, European and global scale. It also promotes community building among water scientists in Europe by participating in and initiating collaborative projects. This presentation will exemplify the contemporary European movement driven by the INSPIRE directive and the Open Data Strategy, by showing the progress in openness and the shift in attitudes during the last decade when handling Research Data and Public Sector Information at a national European institute. Moreover, the presentation will inform about a recently started collaborative project (EU FP7 project No 603587) coordinated by SMHI and called SWITCH-ON http://water-switch-on.eu/. The project addresses water concerns and the currently untapped potential of open data for improved water management across the EU. The overall goal of the project is to make use of open data, and add value to society by repurposing and refining data from various sources. SWITCH-ON will establish new forms of water research and facilitate the development of new products and services based on principles of sharing and community building in the water society. The SWITCH-ON objectives are to use open data for implementing: 1) an innovative spatial information platform with open data tailored for direct water assessments, 2) an entirely new form of collaborative research for water-related sciences, 3) fourteen new operational products and services dedicated to appointed end-users, 4) new business and knowledge to inform individual and collective decisions in line with Europe's smart growth and environmental objectives. The presentation will discuss challenges, progress and opportunities with the open data strategy, based on the experiences of working both at a Governmental institute and being part of the global research community.

  1. Realizing the Living Paper using the ProvONE Model for Reproducible Research

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Jones, C. S.; Ludäscher, B.; Missier, P.; Walker, L.; Slaughter, P.; Schildhauer, M.; Cuevas-Vicenttín, V.

    2015-12-01

    Science has advanced through traditional publications that codify research results as a permanent part of the scientific record. But because publications are static and atomic, researchers can only cite and reference a whole work when building on the prior work of colleagues. The open source software model has demonstrated a new approach in which strong version control in an open environment can nurture an open ecosystem of software. Developers now commonly fork and extend software, giving proper credit, with less repetition, and with confidence in the relationship to the original software. Through initiatives like 'Beyond the PDF', an analogous model has been imagined for open science, in which software, data, analyses, and derived products become first-class objects within a publishing ecosystem that has evolved to be finer-grained and is realized through a web of linked open data. We have prototyped a Living Paper concept by developing the ProvONE provenance model for scientific workflows, with prototype deployments in DataONE. ProvONE promotes transparency and openness by describing the authenticity, origin, structure, and processing history of research artifacts and by detailing the steps in computational workflows that produce derived products. To realize the Living Paper, we decompose scientific papers into their constituent products and publish these as compound objects in the DataONE federation of archival repositories. Each individual finding and sub-product of a research project (such as a derived data table, a workflow or script, a figure, an image, or a finding) can be independently stored, versioned, and cited. ProvONE provenance traces link these fine-grained products within and across versions of a paper, and across related papers that extend an original analysis. This allows for open scientific publishing in which researchers extend and modify findings, creating a dynamic, evolving web of results that collectively represent the scientific enterprise.
The Living Paper provides detailed metadata for properly interpreting and verifying individual research findings, for tracing the origin of ideas, for launching new lines of inquiry, and for implementing transitive credit for research and engineering.
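    The derivation links this record describes can be illustrated with a minimal sketch: a dict-based graph of "wasDerivedFrom" edges and a traversal that recovers the full upstream lineage of an artifact. The artifact identifiers and the graph representation here are hypothetical illustrations, not the ProvONE schema itself.

    ```python
    # Hypothetical artifacts of a decomposed paper: a figure derived from a
    # table, which was produced by a script run on raw data; the script
    # itself is a new version of an earlier one.
    derived_from = {
        "figure1_v2": ["table_v2"],
        "table_v2": ["script_v2", "raw_data"],
        "script_v2": ["script_v1"],
    }

    def lineage(artifact, graph):
        """Return every upstream artifact reachable via derivation edges."""
        seen, stack = set(), [artifact]
        while stack:
            for parent in graph.get(stack.pop(), []):
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

    ancestors = lineage("figure1_v2", derived_from)
    ```

    Tracing the figure back through the table, script versions, and raw data is exactly the kind of fine-grained, cross-version query a provenance trace enables.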

  2. How Educational Ideas Catch On: The Promotion of Popular Education Innovations and the Role of Evidence

    ERIC Educational Resources Information Center

    Carrier, Nathalie

    2017-01-01

    Background: Within the wider education industry, a large quantity of ideas, practices and products are routinely promoted as useful innovations. These innovations span many forms such as software applications, open source courseware, online learning platforms and web 2.0 technologies. Coupled with this promotion, there is increasing interest and…

  3. A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code

    ERIC Educational Resources Information Center

    Fischer, Michael

    2011-01-01

    The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…

  4. A Methodological Approach to Support Collaborative Media Creation in an E-Learning Higher Education Context

    ERIC Educational Resources Information Center

    Ornellas, Adriana; Muñoz Carril, Pablo César

    2014-01-01

    This article outlines a methodological approach to the creation, production and dissemination of online collaborative audio-visual projects, using new social learning technologies and open-source video tools, which can be applied to any e-learning environment in higher education. The methodology was developed and used to design a course in the…

  5. Organizational Commitment of Employees of TV Production Center (Educational Television ETV) for Open Education Faculty, Anadolu University

    ERIC Educational Resources Information Center

    Gürses, Nedim; Demiray, Emine

    2009-01-01

    Distance education tends to model the same procedures as conventional education and teaching approaches. Formerly, printed material served as the primary source. However, thanks to developments in technology and the evolution of education, computerised information has made inroads into distance education programmes.…

  6. Self-fertility of a central Oregon source of ponderosa pine.

    Treesearch

    Frank C. Sorensen

    1970-01-01

    This report will describe the effect of self-, cross-, and open- or wind-pollination on seed and seedling production of 19 ponderosa pine (Pinus ponderosa Laws.) trees in the eastern foothills of the Cascade Mountains south of Bend, Oreg. The study is part of a continuing investigation of self-fertility in several conifers growing in the Pacific...

  7. New developments in cogeneration: opening remarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuster, C.N.

    1982-06-01

    Cogeneration is defined as total energy, that is, the multiple use of a single source of energy. Dual utilization of radiation in an ancient bath in Pompeii is perhaps the earliest such use. Because of PURPA in 1978, the development of small power production facilities and cogeneration is encouraged. A map shows the projected cogeneration facilities across the country in 1995.

  8. The 3R's of Solid Waste & the Population Factor for a Sustainable Planet.

    ERIC Educational Resources Information Center

    Wagner, Joan

    1995-01-01

    Opens with a brief history of human awareness of our effect upon the environment. Culminates with a discussion of a strategy to handle solid wastes. This plan includes the 3R's: (1) source reduction; (2) direct reuse of products; and (3) recycling. Also provides statistics on recycling practices of some countries. (ZWH)

  9. Design and construction of a first-generation high-throughput integrated robotic molecular biology platform for bioenergy applications

    USDA-ARS?s Scientific Manuscript database

    The molecular biological techniques for plasmid-based assembly and cloning of gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. These techniques involve the production of full-length cDNA libraries as a source of plasmid-based clones to expres...

  10. Organizational Commitment of Employees of TV Production Center (Educational Television ETV) for Open Education Facility, Anadolu University

    ERIC Educational Resources Information Center

    Gurses, Nedim; Demiray, Emine

    2009-01-01

    Distance education tends to model the same procedures as conventional education and teaching approaches. Formerly, printed material served as the primary source. However, thanks to developments in technology and the evolution of education, computerised information has made inroads into distance education programmes.…

  11. RdTools: An Open Source Python Library for PV Degradation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deceglie, Michael G; Jordan, Dirk; Nag, Ambarish

    RdTools is a set of Python tools for analysis of photovoltaic data. In particular, PV production data is evaluated over several years to obtain rates of performance degradation over time. RdTools can handle both high-frequency (hourly or better) and low-frequency (daily, weekly, etc.) datasets. Best results are obtained with higher-frequency data.
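    A year-on-year degradation estimate of the kind this record describes can be sketched generically: compare each daily value with the value one year later and take the median fractional change. This is an illustrative re-implementation using pandas, not the RdTools API; the function name and the synthetic data are invented.

    ```python
    import numpy as np
    import pandas as pd

    def yoy_degradation_rate(series: pd.Series) -> float:
        """Estimate an annual performance-loss rate (%/yr) year-on-year.

        Each daily value is compared with the value exactly 365 days later;
        the median fractional change is robust to seasonality and outliers.
        """
        yoy = (series.shift(-365) - series) / series  # fractional change over one year
        return 100 * yoy.median()                     # percent per year

    # Synthetic daily performance index declining at about -0.5 %/yr, plus noise.
    days = pd.date_range("2015-01-01", periods=4 * 365, freq="D")
    rng = np.random.default_rng(seed=0)
    decay = (1 - 0.005 / 365) ** np.arange(len(days))
    perf = pd.Series(decay + rng.normal(0, 1e-4, len(days)), index=days)

    rate = yoy_degradation_rate(perf)  # close to -0.5 %/yr
    ```

    Using the median of year-apart comparisons, rather than a single regression slope, is one common way such tools reduce sensitivity to seasonal cycles in production data.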

  12. Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on

    PubMed Central

    2011-01-01

    Background The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to development of many useful resources freely available to the chemistry community. PMID:21999342

  13. Open Genetic Code: on open source in the life sciences.

    PubMed

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first section discusses the greater flexibility in regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  14. The Open Source Teaching Project (OSTP): Research Note.

    ERIC Educational Resources Information Center

    Hirst, Tony

    The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…

  15. Free for All: Open Source Software

    ERIC Educational Resources Information Center

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  16. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  17. Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness

    ERIC Educational Resources Information Center

    Committee for Economic Development, 2006

    2006-01-01

    Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…

  18. The 2015 Bioinformatics Open Source Conference (BOSC 2015)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J. A.; Lapp, Hilmar

    2016-01-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included “Data Science;” “Standards and Interoperability;” “Open Science and Reproducibility;” “Translational Bioinformatics;” “Visualization;” and “Bioinformatics Open Source Project Updates”. In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled “Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community,” that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653

  19. The Irony of Iron – Biogenic Iron Oxides as an Iron Source to the Ocean

    PubMed Central

    Emerson, David

    2016-01-01

    Primary productivity in at least a third of the sunlit open ocean is thought to be iron-limited. Primary sources of dissolved iron (dFe) to the ocean are hydrothermal venting, flux from the sediments along continental margins, and airborne dust. This article provides a general review of sources of hydrothermal and sedimentary iron to the ocean, and speculates upon the role that iron-cycling microbes play in controlling iron dynamics from these sources. Special attention is paid to iron-oxidizing bacteria (FeOB) that live by oxidizing iron and producing biogenic iron oxides as waste products. The presence and ubiquity of FeOB both at hydrothermal systems and in sediments is only beginning to be appreciated. The biogenic oxides they produce have unique properties that could contribute significantly to the dynamics of dFe in the ocean. Changes in the physical and chemical characteristics of the ocean due to climate change and ocean acidification will undoubtedly impact the microbial iron cycle. A better understanding of the contemporary role of microbes in the iron cycle will help in predicting how these changes could ultimately influence marine primary productivity. PMID:26779157

  20. A simple object-oriented and open-source model for scientific and policy analyses of the global climate system – Hector v1.0

    DOE PAGES

    Hartin, Corinne A.; Patel, Pralit L.; Schwarber, Adria; ...

    2015-04-01

    Simple climate models play an integral role in the policy and scientific communities. They are used for climate mitigation scenarios within integrated assessment models, complex climate model emulation, and uncertainty analyses. Here we describe Hector v1.0, an open source, object-oriented, simple global climate carbon-cycle model. This model runs essentially instantaneously while still representing the most critical global-scale earth system processes. Hector has a three-part main carbon cycle: a one-pool atmosphere, land, and ocean. The model's terrestrial carbon cycle includes primary production and respiration fluxes, accommodating arbitrary geographic divisions into, e.g., ecological biomes or political units. Hector actively solves the inorganic carbon system in the surface ocean, directly calculating air–sea fluxes of carbon and ocean pH. Hector reproduces the global historical trends of atmospheric [CO2], radiative forcing, and surface temperatures. The model simulates all four Representative Concentration Pathways (RCPs) with equivalent rates of change of key variables over time compared to current observations, MAGICC (a well-known simple climate model), and models from the 5th Coupled Model Intercomparison Project. Hector's flexibility, open-source nature, and modular design will facilitate a broad range of research in various areas.
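    The one-pool atmosphere described in this record can be illustrated with a toy Euler integration: a single carbon stock driven by emissions and drained by sinks proportional to the excess over a preindustrial baseline. All parameter values and names here are invented for illustration; they are not Hector's actual parameters or equations.

    ```python
    # Toy one-box carbon-cycle integrator in the spirit of a simple climate
    # model. Parameters are illustrative stand-ins, not Hector's.

    def integrate_co2(emissions, c0=590.0, uptake=0.02, dt=1.0):
        """Euler-step the atmospheric carbon stock (GtC).

        emissions : annual emissions (GtC/yr), one entry per time step
        c0        : preindustrial atmospheric carbon stock (GtC)
        uptake    : fraction of the excess over c0 absorbed each year
                    (a stand-in for explicit land and ocean fluxes)
        """
        c = c0
        trajectory = []
        for e in emissions:
            c += dt * (e - uptake * (c - c0))  # sources minus proportional sinks
            trajectory.append(c)
        return trajectory

    # Constant 10 GtC/yr emissions relax the stock toward a steady state
    # of c0 + 10 / uptake = 1090 GtC.
    traj = integrate_co2([10.0] * 100)
    ```

    The near-instantaneous runtime that makes simple climate models useful for uncertainty analysis comes from exactly this kind of low-dimensional box structure.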

  1. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    NASA Astrophysics Data System (ADS)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter, commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2m posting elevation model of the Arctic and an 8m posting reference elevation model for the Antarctic. When complete, these publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management challenges the team needed to solve in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  2. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    NASA Astrophysics Data System (ADS)

    Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

    We discuss in this paper the development, including the features and functionalities, of an open source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application that was developed by the Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application made use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damages to infrastructures. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front-end interface consisting of web maps and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or the "Geo-SAFER Mindanao" Program.

  3. The 2016 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.

  4. Beyond Open Source: According to Jim Hirsch, Open Technology, Not Open Source, Is the Wave of the Future

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    This article presents an interview with Jim Hirsch, an associate superintendent for technology at Plano Independent School District in Plano, Texas. Hirsch serves as a liaison for the open technologies committee of the Consortium for School Networking. In this interview, he shares his opinion on the significance of open source in K-12.

  5. EMISSIONS OF ORGANIC AIR TOXICS FROM OPEN ...

    EPA Pesticide Factsheets

    A detailed literature search was performed to collect and collate available data reporting emissions of toxic organic substances into the air from open burning sources. Availability of data varied according to the source and the class of air toxics of interest. Volatile organic compound (VOC) and polycyclic aromatic hydrocarbon (PAH) data were available for many of the sources. Data on semivolatile organic compounds (SVOCs) that are not PAHs were available for several sources. Carbonyl and polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofuran (PCDD/F) data were available for only a few sources. There were several sources for which no emissions data were available at all. Several observations were made including: 1) Biomass open burning sources typically emitted less VOCs than open burning sources with anthropogenic fuels on a mass emitted per mass burned basis, particularly those where polymers were concerned; 2) Biomass open burning sources typically emitted less SVOCs and PAHs than anthropogenic sources on a mass emitted per mass burned basis. Burning pools of crude oil and diesel fuel produced significant amounts of PAHs relative to other types of open burning. PAH emissions were highest when combustion of polymers was taking place; and 3) Based on very limited data, biomass open burning sources typically produced higher levels of carbonyls than anthropogenic sources on a mass emitted per mass burned basis, probably due to oxygenated structures r

  6. Single-Step Laser-Assisted Graphene Oxide Reduction and Nonlinear Optical Properties Exploration via CW Laser Excitation

    NASA Astrophysics Data System (ADS)

    Ghasemi, Fatemeh; Razi, Sepehr; Madanipour, Khosro

    2018-02-01

    The synthesis of reduced graphene oxide using pulsed laser irradiation is experimentally investigated. For this purpose, various irradiation conditions were selected and the chemical features of the different products were explored using ultraviolet-visible, Fourier transform infrared and Raman spectroscopy techniques. Moreover, the nonlinear optical properties of the synthesized products were assessed by using open- and closed-aperture Z-scan techniques, in which a continuous-wave laser operating at a 532-nm wavelength was utilized as the excitation source. The results clearly revealed that the degree of graphene oxide reduction depends not only on the irradiation dose (energy of the laser beam × exposure time) but also on the light source wavelength. Furthermore, a strong dependency between the nonlinear optical properties of the products and the degree of de-oxygenation was observed. The experimental results are discussed in detail.

  7. Is Open Science the Future of Drug Development?

    PubMed Central

    Shaw, Daniel L.

    2017-01-01

    Traditional drug development models are widely perceived as opaque and inefficient, with the cost of research and development continuing to rise even as production of new drugs stays constant. Searching for strategies to improve the drug discovery process, the biomedical research field has begun to embrace open strategies. The resulting changes are starting to reshape the industry. Open science—an umbrella term for diverse strategies that seek external input and public engagement—has become an essential tool with researchers, who are increasingly turning to collaboration, crowdsourcing, data sharing, and open sourcing to tackle some of the most pressing problems in medicine. Notable examples of such open drug development include initiatives formed around malaria and tropical disease. Open practices have found their way into the drug discovery process, from target identification and compound screening to clinical trials. This perspective argues that while open science poses some risks—which include the management of collaboration and the protection of proprietary data—these strategies are, in many cases, the more efficient and ethical way to conduct biomedical research. PMID:28356902

  8. Aerostat-Lofted Instrument Platform and Sampling Method for Determination of Emissions from Open Area Sources

    EPA Science Inventory

    Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...

  9. The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software

    PubMed Central

    Ackerman, Michael J.; Yoo, Terry S.

    2003-01-01

    From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278

  10. The 2016 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science. PMID:27781083

  11. Development of high-density helicon plasma sources and their applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shinohara, Shunjiro; Hada, Tohru; Motomura, Taisei

    2009-05-15

    We report on the development of unique, high-density helicon plasma sources and describe their applications. Characterization of one of the largest helicon plasma sources yet constructed is made. Scalings of the particle production efficiency are derived from various plasma production devices in open literature and our own data from long and short cylinder devices, i.e., high and low values of the aspect ratio A (the ratio of the axial length to the diameter), considering the power balance in the framework of a simple diffusion model. A high plasma production efficiency is demonstrated, and we clarify the structures of the excited waves in the low A region down to 0.075 (the large device diameter of 73.8 cm with the axial length as short as 5.5 cm). We describe the application to plasma propulsion using a new concept that employs no electrodes. A very small diameter (2.5 cm) helicon plasma with 10^13 cm^-3 density is produced, and the preliminary results of electromagnetic plasma acceleration are briefly described.

  12. Integrating global satellite-derived data products as a pre-analysis for hydrological modelling studies: a case study for the Red River Basin

    USDA-ARS?s Scientific Manuscript database

    With changes in weather patterns and intensifying anthropogenic water use, there is an increasing need for spatio-temporal information on water fluxes and stocks in river basins. The assortment of satellite-derived open-access information sources on rainfall (P) and land use / land cover (LULC) is c...

  13. Core Flight System (cFS) a Low Cost Solution for SmallSats

    NASA Technical Reports Server (NTRS)

    McComas, David; Strege, Susanne; Wilmot, Jonathan

    2015-01-01

    The cFS is an FSW product line that uses a layered architecture and compile-time configuration parameters, which make it portable and scalable across a wide range of platforms. The software layers that define the application run-time environment are now under a NASA-wide configuration control board, with the goal of sustaining an open-source application ecosystem.

  14. Three-Dimensional Printing of X-Ray Computed Tomography Datasets with Multiple Materials Using Open-Source Data Processing

    ERIC Educational Resources Information Center

    Sander, Ian M.; McGoldrick, Matthew T.; Helms, My N.; Betts, Aislinn; van Avermaete, Anthony; Owers, Elizabeth; Doney, Evan; Liepert, Taimi; Niebur, Glen; Liepert, Douglas; Leevy, W. Matthew

    2017-01-01

    Advances in three-dimensional (3D) printing allow for digital files to be turned into a "printed" physical product. For example, complex anatomical models derived from clinical or pre-clinical X-ray computed tomography (CT) data of patients or research specimens can be constructed using various printable materials. Although 3D printing…

  15. An open-source and low-cost monitoring system for precision enology.

    PubMed

    Di Gennaro, Salvatore Filippo; Matese, Alessandro; Mancin, Mirko; Primicerio, Jacopo; Palliotti, Alberto

    2014-12-05

    Winemaking is a dynamic process, where microbiological and chemical effects may strongly differentiate products from the same vineyard and even between wine vats. This high variability means an increase in work in terms of control and process management. The winemaking process therefore requires a site-specific approach in order to optimize cellar practices and quality management, suggesting a new concept of winemaking, identified as Precision Enology. The Institute of Biometeorology of the Italian National Research Council has developed a wireless monitoring system, consisting of a series of nodes integrated in barrel bungs with sensors for the measurement of wine physical and chemical parameters in the barrel. This paper describes an open-source evolution of the preliminary prototype, using Arduino-based technology. Results have shown good performance in terms of data transmission and accuracy, minimal size and power consumption. The system has been designed to create a low-cost product, which allows a remote and real-time control of wine evolution in each barrel, minimizing costs and time for sampling and laboratory analysis. The possibility of integrating any kind of sensors makes the system a flexible tool that can satisfy various monitoring needs.
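The node-based architecture described above, where each barrel bung carries sensors and reports readings for remote monitoring, can be sketched in a few lines. The parameter names, value ranges, and thresholds below are illustrative assumptions, not the published system's actual firmware or sensor set.

```python
from dataclasses import dataclass

# Hypothetical reading from one barrel-bung node; field names and the
# fermentation temperature window are illustrative assumptions only.
@dataclass
class BarrelReading:
    node_id: int
    temperature_c: float   # wine temperature in the barrel
    redox_mv: float        # oxidation-reduction potential

def out_of_range(reading, t_min=12.0, t_max=18.0):
    """Flag a reading whose temperature leaves an assumed target window."""
    return not (t_min <= reading.temperature_c <= t_max)

readings = [BarrelReading(1, 15.2, 210.0), BarrelReading(2, 21.7, 190.0)]
alerts = [r.node_id for r in readings if out_of_range(r)]
```

In a real deployment the per-barrel alerts would drive the remote, real-time control of wine evolution the abstract describes, replacing periodic manual sampling.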

  16. OpenMx: An Open Source Extended Structural Equation Modeling Framework

    ERIC Educational Resources Information Center

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-01-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…

  17. Building Energy Management Open Source Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, Saifur

    Funded by the U.S. Department of Energy in November 2013, the Building Energy Management Open Source Software (BEMOSS) platform was engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. According to the Energy Information Administration (EIA), small (5,000 square feet or smaller) and medium-sized (between 5,001 and 50,000 square feet) commercial buildings constitute about 95% of all commercial buildings in the U.S. These buildings typically do not have Building Automation Systems (BAS) to monitor and control building operation. While commercial BAS solutions exist, including those from Siemens, Honeywell, Johnson Controls and many more, they are not cost effective in the context of small- and medium-sized commercial buildings, and typically work only with specific controller products from the same company. BEMOSS targets small- and medium-sized commercial buildings to address this gap.

  18. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; ongoing education and training therefore play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have gained significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry, and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways.
    An online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide, and 15 interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with eleven sub-categories in total, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", and "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be complemented by proof of professional career paths and achievements, which requires a peer qualification evaluation. After a couple of years a recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support by a group of geospatial scientific institutions, to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.

  19. Totally Integrated Munitions Enterprise ''Affordable Munitions Production for the 21st Century''

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burleson, R.R.; Poggio, M.E.; Rosenberg, S.J.

    2000-09-13

    The U.S. Army faces several munitions manufacturing issues: downsizing of the organic production base, timely fielding of affordable smart munitions, and munitions replenishment during national emergencies. Totally Integrated Munitions Enterprise (TIME) is addressing these complex issues via the development and demonstration of an integrated enterprise. The enterprise will include the tools, network, and open modular architecture controllers to enable accelerated acquisition, shortened concept to volume production, lower life cycle costs, capture of critical manufacturing processes, and communication of process parameters between remote sites to rapidly spin-off production for replenishment by commercial sources. TIME addresses the enterprise as a system, integrating design, engineering, manufacturing, administration, and logistics.

  20. Totally Integrated Munitions Enterprise ''Affordable Munitions Production for the 21st Century''

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burleson, R.R.; Poggio, M.E.; Rosenberg, S.J.

    2000-07-14

    The U.S. Army faces several munitions manufacturing issues: downsizing of the organic production base, timely fielding of affordable smart munitions, and munitions replenishment during national emergencies. TIME is addressing these complex issues via the development and demonstration of an integrated enterprise. The enterprise will include the tools, network, and open modular architecture controller to enable accelerated acquisition, shortened concept to volume production, lower life cycle costs, capture of critical manufacturing processes, and communication of process parameters between remote sites to rapidly spin-off production for replenishment by commercial sources. TIME addresses the enterprise as a system, integrating design, engineering, manufacturing, administration, and logistics.

  1. Totally Integrated Munitions Enterprise ''Affordable Munitions Production for the 21st Century''

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burleson, R.R.; Poggio, M.E.; Rosenberg, S.J.

    2000-08-18

    The U.S. Army faces several munitions manufacturing issues: downsizing of the organic production base, timely fielding of affordable smart munitions, and munitions replenishment during national emergencies. Totally Integrated Munitions Enterprise (TIME) is addressing these complex issues via the development and demonstration of an integrated enterprise. The enterprise will include the tools, network, and open modular architecture controllers to enable accelerated acquisition, shortened concept to volume production, lower life cycle costs, capture of critical manufacturing processes, and communication of process parameters between remote sites to rapidly spin-off production for replenishment by commercial sources. TIME addresses the enterprise as a system, integrating design, engineering, manufacturing, administration, and logistics.

  2. ObsPy: Establishing and maintaining an open-source community package

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Barsch, R.

    2017-12-01

    Python's ecosystem has evolved into one of the most powerful and productive research environments across disciplines. ObsPy (https://obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge into that ecosystem for seismology. It does so by offering read and write support for essentially every commonly used data format in seismology; integrated access to the largest data centers, web services, and real-time data streams; a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality such as travel time calculations, geodetic functions, and data visualization. ObsPy has been in constant unfunded development for more than eight years and is developed and used by scientists around the world, with successful applications in all branches of seismology. By now around 70 people have directly contributed code to ObsPy, and we aim to make it a self-sustaining community project. This contribution focuses on several meta-aspects of open-source software in science, in particular how we experienced them. During the panel we would like to discuss obvious questions like long-term sustainability with very limited to no funding, insufficient computer science training in many sciences, and gaining hard scientific credit for software development, but also the following questions: How to best deal with the fact that a lot of scientific software is very specialized, and thus solves a complex problem but can only ever reach a limited pool of developers and users by virtue of being so specialized? The "many eyes on the code" approach to developing and improving open-source software therefore applies only in a limited fashion. An initial publication for a significant new scientific software package is fairly straightforward; how does one on-board and motivate potential new contributors when they can no longer be lured by a potential co-authorship?
    When is spending significant time and effort on reusable scientific open-source development a reasonable choice for young researchers? The effort of producing purpose-tailored code for a single application resulting in a scientific publication is significantly less than that of generalising the code and engineering it well enough that it can be used by others.

  3. Sustainability of Open-Source Software Organizations as Underpinning for Sustainable Interoperability on Large Scales

    NASA Astrophysics Data System (ADS)

    Fulker, D. W.; Gallagher, J. H. R.

    2015-12-01

    OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily deployed Web services. Compatible with the solutions listed in the (PA001) session description (federation, rigid standards, and brokering/mediation), the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. For example, in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software and thus has no software sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP's sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use as deeply cooperative community endeavors. Excellence: OPeNDAP has few peers in remote scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation.
In essence, provision of engineering expertise, via contracts and grants, is the economic engine. Hence sustainability, as needed to address global grand challenges in geoscience, depends on agencies' and others' abilities and willingness to offer grants and let contracts for continually upgrading open-source software from OPeNDAP and others.
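The O(N+M) reduction mentioned above, N format readers plus M response writers sharing a common intermediate representation instead of N×M pairwise converters, can be illustrated with a minimal handler registry. The function and format names below are hypothetical illustrations, not Hyrax's actual API.

```python
# A common intermediate representation (here a plain dict) lets readers and
# writers be written independently: adding a datatype or a response kind
# costs one handler, not a full row or column of converters.
readers = {}   # datatype -> function returning the common representation
writers = {}   # response kind -> function consuming the common representation

def reader(fmt):
    def register(fn):
        readers[fmt] = fn
        return fn
    return register

def writer(kind):
    def register(fn):
        writers[kind] = fn
        return fn
    return register

@reader("netcdf")
def read_netcdf(path):
    # Stand-in for a real format parser.
    return {"source": path, "values": [1.0, 2.0, 3.0]}

@writer("ascii")
def write_ascii(data):
    return ",".join(str(v) for v in data["values"])

def respond(path, fmt, kind):
    """Serve one request: read in any format, write in any response kind."""
    return writers[kind](readers[fmt](path))
```

Any of the N readers composes with any of the M writers through `respond`, which is the essence of the handler-style architecture the abstract credits for the O(N+M) cost.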

  4. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g., Orfeo ToolBox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the developer-oriented source code and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F.
Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number: 312972 ) (RASOR: www.rasor-project.eu, grant number: 606888 )

  5. [Example of product development by industry and research solidarity].

    PubMed

    Seki, Masayoshi

    2014-01-01

    When industrial firms develop a product, using research results from research institutions and reflecting users' ideas in the product can significantly improve it. This paper takes a jointly developed software product as an example to describe the adopted development technique and its results, and to consider, from the company's perspective, how industry collaboration and joint development should be conducted. Software development methods each have merits and demerits, and it is necessary to choose the optimal technique for the system being developed. We jointly developed dose distribution browsing software, adopting the prototype model as the development method. To display dose distribution information, four objects (CT-Image, Structure Set, RT-Plan, and RT-Dose) must be loaded and displayed in a composite manner. The prototype model proved particularly well suited to developing this software. Because in a prototype model the detailed design is created from the program source code after the program is completed, it shortened the documentation period and kept design and implementation consistent. This software was eventually released to the public as open source, and the release version of the dose distribution browsing software was developed from the prototype. Developing this type of novel software normally takes two to three years, but joint development shortened the period to one year. The shortened development period kept the company's development cost to a minimum, which in turn is reflected in the product price.
    Specialists who make requests on the product from the user's point of view are important, and involving more such specialists as professionals in product development raises the prospect of developing a product that meets users' demands.

  6. The Case for Open Source: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…

  7. Interoperable Data Access Services for NOAA IOOS

    NASA Astrophysics Data System (ADS)

    de La Beaujardiere, J.

    2008-12-01

    The Integrated Ocean Observing System (IOOS) is intended to enhance our ability to collect, deliver, and use ocean information. The goal is to support research and decision-making by providing data on our open oceans, coastal waters, and Great Lakes in the formats, rates, and scales required by scientists, managers, businesses, governments, and the public. The US National Oceanic and Atmospheric Administration (NOAA) is the lead agency for IOOS. NOAA's IOOS office supports the development of regional coastal observing capability and promotes data management efforts to increase data accessibility. Geospatial web services have been established at NOAA data providers including the National Data Buoy Center (NDBC), the Center for Operational Oceanographic Products and Services (CO-OPS), and CoastWatch, and at regional data provider sites. Services established include the Open-source Project for a Network Data Access Protocol (OPeNDAP), the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), and the OGC Web Coverage Service (WCS). These services provide integrated access to data holdings that have been aggregated at each center from multiple sources. We wish to collaborate with other groups to improve our service offerings to maximize interoperability and enhance cross-provider data integration, and to share common service components such as registries, catalogs, data conversion, and gateways. This paper will discuss the current status of NOAA's IOOS efforts and possible next steps.
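A typical interaction with the SOS services mentioned above is a key-value-pair HTTP request. The sketch below assembles an OGC SOS 1.0.0 GetObservation URL; the endpoint, offering, and observed-property identifiers are plausible placeholders for illustration, not guaranteed live service values.

```python
from urllib.parse import urlencode

def sos_get_observation(endpoint, offering, observed_property,
                        response_format="text/xml"):
    """Build an OGC SOS 1.0.0 GetObservation request URL (KVP encoding)."""
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "responseFormat": response_format,
    }
    return endpoint + "?" + urlencode(params)

# Placeholder endpoint and identifiers, for illustration only.
url = sos_get_observation("https://example.org/sos/server.php",
                          "urn:ioos:station:wmo:41001",
                          "sea_water_temperature")
```

Because every provider exposes the same request grammar, one small client like this can pull observations from NDBC, CO-OPS, or a regional site interchangeably, which is the interoperability point of the paper.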

  8. Web Mapping Architectures Based on Open Specifications and Free and Open Source Software in the Water Domain

    NASA Astrophysics Data System (ADS)

    Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.

  9. Genomes to natural products PRediction Informatics for Secondary Metabolomes (PRISM)

    PubMed Central

    Skinnider, Michael A.; Dejong, Chris A.; Rees, Philip N.; Johnston, Chad W.; Li, Haoxin; Webster, Andrew L. H.; Wyatt, Morgan A.; Magarvey, Nathan A.

    2015-01-01

    Microbial natural products are an invaluable source of evolved bioactive small molecules and pharmaceutical agents. Next-generation and metagenomic sequencing indicates untapped genomic potential, yet high rediscovery rates of known metabolites increasingly frustrate conventional natural product screening programs. New methods to connect biosynthetic gene clusters to novel chemical scaffolds are therefore critical to enable the targeted discovery of genetically encoded natural products. Here, we present PRISM, a computational resource for the identification of biosynthetic gene clusters, prediction of genetically encoded nonribosomal peptides and type I and II polyketides, and bio- and cheminformatic dereplication of known natural products. PRISM implements novel algorithms which render it uniquely capable of predicting type II polyketides, deoxygenated sugars, and starter units, making it a comprehensive genome-guided chemical structure prediction engine. A library of 57 tailoring reactions is leveraged for combinatorial scaffold library generation when multiple potential substrates are consistent with biosynthetic logic. We compare the accuracy of PRISM to existing genomic analysis platforms. PRISM is an open-source, user-friendly web application available at http://magarveylab.ca/prism/. PMID:26442528
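The combinatorial scaffold-library step described above, enumerating every substrate choice consistent with biosynthetic logic when several candidates fit, is abstractly a Cartesian product over per-module candidate sets. The sketch below is a schematic illustration of that idea; the monomer names are placeholders and this is not PRISM's actual code.

```python
from itertools import product

# Each biosynthetic module may admit several candidate substrates
# (monomer names here are illustrative placeholders).
candidate_substrates = [
    ["valine", "isoleucine"],    # module 1: two plausible monomers
    ["serine"],                  # module 2: unambiguous
    ["proline", "pipecolate"],   # module 3: two plausible monomers
]

def scaffold_library(candidates):
    """Enumerate every scaffold consistent with the per-module candidates."""
    return ["-".join(combo) for combo in product(*candidates)]

library = scaffold_library(candidate_substrates)
# 2 * 1 * 2 = 4 candidate scaffolds for downstream dereplication.
```

The library size grows multiplicatively with ambiguity, which is why a curated set of tailoring reactions and biosynthetic constraints matters for keeping the enumeration tractable.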

  10. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL-packaged distribution or from source code at the SolTrace open source project website. The code uses a Monte-Carlo ray-tracing methodology.
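The Monte-Carlo idea behind such ray tracers, estimating optical performance by averaging over many randomly sampled rays, can be shown with a toy example: the fraction of uniformly sampled rays over a square aperture that strike an inscribed circular absorber. This is a deliberately simplified stand-in for real concentrating optics, not SolTrace's algorithm.

```python
import random

def hit_fraction(n_rays, seed=0):
    """Estimate the fraction of rays over the unit square that hit an
    inscribed circle of radius 0.5 (expected value: pi/4, about 0.785)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        # Sample a ray's intersection point with the aperture plane.
        x, y = rng.random(), rng.random()
        if (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.25:
            hits += 1
    return hits / n_rays
```

As in any Monte-Carlo ray trace, the estimate converges as the number of sampled rays grows, with statistical error shrinking roughly as 1/sqrt(n_rays).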

  11. When Free Isn't Free: The Realities of Running Open Source in School

    ERIC Educational Resources Information Center

    Derringer, Pam

    2009-01-01

    Despite the last few years' growth in awareness of open-source software in schools and the potential savings it represents, its widespread adoption is still hampered. Randy Orwin, technology director of the Bainbridge Island School District in Washington State and a strong open-source advocate, cautions that installing an open-source…

  12. Automating Mapping Production for the Enterprise: from Contract to Delivery

    NASA Astrophysics Data System (ADS)

    Uebbing, R.; Xie, C.; Beshah, B.; Welter, J.

    2012-07-01

    The ever increasing volume and quality of geospatial data has created new challenges for mapping companies. Due to increased image resolution, fusion of different data sources, and more frequent data update requirements, mapping production is forced to streamline the workflow to meet client deadlines. But data volume alone is not the only barrier to an efficient production workflow. Processing geospatial information traditionally uses domain- and vendor-specific applications that do not interface with each other, often leading to data duplication and thereby creating sources of error. It also creates isolation between different departments within a mapping company, resulting in additional communication barriers. North West Geomatics has designed and implemented a data-centric enterprise solution for the flight acquisition and production workflow to combat the above challenges. A central data repository has been deployed at the company, containing not only geospatial data in the strictest sense, such as images, vector layers and 3D point clouds, but also other information such as product specifications, client requirements, flight acquisition data, production resource usage and much more. As there is only one instance of the database, shared throughout the whole organization, it allows all employees, provided they have been granted the appropriate permission, to view the current status of any project through a graphical and table-based interface across its life cycle, from sales through flight acquisition, production, and product delivery. Not only can users track the progress and status of various workflow steps, but the system also allows users and applications to actively schedule or start specific production steps such as data ingestion and triangulation, with many other steps (orthorectification, mosaicking, accounting, etc.) in the planning stages.
    While the complete system is exposed to users through a web interface, thereby also allowing outside customers to view their data, much of the design and development focused on workflow automation, scalability and security. Ideally, users interact with the system to retrieve a specific project's status and summaries, while the workflow processes are triggered automatically by modeling their dependencies. The enterprise system is built using open source technologies (PostGIS, Hibernate, OpenLayers, GWT and others) and adheres to OGC web services for data delivery (WMS/WFS/WCS) to third-party applications.
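Triggering production steps automatically by modeling their dependencies, as described above, amounts to a topological walk over a task graph. The step names below are illustrative of a mapping workflow, not the company's actual job definitions.

```python
from graphlib import TopologicalSorter

# Illustrative production steps and their prerequisites: a step may start
# only once everything it depends on has finished.
dependencies = {
    "triangulation": {"ingestion"},
    "orthorectification": {"triangulation"},
    "mosaicking": {"orthorectification"},
    "delivery": {"mosaicking", "accounting"},
}

# static_order yields a sequence in which every step appears only after
# all of its prerequisites.
order = list(TopologicalSorter(dependencies).static_order())
```

In a live system each completed step would mark its node done and the scheduler would launch whichever successors just became ready, rather than computing the whole order up front, but the dependency model is the same.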

  13. A framework for air quality monitoring based on free public data and open source tools

    NASA Astrophysics Data System (ADS)

    Nikolov, Hristo; Borisova, Denitsa

    2014-10-01

    In recent years, the policy of providing Earth observation (EO) data and end products concerning air quality, especially in large urban areas, at no cost to researchers and SMEs has become more and more widely accepted by the space agencies (e.g., NASA, ESA). These EO data are complemented by an increasing amount of in-situ data, also provided at no cost, either by national authorities or from crowdsourced origins. This accessibility, together with the increased processing capabilities of free and open source software, is a prerequisite for the creation of a solid framework for air quality modeling in support of decision making at medium and large scales. An essential part of this framework is a web-based GIS mapping tool responsible for disseminating the generated output. In this research an attempt is made to establish a running framework based solely on openly accessible data on air quality and on a set of freely available software tools for processing and modeling, taking into account the present status quo in Bulgaria. Among the primary sources of data, especially for bigger urban areas, are the National Institute of Meteorology and Hydrology of Bulgaria (NIMH) and the National System for Environmental Monitoring managed by the Bulgarian Executive Environmental Agency (ExEA). Both authorities provide data on the concentration of several gases, including CO, CO2, NO2 and SO2, and of fine suspended dust (PM10, PM2.5) on a monthly (for some data, daily) basis. In the proposed framework these data will complement data from satellite-based sensors such as the OMI instrument aboard the EOS-Aura satellite and the TROPOMI instrument aboard the future ESA Sentinel-5P mission. An integral part of the framework is the up-to-date land use/land cover map provided by the EEA through the GIO Land CORINE initiative; this map is itself a product of EO data distributed at the European level.
    First and foremost, our effort is focused on providing the wider public living in urbanized areas with a reliable source of information on current air quality conditions. This information might also serve as an indicator of acid rain in agricultural areas close to industrial or power plants. Its availability on a regular basis makes such information a valuable source in case of man-made industrial disasters or incidents such as forest fires. A key issue in developing this framework is to ensure the delivery of reliable air quality data products at a larger scale than those available at the moment.

  14. Open Source Vision

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    Increasingly, colleges and universities are turning to open source as a way to meet their technology infrastructure and application needs. Open source has changed life for visionary CIOs and their campus communities nationwide. The author discusses what these technologists see as the benefits--and the considerations.

  15. 76 FR 34634 - Federal Acquisition Regulation; Prioritizing Sources of Supplies and Services for Use by the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-14

    ... contracts before commercial sources in the open market. The proposed rule amends FAR 8.002 as follows: The... requirements for supplies and services from commercial sources in the open market. The proposed FAR 8.004 would... subpart 8.6). (b) Commercial sources (including educational and non-profit institutions) in the open...

  16. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure to remotely browse, visualize, and analyze the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the ka-Map project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
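The standards-based access pattern described above means any client can retrieve map imagery through a plain HTTP request. A minimal sketch in Python of building an OGC WMS 1.1.1 GetMap request; the endpoint URL and layer name are hypothetical, not the actual TOPS service:

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512,
                   time=None, fmt="image/png"):
    """Build an OGC WMS 1.1.1 GetMap request URL for a gridded layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    if time:  # the WMS TIME dimension carries the temporal axis
        params["TIME"] = time
    return endpoint + "?" + urlencode(params)

# Hypothetical layer name and endpoint for illustration only.
url = wms_getmap_url("https://example.org/wms", "tops:gpp",
                     (-125.0, 32.0, -114.0, 42.0), time="2008-06-01")
```

The same URL works unchanged in web mapping clients, virtual globes, or desktop GIS packages, which is precisely the interoperability benefit the abstract describes.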

  17. Experimental Definition and Validation of Protein Coding Transcripts in Chlamydomonas reinhardtii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kourosh Salehi-Ashtiani; Jason A. Papin

Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Using Chlamydomonas reinhardtii as a model, we developed a systems-level methodology bridging metabolic network reconstruction with annotation and experimental verification of enzyme-encoding open reading frames. We reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. Our approach to generate a predictive metabolic model integrated with cloned open reading frames provides a cost-effective platform to generate metabolic engineering resources. While the generated resources are specific to algal systems, the approach we have developed is not specific to algae and can be readily expanded to other microbial systems as well as higher plants and animals.

  18. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    PubMed

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  19. Automated population of an i2b2 clinical data warehouse from an openEHR-based data repository.

    PubMed

    Haarbrandt, Birger; Tute, Erik; Marschollek, Michael

    2016-10-01

Detailed Clinical Model (DCM) approaches have recently seen wider adoption. More specifically, openEHR-based application systems are now used in production in several countries, serving diverse fields of application such as health information exchange, clinical registries and electronic medical record systems. However, approaches to efficiently provide openEHR data to researchers for secondary use have not yet been investigated or established. We developed an approach to automatically load openEHR data instances into the open source clinical data warehouse i2b2. We evaluated query capabilities and the performance of this approach in the context of the Hanover Medical School Translational Research Framework (HaMSTR), an openEHR-based data repository. Automated creation of i2b2 ontologies from archetypes and templates and the integration of openEHR data instances from 903 patients of a paediatric intensive care unit have been achieved. In total, it took an average of ∼2527 s to create 2,311,624 facts from 141,917 XML documents. Using the imported data, we conducted sample queries to compare the performance with two openEHR systems and to investigate whether this representation of data is feasible to support cohort identification and record-level data extraction. We found the automated population of an i2b2 clinical data warehouse to be a feasible approach to make openEHR data instances available for secondary use. Such an approach can facilitate timely provision of clinical data to researchers. It complements analytics based on the Archetype Query Language by allowing querying of both legacy clinical data sources and openEHR data instances at the same time, and by providing an easy-to-use query interface. However, due to different levels of expressiveness in the data models, not all semantics could be preserved during the ETL process. Copyright © 2016 Elsevier Inc. All rights reserved.
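The core ETL step described, flattening XML document instances into i2b2-style observation facts, can be sketched in miniature. Everything below (the XML layout, element names, and fact columns) is a heavily simplified, hypothetical stand-in for the real archetype-driven openEHR structures:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified openEHR-like XML export; real openEHR
# compositions are far richer and archetype-driven.
DOC = """<composition patient_id="42" time="2016-03-01T10:00:00">
  <observation code="heart_rate" value="118" unit="/min"/>
  <observation code="spo2" value="94" unit="%"/>
</composition>"""

def to_facts(xml_text):
    """Flatten one document into i2b2-style observation_fact rows:
    (patient_num, concept_cd, start_date, nval_num, units_cd)."""
    root = ET.fromstring(xml_text)
    pid = int(root.get("patient_id"))
    ts = root.get("time")
    return [(pid, obs.get("code"), ts, float(obs.get("value")),
             obs.get("unit")) for obs in root.iter("observation")]

facts = to_facts(DOC)
```

The abstract's caveat about lost semantics shows up even here: the flat fact rows keep codes and values but discard the nesting and archetype context of the source document.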

  20. Genetics and mapping of the R11 gene conferring resistance to recently emerged rust races, tightly linked to male fertility restoration, in sunflower (Helianthus annuus L.)

    USDA-ARS?s Scientific Manuscript database

    Sunflower oil is one of the major sources of edible oil. As the second largest hybrid crop in the world, hybrid sunflowers are developed by using the PET1 cytoplasmic male sterility system that contributes a 20% yield advantage over the open-pollinated varieties. However, sunflower production in Nor...

  1. Undergraduate Research Opportunities in OSS

    NASA Astrophysics Data System (ADS)

    Boldyreff, Cornelia; Capiluppi, Andrea; Knowles, Thomas; Munro, James

Using Open Source Software (OSS) in undergraduate teaching at universities is now commonplace. Students use OSS applications and systems in their courses on programming, operating systems, DBMS, and web development, to name but a few. Studying OSS projects from both a product and a process view also forms part of the software engineering curriculum at various universities. Many students have also taken part in OSS projects as developers.

  2. Using R to implement spatial analysis in open source environment

    NASA Astrophysics Data System (ADS)

    Shao, Yixi; Chen, Dong; Zhao, Bo

    2007-06-01

R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics which provides a wide variety of statistical and graphical techniques and is highly extensible. It plays an important role in spatial analysis within the Open Source environment. Implementing spatial analysis in the Open Source environment, which we call Open Source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the Open Source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an apt illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the Open Source GIS environment, evaluating the spatial correlation of land price and estimating it by Kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other Open Source software cooperate in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an Open Source GIS environment. In the end, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when large numbers of clients connect at the same time. Further work is therefore to organize the extensive packages or design normative packages, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.
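As a toy illustration of the spatial interpolation step: the sketch below uses inverse-distance weighting, a deliberately simpler method than the Kriging the authors applied, on invented land-price sample points:

```python
import math

def idw(samples, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from
    samples given as (xi, yi, value) triples."""
    num = den = 0.0
    for xi, yi, v in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # query point coincides with a sample
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Invented land-price samples (x, y, price)
pts = [(0, 0, 100.0), (1, 0, 120.0), (0, 1, 80.0)]
estimate = idw(pts, 0.5, 0.5)  # equidistant from all three samples
```

Unlike Kriging, IDW ignores the spatial correlation structure (the variogram) the authors estimated, which is exactly what R's geostatistics packages add over a sketch like this.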

  3. [The use of open source software in graphic anatomic reconstructions and in biomechanic simulations].

    PubMed

    Ciobanu, O

    2009-01-01

The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open source software was used to prepare digitized 2D images of tissue sections and to create 3D reconstructions from the segmented structures. Finally, the 3D images were used in open source software to perform biomechanical simulations. This study demonstrates the applicability and feasibility of currently available open source software for 3D reconstruction and biomechanical simulation. The use of open source software may improve the efficiency of investments in imaging technologies and in CAD/CAM technologies for implant and prosthesis fabrication, which otherwise require expensive specialized software.

  4. Framework for Assessing Biogenic CO2 Emissions from ...

    EPA Pesticide Factsheets

    This revision of the 2011 report, Accounting Framework for Biogenic CO2 Emissions from Stationary Sources, evaluates biogenic CO2 emissions from stationary sources, including a detailed study of the scientific and technical issues associated with assessing biogenic carbon dioxide emissions from stationary sources. EPA developed the revised report, Framework for Assessing Biogenic CO2 Emissions from Stationary Sources, to present a methodological framework for assessing the extent to which the production, processing, and use of biogenic material at stationary sources for energy production results in a net atmospheric contribution of biogenic CO2 emissions. Biogenic carbon dioxide emissions are defined as CO2 emissions related to the natural carbon cycle, as well as those resulting from the production, harvest, combustion, digestion, decomposition, and processing of biologically-based materials. The EPA is continuing to refine its technical assessment of biogenic CO2 emissions through another round of targeted peer review of the revised study with the EPA Science Advisory Board (SAB). This study was submitted to the SAB's Biogenic Carbon Emissions Panel in February 2015. http://yosemite.epa.gov/sab/sabproduct.nsf/0/3235dac747c16fe985257da90053f252!OpenDocument&TableRow=2.2#2 The revised report will inform efforts by policymakers, academics, and other stakeholders to evaluate the technical aspects related to assessments of biogenic feedstocks used for energy at s

  5. Open access resources for genome-wide association mapping in rice

    PubMed Central

    McCouch, Susan R.; Wright, Mark H.; Tung, Chih-Wei; Maron, Lyza G.; McNally, Kenneth L.; Fitzgerald, Melissa; Singh, Namrata; DeClerck, Genevieve; Agosto-Perez, Francisco; Korniliev, Pavel; Greenberg, Anthony J.; Naredo, Ma. Elizabeth B.; Mercado, Sheila Mae Q.; Harrington, Sandra E.; Shi, Yuxin; Branchini, Darcy A.; Kuser-Falcão, Paula R.; Leung, Hei; Ebana, Kowaru; Yano, Masahiro; Eizenga, Georgia; McClung, Anna; Mezey, Jason

    2016-01-01

    Increasing food production is essential to meet the demands of a growing human population, with its rising income levels and nutritional expectations. To address the demand, plant breeders seek new sources of genetic variation to enhance the productivity, sustainability and resilience of crop varieties. Here we launch a high-resolution, open-access research platform to facilitate genome-wide association mapping in rice, a staple food crop. The platform provides an immortal collection of diverse germplasm, a high-density single-nucleotide polymorphism data set tailored for gene discovery, well-documented analytical strategies, and a suite of bioinformatics resources to facilitate biological interpretation. Using grain length, we demonstrate the power and resolution of our new high-density rice array, the accompanying genotypic data set, and an expanded diversity panel for detecting major and minor effect QTLs and subpopulation-specific alleles, with immediate implications for rice improvement. PMID:26842267

  6. Rapid development of medical imaging tools with open-source libraries.

    PubMed

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  7. Software LS-MIDA for efficient mass isotopomer distribution analysis in metabolic modelling.

    PubMed

    Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eisenreich, Wolfgang; Dandekar, Thomas

    2013-07-09

The knowledge of metabolic pathways and fluxes is important to understand the adaptation of organisms to their biotic and abiotic environment. The specific distribution of stable isotope labelled precursors into metabolic products can be taken as a fingerprint of the metabolic events and dynamics through the metabolic networks. Open-source software is required that easily and rapidly calculates global isotope excess and isotopomer distributions from mass spectra of labelled metabolites, derivatives and their fragments. The open-source software "Least Square Mass Isotopomer Analyzer" (LS-MIDA) is presented, which processes experimental mass spectrometry (MS) data on the basis of metabolite information such as the number of atoms in the compound, mass to charge ratio (m/e or m/z) values of the compounds and fragments under study, and the experimental relative MS intensities reflecting the enrichments of isotopomers in 13C- or 15N-labelled compounds, in comparison to the natural abundances in the unlabelled molecules. The software uses Brauman's least squares method of linear regression. As a result, global isotope enrichments of the metabolite or fragment under study and the molar abundances of each isotopomer are obtained and displayed. The new software provides an open-source platform that easily and rapidly converts experimental MS patterns of labelled metabolites into isotopomer enrichments, which are the basis for subsequent observation-driven analysis of pathways and fluxes, as well as for model-driven metabolic flux calculations.
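The least-squares core of such an analysis, expressing an observed spectrum as a linear mix of known isotopomer patterns, can be sketched as follows. The patterns and intensities are invented, and a real tool like LS-MIDA handles natural-abundance corrections and many more isotopomers:

```python
def lstsq2(A, b):
    """Least-squares solve for two coefficients via the normal
    equations A^T A x = A^T b (A is a list of [a1, a2] rows)."""
    s11 = sum(r[0] * r[0] for r in A)
    s12 = sum(r[0] * r[1] for r in A)
    s22 = sum(r[1] * r[1] for r in A)
    t1 = sum(r[0] * bi for r, bi in zip(A, b))
    t2 = sum(r[1] * bi for r, bi in zip(A, b))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)

# Invented example: observed intensities at m, m+1, m+2 modelled as a
# mix of an unlabelled pattern and a 13C-labelled pattern (columns of A).
A = [[0.90, 0.00],   # m
     [0.10, 0.05],   # m+1
     [0.00, 0.95]]   # m+2
obs = [0.63, 0.085, 0.285]  # a 70%/30% mixture of the two columns
frac_unlabelled, frac_labelled = lstsq2(A, obs)
```

With real, noisy MS intensities the normal equations return the best-fit mixture rather than an exact one, which is the point of using least squares here.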

  8. An Investigation of the Radiative Effects and Climate Feedbacks of Sea Ice Sources of Sea Salt Aerosol

    NASA Astrophysics Data System (ADS)

    Horowitz, H. M.; Alexander, B.; Bitz, C. M.; Jaegle, L.; Burrows, S. M.

    2017-12-01

    In polar regions, sea ice is a major source of sea salt aerosol through lofting of saline frost flowers or blowing saline snow from the sea ice surface. Under continued climate warming, an ice-free Arctic in summer with only first-year, more saline sea ice in winter is likely. Previous work has focused on climate impacts in summer from increasing open ocean sea salt aerosol emissions following complete sea ice loss in the Arctic, with conflicting results suggesting no net radiative effect or a negative climate feedback resulting from a strong first aerosol indirect effect. However, the radiative forcing from changes to the sea ice sources of sea salt aerosol in a future, warmer climate has not previously been explored. Understanding how sea ice loss affects the Arctic climate system requires investigating both open-ocean and sea ice sources of sea-salt aerosol and their potential interactions. Here, we implement a blowing snow source of sea salt aerosol into the Community Earth System Model (CESM) dynamically coupled to the latest version of the Los Alamos sea ice model (CICE5). Snow salinity is a key parameter affecting blowing snow sea salt emissions and previous work has assumed constant regional snow salinity over sea ice. We develop a parameterization for dynamic snow salinity in the sea ice model and examine how its spatial and temporal variability impacts the production of sea salt from blowing snow. We evaluate and constrain the snow salinity parameterization using available observations. Present-day coupled CESM-CICE5 simulations of sea salt aerosol concentrations including sea ice sources are evaluated against in situ and satellite (CALIOP) observations in polar regions. We then quantify the present-day radiative forcing from the addition of blowing snow sea salt aerosol with respect to aerosol-radiation and aerosol-cloud interactions. The relative contributions of sea ice vs. 
open ocean sources of sea salt aerosol to radiative forcing in polar regions are discussed.

  9. Toward Knowledge Systems for Sustainability Science

    NASA Astrophysics Data System (ADS)

    Zaks, D. P.; Jahn, M.

    2011-12-01

    Managing ecosystems for the outcomes of agricultural productivity and resilience will require fundamentally different knowledge management systems. In the industrial paradigm of the 20th century, land was considered an open, unconstrained system managed for maximum yield. While dramatic increases in yield occurred in some crops and locations, unintended but often foreseeable consequences emerged. While productivity remains a key objective, we must develop analytic systems that can identify better management options for the full range of monetized and non-monetized inputs, outputs and outcomes that are captured in the following framing question: How much valued service (e.g. food, materials, energy) can we draw from a landscape while maintaining adequate levels of other valued or necessary services (e.g. biodiversity, water, climate regulation, cultural services) including the long-term productivity of the land? This question is placed within our contemporary framing of valued services, but structured to illuminate the shifts required to achieve long-term sufficiency and planetary resilience. This framing also highlights the need for fundamentally new knowledge systems including information management infrastructures, which effectively support decision-making on landscapes. The purpose of this initiative by authors from diverse fields across government and academic science is to call attention to the need for a vision and investment in sustainability science for landscape management. Substantially enhanced capabilities are needed to compare and integrate information from diverse sources, collected over time that link choices made to meet our needs from landscapes to both short and long term consequences. 
To further the goal of an information infrastructure for sustainability science, three distinct but interlocking domains are best distinguished: 1) a domain of data, information and knowledge assets; 2) a domain that houses relevant models and tools in a curated space; and 3) a domain that includes decision support tools and systems tailored to frame particular trade-offs, which may focus on inputs or outputs and may range in scale from local to global. An information infrastructure for sustainability science is best built and maintained as a modular, open source, open standard, open access, open content platform. We have defined the scope of this challenge, managing choices within agroecosystems, recognizing that any decision on a landscape involves multidimensional tradeoffs. Addressing this challenge will require a cohesive, coherent and targeted approach: an integrated knowledge management infrastructure for sustainability science applied to land management is essential to move more rapidly toward sustainable, productive, and resilient landscapes.

  10. Acquiring Data by Mining the Past: Pairing Communities with Environmental Monitoring Methods through Open Online Collaborative Replication

    NASA Astrophysics Data System (ADS)

    Lippincott, M.; Lewis, E. S.; Gehrke, G. E.; Wise, A.; Pyle, S.; Sinatra, V.; Bland, G.; Bydlowski, D.; Henry, A.; Gilberts, P. A.

    2016-12-01

    Community groups are interested in low-cost sensors to monitor their environment. However, many new commercial sensors are unknown devices without peer-reviewed evaluations of data quality or pathways to regulatory acceptance, and the time to achieve these outcomes may be beyond a community's patience and attention. Rather than developing a device from scratch or validating a new commercial product, a workflow is presented whereby existing technologies, especially those that are out of patent, are replicated through open online collaboration between communities affected by environmental pollution, volunteers, academic institutions, and existing open hardware and open source software projects. Technology case studies will be presented, focusing primarily on a passive PM monitor based on the UNC Passive Monitor. Stages of the project will be detailed moving from identifying community needs, reviewing existing technology, partnership development, technology replication, IP review and licensing, data quality assurance (in process), and field evaluation with community partners (in process), with special attention to partnership development and technology review. We have leveraged open hardware and open source software to lower the cost and access barriers of existing technologies for PM10-2.5 and other atmospheric measures that have already been validated through peer review. Existing validation of and regulatory familiarity with a technology enables a rapid pathway towards collecting data, shortening the time it takes for communities to leverage data in environmental management decisions. Online collaboration requires rigorous documentation that aids in spreading research methods and promoting deep engagement by interested community researchers outside academia. 
At the same time, careful choice of technology and the use of small-scale fabrication through laser cutting, 3D printing, and open, shared repositories of plans and software enables educational engagement that broadens a project's reach.

  11. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    PubMed

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR achieve better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
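The TOPSIS half of the method can be sketched as follows; the criteria, weights, and package scores below are invented for illustration and are not the study's actual evaluation data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS. matrix[i][j] is the score of
    alternative i on criterion j; benefit[j] is True if larger is
    better. Returns closeness scores in [0, 1], higher is better."""
    m, n = len(matrix), len(matrix[0])
    # 1. Vector-normalise each criterion column, then apply weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
             for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    # 2. Ideal and anti-ideal points per criterion.
    best = [max(V[i][j] for i in range(m)) if benefit[j]
            else min(V[i][j] for i in range(m)) for j in range(n)]
    worst = [min(V[i][j] for i in range(m)) if benefit[j]
             else max(V[i][j] for i in range(m)) for j in range(n)]
    # 3. Relative closeness to the ideal solution.
    scores = []
    for i in range(m):
        dp = math.sqrt(sum((V[i][j] - best[j]) ** 2 for j in range(n)))
        dm = math.sqrt(sum((V[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(dm / (dp + dm))
    return scores

# Invented scores for three EMR packages on (usability, cost).
scores = topsis([[8, 3], [6, 2], [9, 5]], [0.6, 0.4], [True, False])
```

In the integrated method the weights fed to TOPSIS come from AHP pairwise comparisons rather than being assigned directly as in this sketch.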

  12. Zero-power autonomous buoyancy system controlled by microbial gas production

    NASA Astrophysics Data System (ADS)

    Wu, Peter K.; Fitzgerald, Lisa A.; Biffinger, Justin C.; Spargo, Barry J.; Houston, Brian H.; Bucaro, Joseph A.; Ringeisen, Bradley R.

    2011-05-01

    A zero-power ballast control system that could be used to float and submerge a device solely using a gas source was built and tested. This system could be used to convey sensors, data loggers, and communication devices necessary for water quality monitoring and other applications by periodically maneuvering up and down a water column. Operational parameters for the system such as duration of the submerged and buoyant states can be varied according to its design. The gas source can be of any origin, e.g., compressed air, underwater gas vent, gas produced by microbes, etc. The zero-power ballast system was initially tested using a gas pump and further tested using gas produced by Clostridium acetobutylicum. Using microbial gas production as the only source of gas and no electrical power during operation, the system successfully floated and submerged periodically with a period of 30 min for at least 24 h. Together with microbial fuel cells, this system opens up possibilities for underwater monitoring systems that could function indefinitely.

  13. Sharing Water-related Information to Tackle Changes in the Hydrosphere - for Operational Needs (SWITCH-ON)

    NASA Astrophysics Data System (ADS)

    Arheimer, Berit

    2015-04-01

    Recently, a collaborative EU project started called SWITCH-ON (EU FP7 project No 603587) coordinated by SMHI to support the INSPIRE directive and the Open Data Strategy. The overall goal of the project is to establish a "one-stop-shop" web portal for easy access to European water information. The project will use open data, provide infrastructure for sharing and collaboration, and add value to society and research by repurposing and refining data from various sources. The SWITCH-ON project http://www.water-switch-on.eu/ will establish new forms of water research and facilitate the development of new products and services based on principles of sharing and community building in the water society. The SWITCH-ON objectives are to use open data for implementing: 1) an innovative spatial information platform (SIP) to find, bind, transform and publish data, 2) entirely new forms of collaborative research organised in a Virtual Water-Science Laboratory, open for any research group, 3) fourteen new operational products for water management and awareness, 4) outreach facilities for new water business and knowledge in line with the Europe's smart growth and environmental objectives. This poster will describe the overall project goals and especially the recent progress on developing a Virtual Water-Science Laboratory. Contact: waterswitchon@gmail.com

  14. Opening Pandora's Box: The impact of open system modeling on interpretations of anoxia

    NASA Astrophysics Data System (ADS)

    Hotinski, Roberta M.; Kump, Lee R.; Najjar, Raymond G.

    2000-06-01

    The geologic record preserves evidence that vast regions of ancient oceans were once anoxic, with oxygen levels too low to sustain animal life. Because anoxic conditions have been postulated to foster deposition of petroleum source rocks and have been implicated as a kill mechanism in extinction events, the genesis of such anoxia has been an area of intense study. Most previous models of ocean oxygen cycling proposed, however, have either been qualitative or used closed-system approaches. We reexamine the question of anoxia in open-system box models in order to test the applicability of closed-system results over long timescales and find that open and closed-system modeling results may differ significantly on both short and long timescales. We also compare a scenario with basinwide diffuse upwelling (a three-box model) to a model with upwelling concentrated in the Southern Ocean (a four-box model). While a three-box modeling approach shows that only changes in high-latitude convective mixing rate and character of deepwater sources are likely to cause anoxia, four-box model experiments indicate that slowing of thermohaline circulation, a reduction in wind-driven upwelling, and changes in high-latitude export production may also cause dysoxia or anoxia in part of the deep ocean on long timescales. These results suggest that box models must capture the open-system and vertically stratified nature of the ocean to allow meaningful interpretations of long-lived episodes of anoxia.
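The open-system logic can be made concrete with a toy two-box steady state: a surface box sets the oxygen content of water supplied to a deep box by overturning, while respiration of exported organic matter consumes oxygen there. This is not the authors' three- or four-box model, and all numbers are invented:

```python
def deep_o2_steady(o2_surf, overturning, respiration):
    """Steady-state deep-ocean O2 (mol m^-3) in an open two-box model.
    Overturning (m^3 s^-1) imports surface water carrying o2_surf and
    exports deep water; respiration (mol s^-1) consumes O2 at depth.
    The mass balance Q*o2_surf - Q*o2_deep - R = 0 solves to the
    expression below; a negative value means the deep box goes anoxic."""
    return o2_surf - respiration / overturning

SV = 1e6  # one Sverdrup in m^3/s
modern = deep_o2_steady(0.30, 20 * SV, 4.0e6)
sluggish = deep_o2_steady(0.30, 10 * SV, 4.0e6)  # halved overturning
```

Even this caricature reproduces the qualitative result discussed in the abstract: slowing the thermohaline circulation while holding export production fixed drives the deep box toward anoxia.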

  15. The validity of open-source data when assessing jail suicides.

    PubMed

    Thomas, Amanda L; Scott, Jacqueline; Mellow, Jeff

    2018-05-09

    The Bureau of Justice Statistics' Deaths in Custody Reporting Program (DCRP) is the primary source for jail suicide research, though the data are restricted from general dissemination. This study is the first to examine whether jail suicide data obtained from publicly available sources can help inform our understanding of this serious public health problem. Of the 304 suicides reported through the DCRP in 2009, roughly 56 percent (N = 170) were identified through the open-source search protocol. Each source was assessed based on how much information it provided on the incident and the types of variables available. A descriptive analysis was then conducted on the variables present in both data sources. The four variables present in each data source were: (1) demographic characteristics of the victim, (2) the location of occurrence within the facility, (3) the location of occurrence by state, and (4) the size of the facility. Findings demonstrate that the prevalence and correlates of jail suicides are extremely similar in both open-source and official data. Moreover, for almost every variable measured, open-source data captured as much information as official data did, if not more. Further, variables not found in official data were identified in the open-source database, giving researchers a more nuanced understanding of the situational characteristics of the event. This research supports the argument for including open-source data in jail suicide research, as it illustrates how open-source data can provide additional information not found in official data. In sum, this research is vital for suicide prevention, which may depend directly on the ability to manipulate environmental factors.

  16. Use of open source information and commercial satellite imagery for nuclear nonproliferation regime compliance verification by a community of academics

    NASA Astrophysics Data System (ADS)

    Solodov, Alexander

    The proliferation of nuclear weapons is a great threat to world peace and stability. The question of strengthening the nonproliferation regime has been open for a long time. In 1997 the International Atomic Energy Agency (IAEA) Board of Governors (BOG) adopted the Additional Safeguards Protocol. The purpose of the protocol is to enhance the IAEA's ability to detect undeclared production of fissile materials in member states. However, the IAEA does not always have sufficient human and financial resources to accomplish this task. Developed here is a concept for making use of the human and technical resources available in academia to enhance the IAEA's mission. The objective of this research was to study the feasibility of an academic community using commercially or publicly available sources of information and products to detect covert facilities and activities intended for the unlawful acquisition of fissile materials or production of nuclear weapons. In this study, the availability and use of commercial satellite imagery systems, commercial computer codes for satellite imagery analysis, the Comprehensive Test Ban Treaty (CTBT) verification International Monitoring System (IMS), publicly available information sources such as watchdog groups and press reports, and Customs Services information were explored. A system for integrating these data sources to form conclusions was also developed. The results showed that publicly and commercially available sources of information and data analysis can be a powerful tool in tracking violations of the international nuclear nonproliferation regime, and a framework for implementing these tools in the academic community was developed. As a result of this study, the formation of an International Nonproliferation Monitoring Academic Community (INMAC) is proposed. This would be an independent organization consisting of academics (faculty, staff and students) from both nuclear weapon states (NWS) and non-nuclear weapon states (NNWS). The community would analyze all types of unclassified, publicly and commercially available information to aid in the detection of violations of the nonproliferation regime, and would share its findings with the IAEA and the public. Because INMAC would be composed solely of members of the academic community, the organization would be well placed to avoid institutional bias in its investigations and reporting.

  17. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  18. Open Source 2010: Reflections on 2007

    ERIC Educational Resources Information Center

    Wheeler, Brad

    2007-01-01

    Colleges and universities and commercial firms have demonstrated great progress in realizing the vision proffered for "Open Source 2007," and 2010 will mark even greater progress. Although much work remains in refining open source for higher education applications, the signals are now clear: the collaborative development of software can provide…

  19. Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments

    ERIC Educational Resources Information Center

    Wang, Shuo; Wang, Jing; Gao, Yanjing

    2017-01-01

    An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…

  20. Creating Open Source Conversation

    ERIC Educational Resources Information Center

    Sheehan, Kate

    2009-01-01

    Darien Library, where the author serves as head of knowledge and learning services, launched a new website on September 1, 2008. The website is built with Drupal, an open source content management system (CMS). In this article, the author describes how she and her colleagues overhauled the library's website to provide an open source content…

  1. Integrating an Automatic Judge into an Open Source LMS

    ERIC Educational Resources Information Center

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. Although it was originally designed for programming competitions, Mooshak has also…

  2. 76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...

  3. The open-source movement: an introduction for forestry professionals

    Treesearch

    Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove

    2005-01-01

    In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....

  4. Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat.

    PubMed

    Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart

    2015-04-21

    Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has gained attention in recent years after proving clinically more effective than conventional DBS at controlling pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as the system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as the controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between the input and output sources, which allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation.
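    The closed-loop logic described above (read LFP samples, estimate ongoing oscillation power, gate stimulation on a threshold) can be sketched in a few lines. This is an illustrative plain-Python sketch, not the authors' Arduino firmware; the window length, threshold, and RMS power estimate are arbitrary choices for demonstration.

```python
# Sketch of threshold-gated closed-loop stimulation: stimulate only while
# a moving RMS estimate of LFP power exceeds a threshold. Hypothetical
# parameters; not the actual controller from the paper.
from collections import deque
import math

class ClosedLoopController:
    def __init__(self, window=50, threshold=0.5):
        self.samples = deque(maxlen=window)  # recent LFP samples
        self.threshold = threshold           # RMS power threshold

    def update(self, lfp_sample):
        """Feed one LFP sample; return True if stimulation should be on."""
        self.samples.append(lfp_sample)
        rms = math.sqrt(sum(s * s for s in self.samples) / len(self.samples))
        return rms > self.threshold

# Synthetic demo: a quiet baseline, then a strong theta-like oscillation.
ctrl = ClosedLoopController()
quiet = [ctrl.update(0.01 * math.sin(0.1 * i)) for i in range(200)]
active = [ctrl.update(1.0 * math.sin(0.5 * i)) for i in range(200)]
print(any(quiet), any(active))
```

    During the quiet baseline the controller never fires; once the large oscillation fills the moving window, stimulation switches on, mirroring how theta-band activity gated mRt stimulation in the study.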

  5. Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat

    PubMed Central

    Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart

    2015-01-01

    Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has gained attention in recent years after proving clinically more effective than conventional DBS at controlling pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as the system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as the controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between the input and output sources, which allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation. PMID:25897892

  6. Modeling Primary Productivity in the Margin Ice Zone from Glider-Based Measurements of Chlorophyll and Light during the 2014 Miz Program

    NASA Astrophysics Data System (ADS)

    Perry, M. J.; Lee, C.; Rainville, L.; Cetinic, I.; Yang, E. J.; Kang, S. H.

    2016-02-01

    In late summer 2014 during the Marginal Ice Zone (MIZ) Experiment, an international project sponsored by ONR, four Seagliders transited open water, through the marginal ice zone, and under ice-covered regions in the Beaufort Sea, penetrating as far as 100 km into the ice pack. The gliders navigated either by GPS in open water or, when under the ice, by acoustics from sound sources embedded in the MIZ autonomous observing array. The glider sensor suite included temperature, temperature microstructure, salinity, oxygen, chlorophyll fluorescence, optical backscatter, and multi-spectral downwelling irradiance. Cruises on the IBRV Araon operating in the open Beaufort Sea and on the R/V Ukpik and Norseman operating in continental shelf waters off Alaska's north slope allowed us to construct proxy libraries for converting chlorophyll fluorescence to chlorophyll concentration and optical backscatter to particulate organic carbon concentration. Water samples were collected for chlorophyll and particulate organic carbon analysis on the cruises and aligned with optical profiles of fluorescence and backscatter using sensors that were factory calibrated at the same time as the glider sensors. Fields of chlorophyll, particulate organic carbon, light, and primary productivity are constructed from the glider data. Productivity is modeled as a function of chlorophyll and light, using photosynthesis-irradiance (PE) models with available PE parameters from Arctic measurements. During August the region under the ice was characterized by a deep chlorophyll maximum layer with low rates of production in overlying waters. A phytoplankton bloom developed in open water at the end of September, preceding the rapid reformation of ice, despite shorter days and reduced irradiance.
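    A common PE formulation for this kind of chlorophyll-and-light productivity calculation is the exponential saturation curve P^B = P^B_max (1 - exp(-alpha E / P^B_max)). The sketch below uses that widely used form; the parameter values are placeholders, not the Arctic PE parameters actually used in the study.

```python
# Sketch of productivity modeled from chlorophyll and light with an
# exponential photosynthesis-irradiance (PE) curve. pmax_b and alpha are
# illustrative placeholders, not the study's fitted Arctic parameters.
import math

def primary_production(chl, irradiance, pmax_b=2.0, alpha=0.05):
    """Carbon fixation rate (mg C m^-3 h^-1) from chlorophyll chl
    (mg m^-3) and irradiance E (umol photons m^-2 s^-1)."""
    pb = pmax_b * (1.0 - math.exp(-alpha * irradiance / pmax_b))
    return chl * pb  # chlorophyll-specific rate scaled by biomass

# Deep chlorophyll maximum under ice: high chlorophyll, very little light.
under_ice = primary_production(chl=3.0, irradiance=5.0)
# Open-water bloom: less chlorophyll but far more light.
open_water = primary_production(chl=1.5, irradiance=200.0)
print(under_ice, open_water)
```

    Even with twice the chlorophyll, the light-starved under-ice layer fixes less carbon than the open-water bloom, consistent with the low August production the abstract reports beneath the ice.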

  7. Open Source and ROI: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…

  8. Economic comparison of open pond raceways to photo bio-reactors for profitable production of algae for transportation fuels in the Southwest

    DOE PAGES

    Richardson, James W.; Johnson, Myriah D.; Outlaw, Joe L.

    2012-05-01

    As energy prices continue to climb there is an increasing interest in alternative, renewable energy sources. Currently, "most of the energy consumed in the U.S. comes from fossil fuels - petroleum, coal, and natural gas, with crude oil-based petroleum products as the dominant source of energy". The use of renewable energy has grown, but is only making a small dent in current consumption at about eight percent of the United States total. Another concern with the use of fossil fuels is the emission of carbon dioxide into the atmosphere and complications to the climate. This is because, according to the U.S. Energy Information Administration (EIA), "fossil fuels are responsible for 99% of CO2 emissions".

  9. Utilizing Free and Open Source Software to access, view and compare in situ observations, EO products and model output data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hamre, Torill; Lygre, Kjetil

    2014-05-01

    The GreenSeas project (Development of global plankton data base and model system for eco-climate early warning) aims to advance the knowledge and predictive capacities of how marine ecosystems will respond to global change. A main task has been to set up a data delivery and monitoring core service following the open and free data access policy implemented in the Global Monitoring for the Environment and Security (GMES) programme. The aim is to ensure open and free access to historical plankton data, new data (EO products and in situ measurements), model data (including estimates of simulation error) and biological, environmental and climatic indicators to a range of stakeholders, such as scientists, policy makers and environmental managers. To this end, we have developed a geo-spatial database of both historical and new in situ physical, biological and chemical parameters for the Southern Ocean, Atlantic, Nordic Seas and the Arctic, and organized related satellite-derived quantities and model forecasts in a joint geo-spatial repository. For easy access to these data, we have implemented a web-based GIS (Geographical Information System) where observed, derived and forecast parameters can be searched, displayed, compared and exported. Model forecasts can also be uploaded dynamically to the system, allowing modelers to quickly compare their results with available in situ and satellite observations. We implemented the web-based GIS using free and open source technologies: Thredds Data Server, ncWMS, GeoServer, OpenLayers, PostGIS, Liferay, Apache Tomcat, PRTree, NetCDF-Java, json-simple, Geotoolkit, Highcharts, GeoExt, MapFish, FileSaver, jQuery, jstree and qUnit. We also wanted to use open standards to communicate between the different services, so we used WMS, WFS, netCDF, GML, OPeNDAP, JSON, and SLD.
    The main advantage we gained from using FOSS was that we did not have to reinvent the wheel: we could reuse existing code and functionality in our software at no cost. Most of the software would not have needed to be open source for this, but in some cases we had to make minor modifications to make the different technologies work together, and we could extract the parts of the code that we needed for a specific task. One example was reusing parts of the code from ncWMS and Thredds so that our main application could both read netCDF files and present them in the browser. This presentation will focus on both the difficulties we encountered and the advantages we gained from developing this tool with FOSS.
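    The open standards listed above are what let these services interoperate; a WMS GetMap request, for instance, is just an HTTP query string defined by the OGC WMS specification. The sketch below builds one. The endpoint URL and layer name are hypothetical, not the actual GreenSeas portal configuration.

```python
# Build an OGC WMS 1.3.0 GetMap request URL. The base URL and layer name
# are illustrative assumptions; the query parameters themselves are the
# standard ones defined by the WMS specification.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",                      # lat,lon axis order in 1.3.0
        "BBOX": ",".join(str(v) for v in bbox),  # minlat,minlon,maxlat,maxlon
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("http://example.org/ncWMS/wms",
                     "chlorophyll_a", (60.0, -30.0, 80.0, 30.0))
print(url)
```

    Any WMS-speaking client (OpenLayers, GeoServer previews, a plain browser) can consume such a URL, which is why standardising on WMS and friends kept the services loosely coupled.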

  10. Open source drug discovery--a new paradigm of collaborative research in tuberculosis drug development.

    PubMed

    Bhardwaj, Anshu; Scaria, Vinod; Raghava, Gajendra Pal Singh; Lynn, Andrew Michael; Chandra, Nagasuma; Banerjee, Sulagna; Raghunandanan, Muthukurussi V; Pandey, Vikas; Taneja, Bhupesh; Yadav, Jyoti; Dash, Debasis; Bhattacharya, Jaijit; Misra, Amit; Kumar, Anil; Ramachandran, Srinivasan; Thomas, Zakir; Brahmachari, Samir K

    2011-09-01

    It is being realized that the traditional closed-door and market-driven approaches to drug discovery may not be the best-suited model for diseases of the developing world such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for patients suffering from these diseases, it is necessary to formulate an alternate paradigm for the drug discovery process. The current model, constrained by limitations on collaboration and on sharing of resources with confidentiality, hampers the opportunities for bringing in expertise from diverse fields and hinders the possibilities of lowering the cost of drug discovery. The Open Source Drug Discovery project initiated by the Council of Scientific and Industrial Research, India has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open sharing, multi-faceted approaches and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives in the continued pursuit of new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trials in a non-exclusive manner by multiple participating companies, with majority funding from Open Source Drug Discovery. This will ensure the availability of drugs through a lower-cost, community-driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully, what Linux and the World Wide Web have done for information technology, Open Source Drug Discovery will do for drug discovery. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. State-of-the-practice and lessons learned on implementing open data and open source policies.

    DOT National Transportation Integrated Search

    2012-05-01

    This report describes the current government, academic, and private sector practices associated with open data and open source application development. These practices are identified; and the potential uses with the ITS Programs Data Capture and M...

  12. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Gregory S.; Nickless, William K.; Thiede, David R.

    Enterprise level cyber security requires the deployment, operation, and monitoring of many sensors across geographically dispersed sites. Communicating with the sensors to gather data and control behavior is a challenging task when the number of sensors is rapidly growing. This paper describes the system requirements, design, and implementation of T3, the third generation of our transport software that performs this task. T3 relies on open source software and open Internet standards. Data is encoded in MIME format messages and transported via NNTP, which provides scalability. OpenSSL and public key cryptography are used to secure the data. Robustness and ease of development are increased by defining an internal cryptographic API, implemented by modules in C, Perl, and Python. We are currently using T3 in a production environment. It is freely available to download and use for other projects.
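    The MIME-over-NNTP approach described above can be sketched on the encoding side with Python's standard library. The header names, newsgroup, and sensor identifier below are illustrative guesses, not T3's actual wire format, and the signing/encryption step T3 performs with OpenSSL is omitted.

```python
# Encode a binary sensor payload as a MIME message of the kind that could
# be posted to an NNTP newsgroup, then parse it back. Header values are
# hypothetical; T3's real message format is not reproduced here.
from email.message import EmailMessage
from email import message_from_bytes

def build_sensor_message(sensor_id, payload: bytes):
    msg = EmailMessage()
    msg["Subject"] = f"sensor-data {sensor_id}"
    msg["Newsgroups"] = "site.sensors.data"       # assumed group name
    msg.set_content(payload, maintype="application",
                    subtype="octet-stream")        # base64-encoded body
    return msg

original = b"\x00\x01flow-record\xff"
wire = build_sensor_message("sensor-42", original).as_bytes()
# A receiver reverses the transport encoding to recover the raw payload.
decoded = message_from_bytes(wire).get_payload(decode=True)
print(decoded == original)
```

    Because the payload round-trips losslessly through standard MIME encoding, any NNTP server can relay it unchanged, which is what gives the design its scalability.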

  14. The impact of gas-surface reactions on mass spectrometric measurements of atomic nitrogen. [determination of atmosphere ion sources

    NASA Technical Reports Server (NTRS)

    Engebretson, M. J.; Mauersberger, K.

    1979-01-01

    The paper presents a simplified model of the ion source chemistry, explains several details of the data reduction method used in obtaining atomic-nitrogen (N) densities from OSS data, and discusses implications of gas-surface reactions for the design of future satellite-borne mass spectrometers. Because of various surface reactions, N appears in three different forms in the ion source, as N, NO, and NO2. Considering the rather small spin modulation of NO and NO2 in the semi-open ionization chamber used in the OSS instrument, it is not surprising that these reaction products have not been previously identified in closed source instruments as a measure of the presence of atomic nitrogen. Warmup and/or outgassing of the ion source are shown to drastically reduce the NO2 concentration, thereby making possible reliable measurement of ambient N densities.

  15. Low-cost production of 6G-fructofuranosidase with high value-added astaxanthin by Xanthophyllomyces dendrorhous.

    PubMed

    Ning, Yawei; Li, Qiang; Chen, Feng; Yang, Na; Jin, Zhengyu; Xu, Xueming

    2012-01-01

    The effects of medium composition and culture conditions on the production of 6G-fructofuranosidase with value-added astaxanthin were investigated to reduce the capital cost of neo-fructooligosaccharide (neo-FOS) production by Xanthophyllomyces dendrorhous. Sucrose and corn steep liquor (CSL) were found to be the optimal carbon source and nitrogen source, respectively. CSL and initial pH were selected as the critical factors using a Plackett-Burman design. Maximum 6G-fructofuranosidase activity of 242.57 U/mL with 5.23 mg/L value-added astaxanthin was obtained at CSL 52.5 mL/L and pH 7.89 by central composite design. Neo-FOS yield could reach 238.12 g/L under the optimized medium conditions. Cost analysis suggested 66.3% of the substrate cost was reduced compared with that before optimization. These results demonstrated that the optimized medium and culture conditions could significantly enhance the production of 6G-fructofuranosidase with value-added astaxanthin and remarkably decrease the substrate cost, which opens up possibilities for producing neo-FOS industrially. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Compilation of geospatial data for the mineral industries and related infrastructure of Latin America and the Caribbean

    USGS Publications Warehouse

    Baker, Michael S.; Buteyn, Spencer D.; Freeman, Philip A.; Trippi, Michael H.; Trimmer III, Loyd M.

    2017-07-31

    This report describes the U.S. Geological Survey's (USGS) ongoing commitment to its mission of understanding the nature and distribution of global mineral commodity supply chains by updating and publishing the georeferenced locations of mineral commodity production and processing facilities, mineral exploration and development sites, and mineral commodity exporting ports in Latin America and the Caribbean. The report includes an overview of data sources and an explanation of the geospatial PDF map format. The geodatabase and geospatial data layers described in this report create a new geographic information product in the form of a geospatial portable document format (PDF) map. The geodatabase contains additional data layers from USGS, foreign government, and open sources as follows: (1) coal occurrence areas, (2) electric power generating facilities, (3) electric power transmission lines, (4) hydrocarbon resource cumulative production data, (5) liquefied natural gas terminals, (6) oil and gas concession leasing areas, (7) oil and gas field center points, (8) oil and gas pipelines, (9) USGS petroleum provinces, (10) railroads, (11) recoverable proven plus probable hydrocarbon resources, (12) major cities, (13) major rivers, and (14) undiscovered porphyry copper tracts.

  17. Influence of the Surf Zone on the Marine Aerosol Concentration in a Coastal Area

    NASA Astrophysics Data System (ADS)

    Tedeschi, Gilles; van Eijk, Alexander M. J.; Piazzola, Jacques; Kusmierczyk-Michulec, Jolanta T.

    2017-01-01

    Sea-salt aerosol concentrations in the coastal zone are assessed with the numerical aerosol-transport model MACMod, which applies separate aerosol source functions for the open ocean and the surf zone near the sea-land transition. Numerical simulations of the aerosol concentration as a function of offshore distance from the surf zone compare favourably with experimental data obtained during a surf-zone aerosol experiment in Duck, North Carolina in autumn 2007. Based on numerical simulations of the effects of variations in aerosol production (source strength) and transport conditions (wind speed, air-sea temperature difference), we show that surf-zone aerosols are replaced by aerosols generated over the open ocean as the airmass advects out to sea. The contribution from the surf-generated aerosol is significant during high winds and high wave events, and extends up to 30 km from the production zone. At low wind speeds, the oceanic component dominates, except within 1-5 km of the surf zone. Similar results are obtained for onshore flow, where no further sea-salt aerosol production occurs as the airmass advects over land. The oceanic aerosols, which are well mixed throughout the boundary layer, are then transported inland more efficiently than the surf-generated aerosols, which are confined to the first few tens of metres above the surface and are therefore more susceptible to the type of surface (trees or grass) that determines the deposition velocity.

  18. The Community Intercomparison Suite (CIS)

    NASA Astrophysics Data System (ADS)

    Watson-Parris, Duncan; Schutgens, Nick; Cook, Nick; Kipling, Zak; Kershaw, Phil; Gryspeerdt, Ed; Lawrence, Bryan; Stier, Philip

    2017-04-01

    Earth observations (both remote and in-situ) create vast amounts of data providing invaluable constraints for the climate science community. Efficient exploitation of these complex and highly heterogeneous datasets has been limited, however, by the lack of suitable software tools, particularly for comparison of gridded and ungridded data, thus reducing scientific productivity. CIS (http://cistools.net) is an open-source command-line tool and Python library that allows straightforward quantitative analysis, intercomparison and visualisation of remote-sensing, in-situ and model data. CIS reads gridded and ungridded remote-sensing, in-situ and model data from many sources out of the box, including the ESA Aerosol and Cloud CCI products, MODIS, CloudSat and AERONET. Perhaps most importantly, however, CIS employs a modular plugin architecture that allows a limitless range of data types to be read. Users can write their own plugins for the data sources they are familiar with and share them within the community, allowing all to benefit from their expertise. To enable intercomparison of these data, CIS provides a number of operations, including: aggregation of ungridded and gridded datasets to coarser representations using a number of built-in averaging kernels; subsetting of data to reduce its extent or dimensionality; co-location of two distinct datasets onto a single set of co-ordinates; visualisation of the input or output data through a number of different plots and graphs; evaluation of arbitrary mathematical expressions against any number of datasets; and a number of supporting functions such as a statistical comparison of two co-located datasets. These operations can be performed efficiently on local machines or large computing clusters, and CIS is already available on the JASMIN computing facility.
A case-study using the GASSP collection of in-situ aerosol observations will demonstrate the power of using CIS to perform model evaluations. The use of an open-source, community developed tool in this way opens up a huge amount of data which would previously have been inaccessible to many users, while also providing replicable, repeatable analysis which scientists and policy-makers alike can trust and understand.
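    The co-location operation at the heart of such model-observation comparisons can be reduced to a simple idea: map each ungridded observation onto its nearest model grid cell and average. The sketch below is an independent nearest-neighbour illustration of that idea, not CIS's actual implementation or API.

```python
# Nearest-neighbour co-location of ungridded observations onto a regular
# grid, followed by per-cell averaging. Illustrative only; CIS offers
# several kernels and handles time and vertical coordinates as well.
from collections import defaultdict

def collocate(obs, lat_grid, lon_grid):
    """obs: list of (lat, lon, value); grids: sorted cell-centre lists.
    Returns {(grid_lat, grid_lon): mean of observations in that cell}."""
    def nearest(x, grid):
        return min(grid, key=lambda g: abs(g - x))
    cells = defaultdict(list)
    for lat, lon, value in obs:
        cells[(nearest(lat, lat_grid), nearest(lon, lon_grid))].append(value)
    return {cell: sum(v) / len(v) for cell, v in cells.items()}

obs = [(51.2, 0.1, 10.0), (51.4, -0.2, 14.0), (48.9, 2.3, 7.0)]
result = collocate(obs, lat_grid=[48.0, 50.0, 52.0], lon_grid=[-2.0, 0.0, 2.0])
print(result)
```

    Once both datasets live on the same set of coordinates, point-by-point statistics (bias, RMSE, correlation) become straightforward, which is exactly what a model evaluation against in-situ aerosol observations requires.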

  19. The Knowledge Economy and Higher Education: Rankings and Classifications, Research Metrics and Learning Outcomes Measures as a System for Regulating the Value of Knowledge

    ERIC Educational Resources Information Center

    Marginson, Simon

    2009-01-01

    This paper describes the global knowledge economy (the k-economy), comprising (1) open source knowledge flows and (2) commercial markets in intellectual property and knowledge-intensive goods. Like all economies, the global knowledge economy is a site of production. It is also social and cultural, taking the form of a one-world community mediated…

  20. A Survey of Techniques for Security Architecture Analysis

    DTIC Science & Technology

    2003-05-01

    A software phenomenon is the "user innovation network", examples of such networks being "free" and "open source" software projects. These networks have innovation development, production, distribution and consumption all performed by users/self-manufacturers. "User innovation networks can function entirely independently of manufacturers because (1) at least some users have sufficient incentive to

  1. A Disk-Based System for Producing and Distributing Science Products from MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Wolfe, Robert; Sinno, Scott; Ye Gang; Teague, Michael

    2007-01-01

    Since beginning operations in 1999, the MODIS Adaptive Processing System (MODAPS) has evolved to take advantage of trends in information technology, such as the falling cost of computing cycles and disk storage and the availability of high quality open-source software (Linux, Apache and Perl), to achieve substantial gains in processing and distribution capacity and throughput while driving down the cost of system operations.

  2. Autotune Calibrates Models to Building Use Data

    ScienceCinema

    None

    2018-01-16

    Models of existing buildings are currently unreliable unless calibrated manually by a skilled professional. Autotune, as the name implies, automates this process by calibrating the model of an existing building to measured data, and is now available as open source software. This enables private businesses to incorporate Autotune into their products so that their customers can more effectively estimate cost savings of reduced energy consumption measures in existing buildings.

  3. ROMI 3.1 Least-cost lumber grade mix solver using open source statistical software

    Treesearch

    Rebecca A. Buck; Urs Buehlmann; R. Edward Thomas

    2010-01-01

    The least-cost lumber grade mix solution has been a topic of interest to both industry and academia for many years due to its potential to help wood processing operations reduce costs. A least-cost lumber grade mix solver is a rough mill decision support system that describes the lumber grade or grade mix needed to minimize raw material or total production cost (raw...

  4. Global hierarchical classification of deepwater and wetland environments from remote sensing products

    NASA Astrophysics Data System (ADS)

    Fluet-Chouinard, E.; Lehner, B.; Aires, F.; Prigent, C.; McIntyre, P. B.

    2017-12-01

    Global surface water maps have improved in spatial and temporal resolution through various remote sensing methods: open water extents from compiled Landsat archives, and inundation from topographically downscaled multi-sensor retrievals. These time series capture variations of open water and inundation through time without discriminating between hydrographic features (e.g. lakes, reservoirs, river channels and wetland types), as other databases have done in static representations. Available data sources present the opportunity to generate a comprehensive map and typology of aquatic environments (deepwater and wetlands) that improves on earlier digitized inventories and maps. The challenge of classifying surface waters globally is distinguishing wetland types with meaningful characteristics or proxies (hydrology, water chemistry, soils, vegetation) while accommodating the limitations of remote sensing data. We present a new wetland classification scheme designed for global application and produce a global map of aquatic ecosystem types using state-of-the-art remote sensing products. Our classification scheme combines open water extent and expands it with downscaled multi-sensor inundation data to capture the maximal vegetated wetland extent. The hierarchical structure of the classification is modified from the Cowardin system (1979) developed for the USA. The first classification level is based on a combination of landscape position and water source (e.g. lacustrine, riverine, palustrine, coastal and artificial), while the second level represents the hydrologic regime (e.g. perennial, seasonal, intermittent and waterlogged). Class-specific descriptors can further detail the wetland types with soils and vegetation cover. Our globally consistent nomenclature and top-down mapping allow direct comparison across biogeographic regions and support upscaling of biogeochemical fluxes as well as other landscape-level functions.
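The two-level hierarchy described above can be sketched as a small decision function. The class names follow the abstract, but the input flags and thresholds below are invented for illustration and are not the authors' actual decision rules.

```python
def classify_wetland(landscape_position, months_inundated):
    """Return (level-1 system, level-2 hydrologic regime) for a water body."""
    # Level 1: landscape position / water source.
    systems = {"lake": "lacustrine", "river": "riverine",
               "inland": "palustrine", "coast": "coastal",
               "reservoir": "artificial"}
    system = systems.get(landscape_position, "unclassified")
    # Level 2: hydrologic regime, here keyed to months of inundation per year
    # (hypothetical cutoffs, chosen only to make the hierarchy concrete).
    if months_inundated >= 12:
        regime = "perennial"
    elif months_inundated >= 4:
        regime = "seasonal"
    elif months_inundated > 0:
        regime = "intermittent"
    else:
        regime = "waterlogged"
    return system, regime

print(classify_wetland("river", 5))  # ('riverine', 'seasonal')
```

Class-specific descriptors (soils, vegetation cover) would attach as a third level in the same fashion.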

  5. Getting Open Source Software into Schools: Strategies and Challenges

    ERIC Educational Resources Information Center

    Hepburn, Gary; Buley, Jan

    2006-01-01

    In this article Gary Hepburn and Jan Buley outline different approaches to implementing open source software (OSS) in schools; they also address the challenges that open source advocates should anticipate as they try to convince educational leaders to adopt OSS. With regard to OSS implementation, they note that schools have a flexible range of…

  6. Open Source Library Management Systems: A Multidimensional Evaluation

    ERIC Educational Resources Information Center

    Balnaves, Edmund

    2008-01-01

    Open source library management systems have improved steadily in the last five years. They now present a credible option for small to medium libraries and library networks. An approach to their evaluation is proposed that takes account of three additional dimensions that only open source can offer: the developer and support community, the source…

  7. Open Source as Appropriate Technology for Global Education

    ERIC Educational Resources Information Center

    Carmichael, Patrick; Honour, Leslie

    2002-01-01

    Economic arguments for the adoption of "open source" software in business have been widely discussed. In this paper we draw on personal experience in the UK, South Africa and Southeast Asia to put forward compelling reasons why open source software should be considered as an appropriate and affordable alternative to the currently prevailing…

  8. Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software

    ERIC Educational Resources Information Center

    Hemphill, Thomas A.

    2005-01-01

    This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…

  9. Open Source Communities in Technical Writing: Local Exigence, Global Extensibility

    ERIC Educational Resources Information Center

    Conner, Trey; Gresham, Morgan; McCracken, Jill

    2011-01-01

    By offering open-source software (OSS)-based networks as an affordable technology alternative, we partnered with a nonprofit community organization. In this article, we narrate the client-based experiences of this partnership, highlighting the ways in which OSS and open-source culture (OSC) transformed our students' and our own expectations of…

  10. Personal Electronic Devices and the ISR Data Explosion: The Impact of Cyber Cameras on the Intelligence Community

    DTIC Science & Technology

    2015-06-01

    Texas Tech Security Group, "Automated Open Source Intelligence (OSINT) Using APIs," RaiderSec, Sunday 30 December 2012, http://raidersec.blogspot.com/2012/12/automated-open-source

  11. Open-Source Unionism: New Workers, New Strategies

    ERIC Educational Resources Information Center

    Schmid, Julie M.

    2004-01-01

    In "Open-Source Unionism: Beyond Exclusive Collective Bargaining," published in fall 2002 in the journal Working USA, labor scholars Richard B. Freeman and Joel Rogers use the term "open-source unionism" to describe a form of unionization that uses Web technology to organize in hard-to-unionize workplaces. Rather than depend on the traditional…

  12. Perceptions of Open Source versus Commercial Software: Is Higher Education Still on the Fence?

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2007-01-01

    This exploratory study investigated the perceptions of technology and academic decision-makers about open source benefits and risks versus commercial software applications. The study also explored reactions to a concept for outsourcing campus-wide deployment and maintenance of open source. Data collected from telephone interviews were analyzed,…

  13. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    ERIC Educational Resources Information Center

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  14. Open-Source Learning Management Systems: A Predictive Model for Higher Education

    ERIC Educational Resources Information Center

    van Rooij, S. Williams

    2012-01-01

    The present study investigated the role of pedagogical, technical, and institutional profile factors in an institution of higher education's decision to select an open-source learning management system (LMS). Drawing on the results of previous research that measured patterns of deployment of open-source software (OSS) in US higher education and…

  15. An Embedded Systems Course for Engineering Students Using Open-Source Platforms in Wireless Scenarios

    ERIC Educational Resources Information Center

    Rodriguez-Sanchez, M. C.; Torrado-Carvajal, Angel; Vaquero, Joaquin; Borromeo, Susana; Hernandez-Tamames, Juan A.

    2016-01-01

    This paper presents a case study analyzing the advantages and disadvantages of using project-based learning (PBL) combined with collaborative learning (CL) and industry best practices, integrated with information communication technologies, open-source software, and open-source hardware tools, in a specialized microcontroller and embedded systems…

  16. Technology collaboration by means of an open source government

    NASA Astrophysics Data System (ADS)

    Berardi, Steven M.

    2009-05-01

    The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.

  17. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). For long-term preservation, DARTS provides the data through simple methods such as HTTP directory listing, while also aiming to provide rich web applications for ease of access built on modern web technologies and open source software. This presentation showcases the use of open source software throughout our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); the open source MapServer is adopted as the WMS server. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data; its main purpose is public outreach, and it was developed using the NASA World Wind Java SDK. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations; it uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft. Flow is itself open source software developed by JAXA/ISAS under the BSD 3-Clause License, and the SPICE Toolkit is required to compile it. The SPICE Toolkit is also open source software, developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.
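The OpenLayers-to-MapServer link described for KADIAS boils down to issuing WMS GetMap requests. The sketch below assembles such a request URL with the standard WMS 1.1.1 parameters; the host and layer name are placeholders, not the real DARTS endpoints.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width=512, height=512):
    """Build a WMS 1.1.1 GetMap URL for one layer over a lon/lat bounding box."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width, "HEIGHT": height, "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("http://example.org/wms", "kaguya_dem", (-180, -90, 180, 90))
print(url)
```

A browser map client simply requests such URLs tile by tile as the user pans and zooms.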

  18. Comparative analysis of chemical similarity methods for modular natural products with a hypothetical structure enumeration algorithm.

    PubMed

    Skinnider, Michael A; Dejong, Chris A; Franczak, Brian C; McNicholas, Paul D; Magarvey, Nathan A

    2017-08-16

    Natural products represent a prominent source of pharmaceutically and industrially important agents. Calculating the chemical similarity of two molecules is a central task in cheminformatics, with applications at multiple stages of the drug discovery pipeline. Quantifying the similarity of natural products is a particularly important problem, as the biological activities of these molecules have been extensively optimized by natural selection. The large and structurally complex scaffolds of natural products distinguish their physical and chemical properties from those of synthetic compounds. However, no analysis of the performance of existing methods for molecular similarity calculation specific to natural products has been reported to date. Here, we present LEMONS, an algorithm for the enumeration of hypothetical modular natural product structures. We leverage this algorithm to conduct a comparative analysis of molecular similarity methods within the unique chemical space occupied by modular natural products using controlled synthetic data, and comprehensively investigate the impact of diverse biosynthetic parameters on similarity search. We additionally investigate a recently described algorithm for natural product retrobiosynthesis and alignment, and find that when rule-based retrobiosynthesis can be applied, this approach outperforms conventional two-dimensional fingerprints, suggesting it may represent a valuable approach for the targeted exploration of natural product chemical space and microbial genome mining. Our open-source algorithm is an extensible method of enumerating hypothetical natural product structures with diverse potential applications in bioinformatics.
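The conventional two-dimensional fingerprint comparison benchmarked above typically uses the Tanimoto coefficient. A minimal sketch, with fingerprints represented simply as sets of "on" bit positions (the bit values below are invented):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto similarity |A ∩ B| / |A ∪ B| for fingerprints as bit sets."""
    if not fp_a and not fp_b:
        return 1.0  # two empty fingerprints are conventionally identical
    return len(fp_a & fp_b) / len(fp_a | fp_b)

fp1 = {1, 5, 9, 42, 77}    # bits set by molecule 1
fp2 = {1, 5, 9, 100}       # bits set by molecule 2
print(tanimoto(fp1, fp2))  # 3 shared bits / 6 total bits = 0.5
```

Real fingerprints (e.g. Morgan/ECFP) hash substructures into such bit positions; the comparison step itself is exactly this set arithmetic.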

  19. Open Source Modeling and Optimization Tools for Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, S.

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  20. Detection of Non-Photochemical Superoxide in Coastal and Open Ocean Seawater: Particulate Versus Dissolved Sources

    NASA Astrophysics Data System (ADS)

    Roe, K. L.; Rand, T.; Hansel, C. M.; Voelker, B. M.

    2016-02-01

    Superoxide radical (O2-) could have a significant effect on marine metal redox chemistry, but little data exists on its marine concentrations. In this study, we measured superoxide steady-state concentrations in both filtered and unfiltered samples collected near the California coast and at Station ALOHA. Particle-generated superoxide, defined as the difference between unfiltered and filtered concentrations, ranged from undetectable to 0.019 nM at Station ALOHA and from undetectable to 0.052 nM in samples from the southern California Current. We also show that a transient superoxide signal is generated during filtering, an artifact that may have affected previously reported concentrations of particle-generated superoxide in the ocean. High concentrations of superoxide (range) were measured in filtered samples from Station ALOHA and the California Current, raising concerns about possible sources of background signals. Further study of background signals revealed that some superoxide production occurs even in artificial seawater and very aged filtered seawater samples, and that a small additional background signal is generated as the sample travels from the container to the flow cell where it is mixed with reagent for chemiluminescence (CL) analysis. However, filtered seawater samples collected from the Scripps Pier had significantly higher superoxide production rates than those measured in artificial seawater, and production rates in unfiltered samples were no higher than those in filtered samples. Therefore, production by dissolved sources was the dominant non-photochemical source of superoxide in these samples. Production rates decreased in the presence of DTPA, suggesting involvement of metal ions in superoxide production. Laboratory experiments with natural organic matter (NOM) indicate that superoxide formation occurs during oxidation of reduced moieties of NOM by oxygen.
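The particle-generated quantity defined above is simply the unfiltered minus the filtered steady-state concentration, reported as undetectable when the difference falls below the detection limit. The detection limit and sample values below are illustrative assumptions, not the study's numbers.

```python
def particle_superoxide(unfiltered_nM, filtered_nM, detection_limit_nM=0.005):
    """Unfiltered minus filtered steady-state [O2-]; None if undetectable."""
    diff = unfiltered_nM - filtered_nM
    if diff < detection_limit_nM:
        return None  # below detection limit: report as undetectable
    return round(diff, 3)

print(particle_superoxide(0.070, 0.018))  # 0.052
print(particle_superoxide(0.020, 0.019))  # None (undetectable)
```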

  1. Limitations of Phased Array Beamforming in Open Rotor Noise Source Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Csaba; Envia, Edmane; Podboy, Gary G.

    2013-01-01

    Phased array beamforming results of the F31/A31 historical baseline counter-rotating open rotor blade set were investigated for measurement data taken on the NASA Counter-Rotating Open Rotor Propulsion Rig in the 9- by 15-Foot Low-Speed Wind Tunnel of NASA Glenn Research Center as well as data produced using the LINPROP open rotor tone noise code. The planar microphone array was positioned broadside and parallel to the axis of the open rotor, roughly 2.3 rotor diameters away. The results provide insight as to why the apparent noise sources of the blade passing frequency tones and interaction tones appear at their nominal Mach radii instead of at the actual noise sources, even if those locations are not on the blades. Contour maps corresponding to the sound fields produced by the radiating sound waves, taken from the simulations, are used to illustrate how the interaction patterns of circumferential spinning modes of rotating coherent noise sources interact with the phased array, often giving misleading results, as the apparent sources do not always show where the actual noise sources are located. This suggests that a more sophisticated source model would be required to accurately locate the sources of each tone. The results of this study also have implications with regard to the shielding of open rotor sources by airframe empennages.

  2. National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents

    NASA Astrophysics Data System (ADS)

    Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.

    2014-12-01

    The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from 60+ data providers in all 50 states. Free and open source software is used in this federated system, where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and application discovery and availability of the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric and library-style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and the USGIN site), and user-created tools and scripts. The user-friendly, map-centric web-based application supports finding, visualizing, mapping, and acquiring data by topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third-party software applications such as Google Earth, uDig, and ArcGIS.
A redistributable, free and open source software package called GINstack (USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions among others.
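Exposing a node "through a CSW endpoint", as described above, means answering standard OGC Catalogue Service for the Web requests. The sketch below builds the body of a CSW 2.0.2 GetRecords full-text search that a client could POST to such an endpoint; the search term is a placeholder and the request is illustrative, not taken from NGDS documentation.

```python
GETRECORDS_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<csw:GetRecords xmlns:csw="http://www.opengis.net/cat/csw/2.0.2"
                xmlns:ogc="http://www.opengis.net/ogc"
                service="CSW" version="2.0.2" resultType="results"
                maxRecords="10">
  <csw:Query typeNames="csw:Record">
    <csw:ElementSetName>summary</csw:ElementSetName>
    <csw:Constraint version="1.1.0">
      <ogc:Filter>
        <ogc:PropertyIsLike wildCard="%" singleChar="_" escapeChar="\\">
          <ogc:PropertyName>AnyText</ogc:PropertyName>
          <ogc:Literal>%{term}%</ogc:Literal>
        </ogc:PropertyIsLike>
      </ogc:Filter>
    </csw:Constraint>
  </csw:Query>
</csw:GetRecords>"""

def getrecords_body(term):
    """Fill the GetRecords template with a full-text search term."""
    return GETRECORDS_TEMPLATE.format(term=term)

body = getrecords_body("geothermal well")
print("%geothermal well%" in body)  # True
```

The same request works against any CSW-compliant catalogue, which is what makes the federated, standards-based design interoperable.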

  3. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    Direct geo-referencing systems use remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can make measurements directly on the images. In order to calculate position properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate image, coordinates and camera position. However, it is very expensive, and users cannot use the result immediately because the position information is not embedded in the image. In consideration of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can calculate position with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate all of these into the open source GIS software Quantum GIS. In this way a complete data processing system can be constructed.
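The geometric core of direct geo-referencing can be sketched as follows (this is not the authors' code, and the axis conventions are assumed for illustration): given the camera position, its attitude, and a calibrated pixel direction, intersect the viewing ray with flat ground to obtain the point's map coordinates.

```python
import math

def pixel_to_ground(cam_xyz, yaw_deg, pitch_deg, u, v, focal_px):
    """Project pixel offset (u, v) from the image centre onto the z=0 plane."""
    # Ray in the camera frame: camera looks straight down at zero attitude.
    ray = (u / focal_px, v / focal_px, -1.0)
    p, y = math.radians(pitch_deg), math.radians(yaw_deg)
    # Rotate by pitch about x, then yaw about z (roll omitted for brevity).
    rx = (ray[0],
          ray[1] * math.cos(p) - ray[2] * math.sin(p),
          ray[1] * math.sin(p) + ray[2] * math.cos(p))
    rz = (rx[0] * math.cos(y) - rx[1] * math.sin(y),
          rx[0] * math.sin(y) + rx[1] * math.cos(y),
          rx[2])
    t = -cam_xyz[2] / rz[2]  # scale so the ray reaches z = 0
    return (cam_xyz[0] + t * rz[0], cam_xyz[1] + t * rz[1])

# Nadir camera at 100 m: the image centre maps to the point below the camera.
print(pixel_to_ground((500.0, 800.0, 100.0), 0.0, 0.0, 0.0, 0.0, 1000.0))  # (500.0, 800.0)
```

The synchronized GPS and attitude signals mentioned in the abstract supply `cam_xyz` and the angles; the camera calibration supplies `focal_px`.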

  4. Development of an Open Source, Air-Deployable Weather Station

    NASA Astrophysics Data System (ADS)

    Krejci, A.; Lopez Alcala, J. M.; Nelke, M.; Wagner, J.; Udell, C.; Higgins, C. W.; Selker, J. S.

    2017-12-01

    We created a packaged weather station intended to be deployed in the air on tethered systems. The device incorporates lightweight sensors and parts and runs for up to 24 hours on lithium polymer batteries, allowing the entire package to be supported by a thin fiber. Because the fiber does not provide a stable platform, additional data (pitch and roll) are determined alongside typical weather parameters (e.g. temperature, pressure, humidity, wind speed, and wind direction) using an embedded inertial measurement unit. All designs are open sourced, including electronics, CAD drawings, and descriptions of assembly, and can be found on the OPEnS lab website at http://www.open-sensing.org/lowcost-weather-station/. The Openly Published Environmental Sensing Lab (OPEnS: Open-Sensing.org) expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. New OPEnS labs are now being established in India, France, Switzerland, the Netherlands, and Ghana.
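As a sketch of how pitch and roll can be recovered from an IMU's accelerometer when the package hangs quasi-statically from the fiber (gravity serves as the reference vector), the standard tilt formulas are shown below; the axis conventions are assumed, not taken from the published design.

```python
import math

def pitch_roll(ax, ay, az):
    """Return (pitch, roll) in degrees from accelerometer readings in g."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

print(pitch_roll(0.0, 0.0, 1.0))  # level platform: (0.0, 0.0)
```

These angles let the wind speed and direction readings be corrected for the swinging of the suspended package.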

  5. Software for Real-Time Analysis of Subsonic Test Shot Accuracy

    DTIC Science & Technology

    2014-03-01

    used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming... video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to... DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains
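The frame-comparison step described in the snippet can be illustrated in miniature: difference a "before" and "after" grayscale frame, threshold the change, and report the centroid of the changed pixels as the shot coordinates. Real implementations would use OpenCV's image-differencing tools on video frames; plain nested lists stand in for frames here, and all values are invented.

```python
def shot_coordinates(before, after, threshold=50):
    """Return (x, y) centroid of pixels that changed by more than threshold."""
    hits = [(x, y)
            for y, (row_b, row_a) in enumerate(zip(before, after))
            for x, (b, a) in enumerate(zip(row_b, row_a))
            if abs(a - b) > threshold]
    if not hits:
        return None  # no shot detected between these frames
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)

before = [[0] * 5 for _ in range(5)]        # blank 5x5 target frame
after = [row[:] for row in before]
after[2][3] = 255                           # a new bullet hole at (3, 2)
print(shot_coordinates(before, after))      # (3.0, 2.0)
```

Comparing each detected coordinate against the aim point gives the per-shot accuracy the study measured.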

  6. Genomes to natural products PRediction Informatics for Secondary Metabolomes (PRISM).

    PubMed

    Skinnider, Michael A; Dejong, Chris A; Rees, Philip N; Johnston, Chad W; Li, Haoxin; Webster, Andrew L H; Wyatt, Morgan A; Magarvey, Nathan A

    2015-11-16

    Microbial natural products are an invaluable source of evolved bioactive small molecules and pharmaceutical agents. Next-generation and metagenomic sequencing indicates untapped genomic potential, yet high rediscovery rates of known metabolites increasingly frustrate conventional natural product screening programs. New methods to connect biosynthetic gene clusters to novel chemical scaffolds are therefore critical to enable the targeted discovery of genetically encoded natural products. Here, we present PRISM, a computational resource for the identification of biosynthetic gene clusters, prediction of genetically encoded nonribosomal peptides and type I and II polyketides, and bio- and cheminformatic dereplication of known natural products. PRISM implements novel algorithms which render it uniquely capable of predicting type II polyketides, deoxygenated sugars, and starter units, making it a comprehensive genome-guided chemical structure prediction engine. A library of 57 tailoring reactions is leveraged for combinatorial scaffold library generation when multiple potential substrates are consistent with biosynthetic logic. We compare the accuracy of PRISM to existing genomic analysis platforms. PRISM is an open-source, user-friendly web application available at http://magarveylab.ca/prism/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
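The combinatorial scaffold-library idea described above, where every substrate choice consistent with the biosynthetic logic is crossed with every applicable tailoring reaction, can be sketched with a simple Cartesian product. The substrate and reaction names below are invented placeholders, not PRISM's actual reaction library.

```python
from itertools import product

def scaffold_library(substrate_options, reactions):
    """Cross every consistent substrate choice with every tailoring reaction."""
    return [{"residues": choice, "tailoring": rxn}
            for choice in product(*substrate_options)
            for rxn in reactions]

candidate_substrates = [("Ser",), ("Ser", "Thr")]  # second module is ambiguous
tailoring_reactions = ["none", "glycosylation", "halogenation"]
library = scaffold_library(candidate_substrates, tailoring_reactions)
print(len(library))  # 1 x 2 substrate choices x 3 reactions = 6
```

Each entry of the resulting library is one hypothetical scaffold to dereplicate against known natural products.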

  7. Using Support Vector Machines to Automatically Extract Open Water Signatures from POLDER Multi-Angle Data Over Boreal Regions

    NASA Technical Reports Server (NTRS)

    Pierce, J.; Diaz-Barrios, M.; Pinzon, J.; Ustin, S. L.; Shih, P.; Tournois, S.; Zarco-Tejada, P. J.; Vanderbilt, V. C.; Perry, G. L.; Brass, James A. (Technical Monitor)

    2002-01-01

    This study used Support Vector Machines to classify multi-angle POLDER data. Boreal wetland ecosystems cover an estimated 90 × 10^6 ha, about 36% of global wetlands, and are a major source of trace gas emissions to the atmosphere. Four to 20 percent of the global emission of methane to the atmosphere comes from wetlands north of 4 degrees N latitude. Large uncertainties in emissions exist because of large spatial and temporal variation in the production and consumption of methane. Accurate knowledge of the areal extent of open water and inundated vegetation is critical to estimating magnitudes of trace gas emissions. Improvements in land cover mapping have been sought using physical-modeling approaches, neural networks, and active microwave, examples that demonstrate the difficulties of separating open water, inundated vegetation and dry upland vegetation. Here we examine the feasibility of using a support vector machine to classify POLDER data representing open water, inundated vegetation and dry upland vegetation.

  8. Building a Snow Data Management System using Open Source Software (and IDL)

    NASA Astrophysics Data System (ADS)

    Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.

    2012-12-01

    At NASA's Jet Propulsion Laboratory, free and open source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software.
    Main points:
    - The design of the Snow Data System (illustrating how the collection of sub-systems is combined to create a complete data processing pipeline)
    - The challenges of moving from a single algorithm on a laptop to running hundreds of parallel algorithms on a cluster of servers (lessons learned): code changes, software license related challenges, and storage requirements
    - System evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps)
    - Road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory mission)
    Software in use and their licenses:
    - IDL - used for pre- and post-processing of data; proprietary license held by Exelis.
    - Apache OODT - data management and workflow processing; Apache License, Version 2.
    - GDAL - geospatial data processing library, currently used for data re-projection; X/MIT license.
    - GeoServer - WMS server; General Public License, Version 2.0.
    - Leaflet.js - JavaScript web mapping library; Berkeley Software Distribution License.
    - Python - glue code and miscellaneous data processing support; Python Software Foundation License.
    - Perl - script wrapper for running the SCAG algorithm; General Public License, Version 3.
    - PHP - front-end web application programming; PHP License, Version 3.01.

  9. A Cloud-Based Infrastructure for Near-Real-Time Processing and Dissemination of NPP Data

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Valente, E. G.; Chettri, S. S.

    2011-12-01

    We are building a scalable cloud-based infrastructure for generating and disseminating near-real-time data products from a variety of geospatial and meteorological data sources, including the new National Polar-Orbiting Environmental Satellite System (NPOESS) Preparatory Project (NPP). Our approach relies on linking Direct Broadcast and other data streams to a suite of scientific algorithms coordinated by NASA's International Polar-Orbiter Processing Package (IPOPP). The resulting data products are directly accessible to a wide variety of end-user applications, via industry-standard protocols such as OGC Web Services, Unidata Local Data Manager, or OPeNDAP, using open source software components. The processing chain employs on-demand computing resources from Amazon.com's Elastic Compute Cloud and NASA's Nebula cloud services. Our current prototype targets short-term weather forecasting, in collaboration with NASA's Short-term Prediction Research and Transition (SPoRT) program and the National Weather Service. Direct Broadcast is especially crucial for NPP, whose current ground segment is unlikely to deliver data quickly enough for short-term weather forecasters and other near-real-time users. Direct Broadcast also allows full local control over data handling, from the receiving antenna to end-user applications: this provides opportunities to streamline processes for data ingest, processing, and dissemination, and thus to make interpreted data products (Environmental Data Records) available to practitioners within minutes of data capture at the sensor. Cloud computing lets us grow and shrink computing resources to meet large and rapid fluctuations in data availability (twice daily for polar orbiters) - and similarly large fluctuations in demand from our target (near-real-time) users. 
This offers a compelling business case for cloud computing: the processing or dissemination systems can grow arbitrarily large to sustain near-real-time data access despite surges in data volumes or user demand, but that computing capacity (and its hourly costs) can be dropped almost instantly once the surge passes. Cloud computing also allows low-risk experimentation with a variety of machine architectures (processor types; bandwidth, memory, and storage capacities; etc.) and system configurations (including massively parallel computing patterns). Finally, our service-based approach (in which user applications invoke software processes on a Web-accessible server) facilitates access to datasets of arbitrary size and resolution, and allows users to request and receive tailored products on demand. To maximize the usefulness and impact of our technology, we have emphasized open, industry-standard software interfaces. We are also using and developing open source software to facilitate the widespread adoption of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sources.
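The elasticity argument above can be illustrated with a toy sizing policy: choose a worker count so that the current backlog clears before the near-real-time deadline, bounded by a minimum and maximum pool size. All numbers are hypothetical, and this is a sketch of the general idea rather than the authors' actual provisioning logic:

```python
import math

def workers_needed(backlog_gb, gb_per_worker_hour, deadline_hours,
                   min_workers=1, max_workers=64):
    """Toy elasticity policy: size the worker pool so the current
    backlog clears within the near-real-time deadline, within bounds."""
    raw = backlog_gb / (gb_per_worker_hour * deadline_hours)
    return max(min_workers, min(max_workers, math.ceil(raw)))

# A twice-daily polar-orbiter pass makes the backlog surge, then drain.
print(workers_needed(backlog_gb=120, gb_per_worker_hour=10,
                     deadline_hours=0.5))  # → 24
print(workers_needed(backlog_gb=0.5, gb_per_worker_hour=10,
                     deadline_hours=0.5))  # → 1
```

A real deployment would map the returned count onto cloud instance requests and would also account for instance start-up latency, which this sketch ignores.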

  10. Open source electronic health records and chronic disease management.

    PubMed

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-02-01

    To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHCs that currently use an open source EHR. Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status for homeless patients. The ability to modify the open source EHR to adapt to the CHC environment, and to leverage the ecosystem of providers and users to assist in this process, provided significant advantages in chronic care management. Improvements in diabetes management, hypertension control, and tuberculosis vaccination rates were supported through the use of these open source systems. The flexibility and adaptability of open source EHRs demonstrated their utility and viability in the provision of necessary chronic disease care among the populations served by CHCs.

  11. An Evaluation of Open Source Learning Management Systems According to Administration Tools and Curriculum Design

    ERIC Educational Resources Information Center

    Ozdamli, Fezile

    2007-01-01

    Distance education is becoming more important in universities and schools. The aim of this research is to evaluate currently existing Open Source Learning Management Systems according to administration tools and curriculum design. To this end, seventy-two Open Source Learning Management Systems were subjected to a general evaluation. After…

  12. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    ERIC Educational Resources Information Center

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  13. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    ERIC Educational Resources Information Center

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  14. Open Source Meets Virtual Reality--An Instructor's Journey Unearths New Opportunities for Learning, Community, and Academia

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.

    2015-01-01

    Opening with the history, recent advances, and emerging ways to use avatar-based virtual reality, an instructor who has used virtual environments since 2007 shares how these environments bring more options to community building, teaching, and education. With the open-source movement, where the source code for virtual environments was made…

  15. The Implications of Incumbent Intellectual Property Strategies for Open Source Software Success and Commercialization

    ERIC Educational Resources Information Center

    Wen, Wen

    2012-01-01

    While open source software (OSS) emphasizes open access to the source code and avoids the use of formal appropriability mechanisms, there has been little understanding of how the existence and exercise of formal intellectual property rights (IPR) such as patents influence the direction of OSS innovation. This dissertation seeks to bridge this gap…

  16. Migrations of the Mind: The Emergence of Open Source Education

    ERIC Educational Resources Information Center

    Glassman, Michael; Bartholomew, Mitchell; Jones, Travis

    2011-01-01

    The authors describe an Open Source approach to education. They define Open Source Education (OSE) as a teaching and learning framework where the use and presentation of information is non-hierarchical, malleable, and subject to the needs and contributions of students as they become "co-owners" of the course. The course transforms itself into an…

  17. Prepare for Impact

    ERIC Educational Resources Information Center

    Waters, John K.

    2010-01-01

    Open source software is poised to make a profound impact on K-12 education. For years industry experts have been predicting the widespread adoption of open source tools by K-12 school districts. They're about to be proved right. The impact may not yet have been profound, but it's fair to say that some open source systems and non-proprietary…

  18. 7 Questions to Ask Open Source Vendors

    ERIC Educational Resources Information Center

    Raths, David

    2012-01-01

    With their budgets under increasing pressure, many campus IT directors are considering open source projects for the first time. On the face of it, the savings can be significant. Commercial emergency-planning software can cost upward of six figures, for example, whereas the open source Kuali Ready might run as little as $15,000 per year when…

  19. Cognitive Readiness Assessment and Reporting: An Open Source Mobile Framework for Operational Decision Support and Performance Improvement

    ERIC Educational Resources Information Center

    Heric, Matthew; Carter, Jenn

    2011-01-01

    Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…

  20. Combustion products of plastics as indicators for refuse burning in the atmosphere.

    PubMed

    Simoneit, Bernd R T; Medeiros, Patricia M; Didyk, Borys M

    2005-09-15

    Despite all of the economic problems and environmental discussions on the dangers and hazards of plastic materials, plastic production worldwide is growing at a rate of about 5% per year. Improved techniques for recycling polymeric materials have been developed during the last few years; however, a large fraction of plastics are still being discarded in landfills or subjected to intentional or incidental open-fire burning. To identify specific tracer compounds generated during such open-fire combustion, both smoke particles from burning and plastic materials from shopping bags, roadside trash, and landfill garbage were extracted for gas chromatography-mass spectrometry analyses. Samples were collected in Concón, Chile, an area frequently affected by wildfire incidents and garbage burning, and in the United States for comparison. Atmospheric samples from various aerosol sampling programs are also presented as supportive data. The major components of plastic extracts were even-carbon-chain n-alkanes (C16-C40), the plasticizer di-2-ethylhexyl phthalate, and the antioxidants and lubricants/antiadhesives Irganox 1076, Irgafos 168, and its oxidation product tris(2,4-di-tert-butylphenyl) phosphate. Major compounds in smoke from burning plastics include the non-source-specific n-alkanes (mainly even predominance), terephthalic acid, phthalates, and 4-hydroxybenzoic acid, with minor amounts of polycyclic aromatic hydrocarbons (including triphenylbenzenes) and tris(2,4-di-tert-butylphenyl)phosphate. 1,3,5-Triphenylbenzene and tris(2,4-di-tert-butylphenyl)phosphate were found in detectable amounts in atmospheric samples where plastics and refuse were burned in open fires, and thus we propose these two compounds as specific tracers for the open burning of plastics.

  1. GDAL Enhancements for Interoperability with EOS Data (GEE)

    NASA Astrophysics Data System (ADS)

    Tisdale, B.

    2015-12-01

    Historically, Earth Observing Satellite (EOS) data products have been difficult for GIS tools, whether commercial or open-source, to consume. This has resulted in reduced acceptance of these data products by the GIS and general user communities. Common problems and challenges experienced by these data users include difficulty when: consuming data products from NASA Distributed Active Archive Centers (DAACs) that pre-date modern application software with commercial and open-source geospatial tools; identifying an initial approach for developing a framework and plug-ins that interpret non-compliant data; defining a methodology that is extensible across the NASA Earth Observing System Data and Information System (EOSDIS), scientific communities, and GIS communities by enabling other data centers to construct their own plug-ins and adjust specific data products; and promoting greater use of NASA data and new analyses utilizing GIS tools. To address these challenges and make EOS data products more accessible and interpretable by GIS applications, a collaborative approach has been taken that includes the NASA Langley Atmospheric Science Data Center (ASDC), Esri, George Mason University (GMU), and the Hierarchical Data Format (HDF) Group to create a framework and plug-ins for the Geospatial Data Abstraction Library (GDAL). This framework and its plug-ins offer the advantage of extensibility within NASA EOSDIS, permitting other data centers to construct their own plug-ins as necessary to adjust their data products. In this session, findings related to the framework and the development of GDAL plug-ins will be reviewed. Specifically, the session will offer a workshop reviewing the documentation and training materials generated to guide other NASA DAACs through the process of constructing plug-ins consistent with the framework, as well as a review of the certification process by which the plug-ins can be independently verified as properly converting data to the format and content required for use in GIS software.

  2. GDAL Enhancements for Interoperability with EOS Data

    NASA Astrophysics Data System (ADS)

    Tisdale, M.; Mathews, T. J.; Tisdale, B.; Sun, M.; Yang, C. P.; Lee, H.; Habermann, T.

    2015-12-01

    Historically, Earth Observing Satellite (EOS) data products have been difficult for GIS tools, whether commercial or open-source, to consume. This has resulted in reduced acceptance of these data products by the GIS and general user communities. Common problems and challenges experienced by these data users include difficulty when: consuming data products from NASA Distributed Active Archive Centers (DAACs) that pre-date modern application software with commercial and open-source geospatial tools; identifying an initial approach for developing a framework and plug-ins that interpret non-compliant data; defining a methodology that is extensible across the NASA Earth Observing System Data and Information System (EOSDIS), scientific communities, and GIS communities by enabling other data centers to construct their own plug-ins and adjust specific data products; and promoting greater use of NASA data and new analyses utilizing GIS tools. To address these challenges and to make EOS data products more accessible and interpretable by GIS applications, a collaborative approach has been taken that includes the NASA Langley Atmospheric Science Data Center (ASDC), Esri, George Mason University (GMU), and the Hierarchical Data Format (HDF) Group to create a framework and plug-ins for the Geospatial Data Abstraction Library (GDAL). This framework and its plug-ins offer the advantage of extensibility within NASA EOSDIS, permitting other data centers to construct their own plug-ins as necessary to adjust their data products. In this session, findings related to the framework and the development of GDAL plug-ins will be reviewed. Specifically, the session will offer a workshop reviewing the documentation and training materials generated to guide other NASA DAACs through the process of constructing plug-ins consistent with the framework, as well as a review of the certification process by which the plug-ins can be independently verified as properly converting data to the format and content required for use in GIS software.
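The plug-in framework described above follows the general pattern of GDAL format drivers, where each driver advertises a test for recognizing a file and a reader for opening it. A toy Python sketch of that registry pattern (the driver name and file test below are hypothetical; the real GDAL driver API is in C/C++):

```python
class DriverRegistry:
    """Toy version of a plug-in registry: each driver exposes an
    identify() test and an open() reader, as GDAL format drivers do."""
    def __init__(self):
        self.drivers = []

    def register(self, driver):
        self.drivers.append(driver)

    def open(self, path):
        # Try each registered driver until one claims the file.
        for drv in self.drivers:
            if drv.identify(path):
                return drv.open(path)
        raise ValueError(f"no driver claims {path}")

class Hdf4EosDriver:
    """Hypothetical plug-in for a non-compliant HDF4-EOS product."""
    name = "HDF4-EOS-custom"

    def identify(self, path):
        return path.endswith(".hdf")

    def open(self, path):
        return {"driver": self.name, "path": path}

reg = DriverRegistry()
reg.register(Hdf4EosDriver())
print(reg.open("MOD04_L2.hdf")["driver"])  # → HDF4-EOS-custom
```

A data center adding support for its own product would register one more driver rather than modifying the registry, which is the extensibility property the framework relies on.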

  3. Improvement of dem Generation from Aster Images Using Satellite Jitter Estimation and Open Source Implementation

    NASA Astrophysics Data System (ADS)

    Girod, L.; Nuth, C.; Kääb, A.

    2015-12-01

    The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) system on board the Terra (EOS AM-1) satellite has been a source of stereoscopic images covering the whole globe at 15 m resolution and consistent quality for over 15 years. The potential of these data for geomorphological analysis and change detection in three dimensions is unrivaled and needs to be exploited. However, the DEMs and ortho-images currently delivered by NASA (ASTER DMO products) are often of insufficient quality for a number of applications, such as mountain glacier mass balance. For this study, the use of Ground Control Points (GCPs) or other ground truth was rejected due to the global, "big data" type of processing that we hope to perform on the ASTER archive. We have therefore developed a tool to compute Rational Polynomial Coefficient (RPC) models from the ASTER metadata, and a method that improves matching quality by identifying and correcting jitter-induced cross-track parallax errors. Our method outputs more accurate DEMs with fewer unmatched areas and reduced overall noise. The algorithms were implemented in the open source photogrammetric library and software suite MicMac.

  4. An Open-source Community Web Site To Support Ground-Water Model Testing

    NASA Astrophysics Data System (ADS)

    Kraemer, S. R.; Bakker, M.; Craig, J. R.

    2007-12-01

    A community wiki web site has been created as a resource to support ground-water model development and testing. The Groundwater Gourmet wiki is a repository for user-supplied analytical and numerical recipes, how-tos, and examples. Members are encouraged to submit analytical solutions, including source code and documentation. A diversity of code snippets is sought in a variety of languages, including Fortran, C, C++, Matlab, and Python. In the spirit of a wiki, all contributions may be edited and altered by other users, and open source licensing is promoted. Community-accepted contributions are graduated into the library of analytic solutions and organized under either a Strack (Groundwater Mechanics, 1989) or Bruggeman (Analytical Solutions of Geohydrological Problems, 1999) classification. The examples section of the wiki is meant to include laboratory experiments (e.g., Hele-Shaw), classical benchmark problems (e.g., the Henry problem), and controlled field experiments (e.g., the Borden landfill and Cape Cod tracer tests). Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.
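A representative entry for such a library of analytic solutions is the classical Thiem equation for steady-state drawdown around a pumping well in a confined aquifer, s = Q/(2πT) ln(R/r). A minimal sketch with hypothetical parameter values (illustrative of the kind of contribution the wiki collects, not taken from the wiki itself):

```python
import math

def thiem_drawdown(Q, T, R, r):
    """Steady-state drawdown s at radius r from a fully penetrating
    well pumping at rate Q in a confined aquifer of transmissivity T,
    with radius of influence R (the classical Thiem solution)."""
    return Q / (2 * math.pi * T) * math.log(R / r)

# Hypothetical values: Q in m^3/d, T in m^2/d, radii in m.
s = thiem_drawdown(Q=500.0, T=250.0, R=300.0, r=30.0)
print(round(s, 3))  # → 0.733
```

Pairing a short function like this with its derivation and a test case is exactly the recipe-plus-documentation format the site encourages.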

  5. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth, high-latency communication links in order to test how the open source IPsec software reacts to these network constraints. Test cases were built that allowed validation of the testbed and the open source IPsec software; the test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The tested open source IPsec software did not meet all of the requirements, and software changes were suggested to meet them.

  6. Upon the Shoulders of Giants: Open-Source Hardware and Software in Analytical Chemistry.

    PubMed

    Dryden, Michael D M; Fobel, Ryan; Fobel, Christian; Wheeler, Aaron R

    2017-04-18

    Isaac Newton famously observed that "if I have seen further it is by standing on the shoulders of giants." We propose that this sentiment is a powerful motivation for the "open-source" movement in scientific research, in which creators provide everything needed to replicate a given project online, as well as providing explicit permission for users to use, improve, and share it with others. Here, we write to introduce analytical chemists who are new to the open-source movement to best practices and concepts in this area and to survey the state of open-source research in analytical chemistry. We conclude by considering two examples of open-source projects from our own research group, with the hope that a description of the process, motivations, and results will provide a convincing argument about the benefits that this movement brings to both creators and users.

  7. Open-Source 3-D Platform for Low-Cost Scientific Instrument Ecosystem.

    PubMed

    Zhang, C; Wijnen, B; Pearce, J M

    2016-08-01

    The combination of open-source software and hardware provides technically feasible methods to create low-cost, highly customized scientific research equipment. Open-source 3-D printers have proven useful for fabricating scientific tools. Here the capabilities of an open-source 3-D printer are expanded to become a highly flexible scientific platform. An automated low-cost 3-D motion control platform is presented that has the capacity to perform scientific applications, including (1) 3-D printing of scientific hardware; (2) laboratory auto-stirring, measuring, and probing; (3) automated fluid handling; and (4) shaking and mixing. The open-source 3-D platform not only facilitates routine research while radically reducing its cost, but also inspires the creation of a diverse array of custom instruments that can be shared and replicated digitally throughout the world to drive down the cost of research and education further. © 2016 Society for Laboratory Automation and Screening.

  8. OpenSesame: an open-source, graphical experiment builder for the social sciences.

    PubMed

    Mathôt, Sebastiaan; Schreij, Daniel; Theeuwes, Jan

    2012-06-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality, such as support for eyetrackers, input devices, and video playback, is available through plug-ins. OpenSesame can be used in combination with existing software for creating experiments.

  9. The Privacy and Security Implications of Open Data in Healthcare.

    PubMed

    Kobayashi, Shinji; Kane, Thomas B; Paton, Chris

    2018-04-22

    The International Medical Informatics Association (IMIA) Open Source Working Group (OSWG) initiated a group discussion to discuss current privacy and security issues in the open data movement in the healthcare domain from the perspective of the OSWG membership. Working group members independently reviewed the recent academic and grey literature and sampled a number of current large-scale open data projects to inform the working group discussion. This paper presents an overview of open data repositories and a series of short case reports to highlight relevant issues present in the recent literature concerning the adoption of open approaches to sharing healthcare datasets. Important themes that emerged included data standardisation, the inter-connected nature of the open source and open data movements, and how publishing open data can impact the ethics, security, and privacy of informatics projects. The open data and open source movements in healthcare share many common philosophies and approaches, including developing international collaborations across multiple organisations and domains of expertise. Both movements aim to reduce the costs of advancing scientific research and improving healthcare provision for people around the world by adopting open intellectual property licence agreements and codes of practice. Implications of the increased adoption of open data in healthcare include the need to balance the security and privacy challenges of opening data sources with the potential benefits of open data for improving research and healthcare delivery. Georg Thieme Verlag KG Stuttgart.

  10. Arctic Sea Salt Aerosol from Blowing Snow and Sea Ice Surfaces - a Missing Natural Source in Winter

    NASA Astrophysics Data System (ADS)

    Frey, M. M.; Norris, S. J.; Brooks, I. M.; Nishimura, K.; Jones, A. E.

    2015-12-01

    Atmospheric particles in the polar regions consist mostly of sea salt aerosol (SSA). SSA plays an important role in regional climate change by influencing the surface energy balance, either directly or indirectly via cloud formation. SSA irradiated by sunlight also releases very reactive halogen radicals, which control concentrations of ozone, a pollutant and greenhouse gas. However, models under-predict SSA concentrations in the Arctic during winter, pointing to a missing source. It has recently been suggested that evaporating salty blowing snow above sea ice may be that source, as it can produce more SSA than equivalent areas of open ocean. Participation in the Norwegian Young Sea Ice Cruise (N-ICE 2015) on board the research vessel Lance allowed us to test this hypothesis in the Arctic sea ice zone during winter. Measurements were carried out from the ship, frozen into the pack ice north of 80° N, during February and March 2015. Observations at ground level (0.1-2 m) and from the ship's crow's nest (30 m) included number concentrations and size spectra of SSA (diameter range 0.3-10 μm) as well as snow particles (diameter range 50-500 μm). Significant SSA production was observed during and after blowing snow events. In both the aerosol and the snow phase, sulfate is fractionated with respect to sea water, which confirms that sea ice surfaces and salty snow, not the open ocean, are the dominant source of airborne SSA. The aerosol shows depletion in bromide with respect to sea water, especially after sunrise, indicating photochemically driven release of bromine. We discuss the SSA source strength from blowing snow in light of environmental conditions (wind speed, atmospheric turbulence, temperature, and snow salinity) and recommend improved model parameterisations to estimate regional aerosol production. The N-ICE 2015 results are then compared to a similar study carried out previously in the Weddell Sea during the Antarctic winter.

  11. Spatial and temporal distributions of bromoform and dibromomethane in the Atlantic Ocean and their relationship with photosynthetic biomass

    NASA Astrophysics Data System (ADS)

    Liu, Yina; Yvon-Lewis, Shari A.; Thornton, Daniel C. O.; Butler, James H.; Bianchi, Thomas S.; Campbell, Lisa; Hu, Lei; Smith, Richard W.

    2013-08-01

    Atmospheric mixing ratios and seawater concentrations of bromoform (CHBr3), dibromomethane (CH2Br2), and other brominated very short-lived substances (BrVSLS) were measured during five cruises from 1994 to 2010. These cruises were conducted over large latitudinal (62°N-60°S) and longitudinal (11°W-86°W) transects in the Atlantic Ocean. Elevated seawater concentrations of CHBr3 and CH2Br2 were often observed in regions where chlorophyll a concentrations were also elevated, which suggests biogeochemical processes associated with photosynthetic biomass may be related to CHBr3 and CH2Br2 production. Our results suggest that, at least in the open ocean, several phytoplankton taxa may contribute to the production of these trace gases. While observed correlations between CHBr3 and CH2Br2 in different regions are usually interpreted as indicating common sources for these compounds, results in this study suggest different biogeochemical processes may contribute separately to the production of these trace gases. Heterotrophic bacterial abundance was significantly correlated with CH2Br2, but not with CHBr3, which suggests the biogeochemical processes associated with heterotrophic bacteria may be related to CH2Br2 in seawater but probably not to CHBr3. In general, the Atlantic Ocean is a net source for CHBr3 and CH2Br2, except for a few locations where these trace gases were undersaturated in seawater. Assuming fluxes measured in the Atlantic open ocean are globally representative, the extrapolated global open-ocean annual net sea-to-air fluxes calculated from the five cruises were estimated at 0.24-3.80 Gmol Br yr-1 for CHBr3 and 0.11-0.77 Gmol Br yr-1 for CH2Br2.

  12. Geo Issue Tracking System

    NASA Astrophysics Data System (ADS)

    Khakpour, Mohammad; Paulik, Christoph; Hahn, Sebastian

    2016-04-01

    Communication about remote sensing data quality between data providers and users, as well as among users, is often difficult. Users have a hard time figuring out whether a product has known problems over their region of interest, and data providers have to spend a lot of effort to make this information available, if it exists. Scientific publications are one tool for communicating with the user base, but they are static and mostly one-way. As a data provider, it is also often difficult to make feedback received from users available to the complete user base. The Geo Issue Tracking System (GeoITS) is an open source web application that has been developed to mitigate these problems. GeoITS combines a mapping interface (Google Maps) with a simple wiki platform. It allows users to give region-specific feedback on a remote sensing product by drawing a polygon on the map and describing the problems they had using the product in that area. These geolocated wiki entries are then viewable by other users as well as the data providers, who can modify and extend the entries. In this way the conversations between users and the data provider are no longer hidden in, for example, emails, but open to all users of the dataset. This new kind of communication platform can enable better cooperation between users and data providers. It will also give data providers the ability to track problems their dataset might have in certain areas and resolve them in new product releases. The source code is available at http://github.com/TUW-GEO/geoits_dev and a running instance can be tried at https://geoits.herokuapp.com/
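The abstract does not specify how GeoITS stores its geolocated entries; one natural representation for a polygon-plus-comment record is a GeoJSON Feature. A hypothetical sketch (the dataset name, comment, and coordinates are invented for illustration):

```python
import json

# One plausible shape for a geolocated feedback entry; GeoITS's actual
# storage schema is not described in the abstract.
entry = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        # GeoJSON linear ring: first and last positions must coincide.
        "coordinates": [[[16.0, 48.0], [17.0, 48.0],
                         [17.0, 49.0], [16.0, 48.0]]],
    },
    "properties": {
        "product": "ASCAT soil moisture",  # hypothetical dataset name
        "comment": "Retrievals noisy over dense forest in this region.",
        "author": "user123",
    },
}
print(entry["geometry"]["type"])  # → Polygon
serialized = json.dumps(entry)    # ready to store or serve over HTTP
```

Using GeoJSON would let a mapping front end such as the Google Maps interface render the polygons directly, and keeps the entries interchangeable with other geospatial tooling.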

  13. Simulation of partially coherent light propagation using parallel computing devices

    NASA Astrophysics Data System (ADS)

    Magalhães, Tiago C.; Rebordão, José M.

    2017-08-01

    Light acquires or loses coherence as it propagates, and coherence is one of the few optical observables. Spectra can be derived from coherence functions, and understanding any interferometric experiment also relies upon them. Beyond the two limiting cases (full coherence or incoherence), the coherence of light is always partial, and it changes with propagation. We have implemented a code to compute the propagation of partially coherent light from the source plane to the observation plane using parallel computing devices (PCDs). In this paper, we restrict propagation to free space. To this end, we used the Open Computing Language (OpenCL) and the open-source toolkit PyOpenCL, which gives access to OpenCL parallel computation from Python. To test our code, we chose two coherence source models: an incoherent source and a Gaussian Schell-model source. In the former case, we considered two source shapes: circular and rectangular. The results were compared to theoretical values. Our implementation allows one to choose between the PyOpenCL version and a standard one, i.e., using the CPU only. To compare the computation times of the two implementations (PyOpenCL and standard), we used several computer systems with different CPUs and GPUs. We used powers of two for the dimensions of the cross-spectral density matrix (e.g., 32⁴, 64⁴), and a significant speed increase is observed in the PyOpenCL implementation when compared to the standard one. This can be an important tool for studying new source models.
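The Gaussian Schell-model source mentioned above has a standard closed form for its cross-spectral density, W(x1, x2) = sqrt(S(x1)S(x2)) μ(x1 − x2), with a Gaussian spectral density and a Gaussian degree of coherence. A minimal pure-Python sketch on a small 1-D grid (the parameter values are hypothetical, and the paper's own code uses PyOpenCL on much larger, higher-dimensional grids):

```python
import math

def gsm_csd(x, sigma_s, delta):
    """Cross-spectral density W(x1, x2) of a 1-D Gaussian Schell-model
    source: Gaussian intensity of width sigma_s and Gaussian degree of
    coherence of width delta (standard textbook form)."""
    def w(x1, x2):
        envelope = math.exp(-(x1**2 + x2**2) / (4 * sigma_s**2))  # sqrt(S1*S2)
        coherence = math.exp(-((x1 - x2) ** 2) / (2 * delta**2))  # mu(x1-x2)
        return envelope * coherence
    return [[w(x1, x2) for x2 in x] for x1 in x]

# 64-point source grid over +/- 1 mm; sigma_s and delta are hypothetical.
n = 64
x = [-1e-3 + 2e-3 * i / (n - 1) for i in range(n)]
W = gsm_csd(x, sigma_s=2.5e-4, delta=1e-4)
print(len(W), len(W[0]))  # → 64 64
```

Propagating such a source then amounts to transforming this matrix with the free-space propagator applied to both arguments, which is the step the paper offloads to OpenCL devices.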

  14. Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets.

    PubMed

    Clark, Alex M; Dole, Krishna; Coulon-Spektor, Anna; McNutt, Andrew; Grass, George; Freundlich, Joel S; Reynolds, Robert C; Ekins, Sean

    2015-06-22

    On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade, yet they are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the inability to share such models remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operating characteristic (ROC) curve values comparable to those generated previously in prior publications using alternative tools. We describe how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user's own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting them in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery.
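    The kind of Bayesian model described, per-bit contributions computed over binary fingerprints, can be sketched with a toy dataset. This is one common Laplacian-corrected form used in cheminformatics, shown here as an assumption of the general scheme rather than the exact CDK implementation; real FCFP6 fingerprints and datasets are not shown.

    ```python
    # Toy sketch of a Laplacian-corrected Bayesian model over binary
    # fingerprint bits (hedged illustration of the general scheme, not the
    # CDK module itself).
    import numpy as np

    def fit_bayesian(fp, active):
        """Per-bit weights w_i = log((A_i + 1) / ((T_i + 1/P) * P)), where
        A_i = actives containing bit i, T_i = all compounds containing bit i,
        and P = overall active rate (one common Laplacian-corrected form)."""
        fp, active = np.asarray(fp, float), np.asarray(active)
        p = active.mean()
        a = fp[active == 1].sum(axis=0)
        t = fp.sum(axis=0)
        return np.log((a + 1.0) / ((t + 1.0 / p) * p))

    def score(fp_row, weights):
        """Sum the weights of the bits set in one compound's fingerprint."""
        return float(np.asarray(fp_row) @ weights)

    # Tiny made-up dataset: 4 compounds x 3 fingerprint bits, 2 actives.
    fp = [[1, 0, 1], [1, 1, 0], [0, 1, 0], [0, 0, 1]]
    active = [1, 1, 0, 0]
    w = fit_bayesian(fp, active)
    print(score(fp[0], w) > score(fp[2], w))  # active-like compound scores higher
    ```

    Because fitting reduces to per-bit counting, such models are cheap to train and easy to serialize, which is what makes exchanging them between proprietary and open source tools practical.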

  15. Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets

    PubMed Central

    2015-01-01

    On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade, yet they are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the inability to share such models remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operating characteristic (ROC) curve values comparable to those generated previously in prior publications using alternative tools. We describe how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user's own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting them in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery. PMID:25994950

  16. GOC-TX: A Reliable Ticket Synchronization Application for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Hayashi, Soichi; Gopu, Arvind; Quick, Robert

    2011-12-01

    One of the major operational issues faced by large multi-institutional collaborations is permitting their users and support staff to work in their native ticket-tracking environments while also exchanging tickets with collaborators. After several failed attempts at email-parser-based ticket exchange, the OSG Operations Group has designed a comprehensive ticket-synchronizing application. The GOC-TX application uses the web-service interfaces offered by various commercial, open source, and homegrown ticketing systems to synchronize tickets between two or more of these systems. GOC-TX operates independently of any ticketing system. It can be triggered by a ticketing system via email, active messaging, or a web-services call; it then checks the current sync status, pulls the applicable updates made to the source ticket since the prior synchronization, and applies them to a destination ticket. The currently deployed production version of GOC-TX is able to synchronize tickets between the Numara Footprints ticketing system used by the OSG and the following systems: the European Grid Initiative's Global Grid User Support (GGUS) system and the Request Tracker (RT) system used by Brookhaven. Additional interfaces to the BMC Remedy system used by Fermilab, and to other instances of RT used by other OSG partners, are expected to be completed in summer 2010. A fully configurable open source version is expected to be made available by early autumn 2010. This paper covers the structure of the GOC-TX application, its evolution, and the problems encountered by the OSG Operations Group with ticket exchange within the OSG Collaboration.
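    The sync cycle described, pull updates made since the last synchronization from the source ticket and apply them to the destination, can be sketched schematically. The `Ticket` class, the watermark mechanism, and the "[synced]" marker are hypothetical; the real GOC-TX talks to Footprints, GGUS, RT, etc. over their web-service interfaces.

    ```python
    # Schematic sketch of a GOC-TX-style sync cycle (hypothetical
    # interfaces; not the actual GOC-TX code).
    class Ticket:
        def __init__(self):
            self.updates = []          # (timestamp, text) pairs

    def sync(source, dest, last_sync):
        """Copy source updates newer than last_sync to dest; return the
        new high-water mark so the next cycle only pulls fresh updates."""
        new = [(ts, txt) for ts, txt in source.updates if ts > last_sync]
        for ts, txt in new:
            dest.updates.append((ts, "[synced] " + txt))
        return max((ts for ts, _ in new), default=last_sync)

    src, dst = Ticket(), Ticket()
    src.updates = [(1, "opened"), (5, "user replied"), (9, "fixed")]
    watermark = sync(src, dst, last_sync=4)
    print(len(dst.updates), watermark)
    ```

    Keeping a per-ticket watermark is what lets the engine stay stateless about the ticketing systems themselves and be triggered from either side.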

  17. Open-Source Radiation Exposure Extraction Engine (RE3) with Patient-Specific Outlier Detection.

    PubMed

    Weisenthal, Samuel J; Folio, Les; Kovacs, William; Seff, Ari; Derderian, Vana; Summers, Ronald M; Yao, Jianhua

    2016-08-01

    We present an open-source, picture archiving and communication system (PACS)-integrated radiation exposure extraction engine (RE3) that provides study-, series-, and slice-specific data for automated monitoring of computed tomography (CT) radiation exposure. RE3 was built using open-source components and integrates seamlessly with the PACS. RE3 calculations of dose length product (DLP) from the Digital Imaging and Communications in Medicine (DICOM) headers showed high agreement (R² = 0.99) with the vendor dose pages. For study-specific outlier detection, RE3 constructs robust, automatically updating multivariable regression models to predict DLP in the context of patient gender and age, scan length, water-equivalent diameter (D_w), and scanned body volume (SBV). As proof of concept, the model was trained on 811 CT chest, abdomen + pelvis (CAP) exams and 29 outliers were detected. The continuous variables used in the outlier detection model were scan length (R² = 0.45), D_w (R² = 0.70), SBV (R² = 0.80), and age (R² = 0.01). The categorical variables were gender (male average 1182.7 ± 26.3 and female 1047.1 ± 26.9 mGy cm) and pediatric status (pediatric average 710.7 ± 73.6 mGy cm and adult 1134.5 ± 19.3 mGy cm).
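    The study-specific outlier detection amounts to regressing DLP on the predictors and flagging exams with unusually large residuals. The sketch below uses synthetic numbers and only two of the predictors; the coefficients, noise level, and |z| > 3 threshold are assumptions for illustration, not RE3's actual model.

    ```python
    # Sketch of RE3-style outlier detection: fit a multivariable regression
    # for DLP, then flag studies with large standardized residuals
    # (synthetic data; the real model also uses SBV, age, gender, etc.).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    scan_len = rng.uniform(30, 70, n)                      # scan length [cm]
    dw = rng.uniform(20, 40, n)                            # water-equiv. diameter [cm]
    dlp = 10 * scan_len + 25 * dw + rng.normal(0, 30, n)   # toy DLP [mGy cm]
    dlp[0] += 500                                          # plant one outlier

    X = np.column_stack([np.ones(n), scan_len, dw])        # design matrix
    beta, *_ = np.linalg.lstsq(X, dlp, rcond=None)         # least-squares fit
    resid = dlp - X @ beta
    z = resid / resid.std()                                # standardized residuals
    outliers = np.flatnonzero(np.abs(z) > 3)               # flag |z| > 3

    print(0 in outliers)
    ```

    Predicting DLP from patient- and scan-specific covariates, rather than using a fixed dose threshold, is what makes the flagged exams patient-specific outliers.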

  18. A clinic compatible, open source electrophysiology system.

    PubMed

    Hermiz, John; Rogers, Nick; Kaestner, Erik; Ganji, Mehran; Cleary, Dan; Snider, Joseph; Barba, David; Dayeh, Shadi; Halgren, Eric; Gilja, Vikash

    2016-08-01

    Open source electrophysiology (ephys) recording systems have several advantages over commercial systems, such as customization and affordability, enabling more researchers to conduct ephys experiments. Notable open source ephys systems include Open-Ephys, NeuroRighter and, more recently, Willow, all of which feature high channel counts (64+), scalability, and advanced software to develop on top of. However, little work has been done to build an open source ephys system that is clinic compatible, particularly in the operating room, where acute human electrocorticography (ECoG) research is performed. We developed an affordable (< $10,000) and open system for research purposes that features power isolation for patient safety, compact and water-resistant enclosures, and 256 recording channels sampled at up to 20 ksamples/s with 16-bit resolution. The system was validated by recording ECoG with a high-density thin-film device during an acute, awake craniotomy study at the UC San Diego Thornton Hospital Operating Room.

  19. Freeing Worldview's development process: Open source everything!

    NASA Astrophysics Data System (ADS)

    Gunnoe, T.

    2016-12-01

    Freeing your code and your project are important steps for creating an inviting environment for collaboration, with the added side effect of keeping a good relationship with your users. NASA Worldview's codebase was released with the open source NOSA (NASA Open Source Agreement) license in 2014, but this is only the first step. We also have to free our ideas, empower our users by involving them in the development process, and open channels that lead to the creation of a community project. There are many highly successful examples of Free and Open Source Software (FOSS) projects of which we can take note: the Linux kernel, Debian, GNOME, etc. These projects owe much of their success to having a passionate mix of developers/users with a great community and a common goal in mind. This presentation will describe the scope of this openness and how Worldview plans to move forward with a more community-inclusive approach.

  20. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines and developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture that allows one to (i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, (ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneity and the connectivity of landscape objects, and (iii) run and explore simulations in many ways: using the OpenFLUID user interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, based on an open source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers.
OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfers, diagnosis and prediction of water quality taking into account human activities, studies of the effect of spatial organization on hydrological fluxes, modelling of surface-subsurface water exchanges, … At the LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID website: http://www.openfluid-project.org
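    The "landscape as a connected graph of units" idea can be sketched in miniature: each spatial unit passes its outgoing flux to its downstream neighbour at every time step. The unit names, the routing order, and the runoff coefficient are hypothetical; this is not the OpenFLUID C++ API.

    ```python
    # Minimal sketch of flux routing over a connectivity graph of landscape
    # units (hypothetical illustration, not OpenFLUID code).
    downstream = {"plot_A": "ditch", "plot_B": "ditch", "ditch": "reach", "reach": None}
    storage = {u: 0.0 for u in downstream}

    def step(rainfall, runoff_coeff=0.5):
        """One time step: rain falls on the plots, then each unit routes a
        fraction of its storage to its downstream neighbour."""
        for plot in ("plot_A", "plot_B"):
            storage[plot] += rainfall
        for unit in ("plot_A", "plot_B", "ditch"):   # upstream-to-downstream order
            out = storage[unit] * runoff_coeff
            storage[unit] -= out
            storage[downstream[unit]] += out

    step(rainfall=10.0)
    print(round(storage["ditch"], 2), round(storage["reach"], 2))
    ```

    In OpenFLUID the graph is hierarchical and the per-unit functions are pluggable simulators, but the traversal-and-exchange pattern is the same.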

  1. Interim Open Source Software (OSS) Policy

    EPA Pesticide Factsheets

    This interim Policy establishes a framework to implement the requirements of the Office of Management and Budget's (OMB) Federal Source Code Policy to achieve efficiency, transparency and innovation through reusable and open source software.

  2. Catalysis for biomass and CO2 use through solar energy: opening new scenarios for a sustainable and low-carbon chemical production.

    PubMed

    Lanzafame, Paola; Centi, Gabriele; Perathoner, Siglinda

    2014-11-21

    The use of biomass, bio-waste and CO2-derived raw materials, the latter synthesized using H2 produced from renewable energy sources, opens new scenarios for developing a sustainable and low-carbon chemical production, particularly in regions such as Europe that lack other resources. This tutorial review first discusses this new scenario, with the aim of pointing out, among the different possible options, those most relevant to enabling this future for chemical production, commenting in particular on the different drivers (economic, technological and strategic, environmental and sustainability, and socio-political) that guide the selection. The use of non-fossil raw materials for the sustainable production of light olefins is discussed in more detail, but the production of other olefins and polyolefins, of drop-in intermediates and of other platform molecules is also analysed. The final part discusses the role of catalysis in establishing this new scenario, summarizing the development of catalysts with respect to industrial targets for (i) the production of light olefins by catalytic dehydration of ethanol and by CO2 conversion via the FTO process, (ii) the catalytic synthesis of butadiene from ethanol, butanol and butanediols, and (iii) the catalytic synthesis of HMF and its conversion to 2,5-FDCA, adipic acid, caprolactam and 1,6-hexanediol.

  3. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  4. Open Source Software Development Experiences on the Students' Resumes: Do They Count?--Insights from the Employers' Perspectives

    ERIC Educational Resources Information Center

    Long, Ju

    2009-01-01

    Open Source Software (OSS) is a major force in today's Information Technology (IT) landscape. Companies are increasingly using OSS in mission-critical applications. The transparency of the OSS technology itself with openly available source codes makes it ideal for students to participate in the OSS project development. OSS can provide unique…

  5. Open Source Initiative Powers Real-Time Data Streams

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.

  6. Xtreme Learning Control: Examples of the Open Source Movement's Impact on Our Educational Practice in a University Setting.

    ERIC Educational Resources Information Center

    Dunlap, Joanna C.; Wilson, Brent G.; Young, David L.

    This paper describes how Open Source philosophy, a movement that has developed in opposition to the proprietary software industry, has influenced educational practice in the pursuit of scholarly freedom and authentic learning activities for students and educators. This paper provides a brief overview of the Open Source movement, and describes…

  7. Adopting Open-Source Software Applications in U. S. Higher Education: A Cross-Disciplinary Review of the Literature

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2009-01-01

    Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…

  8. Assessing the Impact of Security Behavior on the Awareness of Open-Source Intelligence: A Quantitative Study of IT Knowledge Workers

    ERIC Educational Resources Information Center

    Daniels, Daniel B., III

    2014-01-01

    There is a lack of literature linking end-user behavior to the availability of open-source intelligence (OSINT). Most OSINT literature has been focused on the use and assessment of open-source intelligence, not the proliferation of personally or organizationally identifiable information (PII/OII). Additionally, information security studies have…

  9. Looking toward the Future: A Case Study of Open Source Software in the Humanities

    ERIC Educational Resources Information Center

    Quamen, Harvey

    2006-01-01

    In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The plpdfa software is a product of an LDRD project at LLNL entitled "Adaptive Sampling for Very High Throughput Data Streams" (tracking number 11-ERD-035). The software was developed by a graduate student summer intern, Chris Challis, who worked under project PI Dan Merl during the summer of 2011. The source code implements a statistical analysis technique for clustering and classification of text-valued data. The method had been previously published by the PI in the open literature.

  11. Analyst Performance Measures. Volume 1: Persistent Surveillance Data Processing, Storage and Retrieval

    DTIC Science & Technology

    2011-09-01

    solutions to address these important challenges. The Air Force is seeking innovative architectures to process and store massive data sets in a flexible... Google Earth, the VideoLAN Client (VLC) media player, and the Environmental Systems Research Institute's (ESRI) ArcGIS product — to... Earth, Quantum GIS, VLC Media Player, NASA WorldWind, ESRI ArcGIS and many others. Open source GIS and media visualization software can also be

  12. VisIt: An End-User Tool for Visualizing and Analyzing Very Large Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childs, Hank; Brugger, Eric; Whitlock, Brad

    2012-11-01

    VisIt is a popular open source tool for visualizing and analyzing big data. It owes its success to its foci of increasing data understanding, large data support, and providing a robust and usable product, as well as its underlying design that fits today's supercomputing landscape. This report, which draws heavily from an earlier publication at the SciDAC Conference in 2011, describes the VisIt project and its accomplishments.

  13. Preparing a scientific manuscript in Linux: Today's possibilities and limitations.

    PubMed

    Tchantchaleishvili, Vakhtang; Schmitto, Jan D

    2011-10-22

    An increasing number of scientists are enthusiastic about using free, open source software for their research. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow one to prepare a submission-ready scientific manuscript without the need for proprietary software. Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes the key steps for preparing a publication-ready scientific manuscript in a Linux-based operating system and discusses the necessary software components. This manuscript was itself created using Linux and open source programs for Linux.

  14. Open Source Service Agent (OSSA) in the intelligence community's Open Source Architecture

    NASA Technical Reports Server (NTRS)

    Fiene, Bruce F.

    1994-01-01

    The Community Open Source Program Office (COSPO) has developed an architecture for the intelligence community's new Open Source Information System (OSIS). The architecture is a multi-phased program featuring connectivity, interoperability, and functionality. OSIS is based on a distributed architecture concept. The system is designed to function as a virtual entity. OSIS will be a restricted (non-public), user configured network employing Internet communications. Privacy and authentication will be provided through firewall protection. Connection to OSIS can be made through any server on the Internet or through dial-up modems provided the appropriate firewall authentication system is installed on the client.

  15. Exploring the Role of Value Networks for Software Innovation

    NASA Astrophysics Data System (ADS)

    Morgan, Lorraine; Conboy, Kieran

    This paper describes research in progress that aims to explore the applicability and implications of open innovation practices in two firms: one that employs agile development methods and another that utilizes open source software. The open innovation paradigm has much in common with open source and agile development methodologies. A particular strength of agile approaches is that they move away from 'introverted' development, involving only the development personnel, and intimately involve the customer in all areas of software creation, supposedly leading to the development of a more innovative and hence more valuable information system. Open source software (OSS) development also shares two key elements of the open innovation model, namely the collaborative development of the technology and shared rights to the use of the technology. However, one shortfall of agile development in particular is its narrow focus on a single customer representative. In response, we argue that current thinking on innovation needs to be extended to include multiple stakeholders both across and outside the organization. Additionally, for firms utilizing open source, it has been found that their position in a network of potential complementors determines the amount of superior value they create for their customers. Thus, this paper aims to develop a better understanding of the applicability and implications of open innovation practices in firms that employ open source and agile development methodologies. In particular, a conceptual framework is derived for further testing.

  16. Design and Deployment of a General Purpose, Open Source LoRa to Wi-Fi Hub and Data Logger

    NASA Astrophysics Data System (ADS)

    DeBell, T. C.; Udell, C.; Kwon, M.; Selker, J. S.; Lopez Alcala, J. M.

    2017-12-01

    Methods and technologies facilitating internet connectivity and near-real-time status updates for in situ environmental sensor data are of increasing interest in Earth science. However, open source, do-it-yourself technologies that enable plug-and-play functionality for web-connected sensors and devices remain largely inaccessible to typical researchers in our community. The Openly Published Environmental Sensing Lab at Oregon State University (OPEnS Lab) constructed an open source 900 MHz Long Range Radio (LoRa) receiver hub with an SD-card data logger, an Ethernet and Wi-Fi shield, and a 3D-printed enclosure, which dynamically uploads transmissions from multiple wirelessly connected environmental sensing devices. Data transmissions can be received from devices up to 20 km away. The hub time-stamps each transmission, saves it to the SD card, and uploads it to a Google Drive spreadsheet, where it can be accessed in near real time by researchers and geovisualization applications (such as ArcGIS) for visualization and analysis. This research expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. This poster details our methods and evaluates the use of 3D printing, the Arduino Integrated Development Environment (IDE), Adafruit's open-hardware Feather development boards, and the WIZNET5500 Ethernet shield in designing this open source, general-purpose LoRa-to-Wi-Fi data logger.
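    The hub's receive/stamp/log/upload cycle can be sketched schematically. This is Python standing in for illustration; the actual device runs Arduino C++, and the function and variable names here are hypothetical.

    ```python
    # Schematic of a LoRa-hub receive cycle: timestamp each packet, persist
    # it locally, then queue it for upload (illustrative sketch only; the
    # real firmware is Arduino C++ and pushes to a Google Drive spreadsheet).
    import csv
    import io
    from datetime import datetime, timezone

    log = io.StringIO()              # stands in for the SD-card CSV file
    upload_queue = []                # rows awaiting the spreadsheet push

    def on_lora_packet(node_id, payload):
        """Timestamp an incoming transmission, log it, and queue the upload."""
        stamp = datetime.now(timezone.utc).isoformat()
        row = [stamp, node_id] + payload
        csv.writer(log).writerow(row)   # 1) persist locally first
        upload_queue.append(row)        # 2) then push to the cloud sheet

    on_lora_packet("node-07", ["temp=21.4", "vbat=3.91"])
    print(len(upload_queue), upload_queue[0][1])
    ```

    Logging to local storage before attempting the network upload is the usual design choice here: the record survives even if the Wi-Fi link is down when the packet arrives.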

  17. Open Vehicle Sketch Pad Aircraft Modeling Strategies

    NASA Technical Reports Server (NTRS)

    Hahn, Andrew S.

    2013-01-01

    Geometric modeling of aircraft during the Conceptual design phase is very different from that needed for the Preliminary or Detailed design phases. The Conceptual design phase is characterized by the rapid, multi-disciplinary analysis of many design variables by a small engineering team. The designer must walk a line between fidelity and productivity, picking tools and methods with the appropriate balance of characteristics to achieve the goals of the study while staying within the available resources. Identifying which geometric details are important, and which are not, is critical to making modeling and methodology choices. This is true both for the low-order analysis methods traditionally used in Conceptual design and for the highest-order analyses available. This paper highlights some of the characteristics of Conceptual design that drive the designer's choices, as well as modeling examples for several aircraft configurations using the open source version of the Vehicle Sketch Pad (OpenVSP) aircraft Conceptual design geometry modeler.

  18. Sticks AND Carrots: Encouraging Open Science at its source.

    PubMed

    Leonelli, Sabina; Spichtinger, Daniel; Prainsack, Barbara

    2015-06-30

    The Open Science (OS) movement has been seen as an important facilitator of public participation in science. This view is underpinned by the assumption that widespread and free access to research outputs leads to (i) better and more efficient science, (ii) economic growth, in particular for small and medium-sized enterprises wishing to capitalise on research findings, and (iii) increased transparency of knowledge production and its outcomes. The latter in particular could function as a catalyst for public participation and engagement. Whether OS is likely to help realise these benefits, however, will depend on the emergence of systemic incentives for scientists to utilise OS in a meaningful manner. While some areas, such as the environmental sciences, have a long tradition of an open ethos, citizen inclusion and global collaboration, such activities need to be more systematically supported and promoted by funders and learned societies in order to improve scientific research and public participation.

  19. The use of open source electronic health records within the federal safety net.

    PubMed

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    To conduct a federally funded study that examines the acquisition, implementation and operation of open source electronic health records (EHR) within safety net medical settings, such as federally qualified health centers (FQHC). The study was conducted by the National Opinion Research Center (NORC) at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to West Virginia, California and Arizona FQHC that were currently using an open source EHR. Five of the six sites chosen for the study found a number of advantages in the use of their open source EHR system, such as drawing on a large community of users and developers to modify the EHR to fit the needs of their provider and patient communities, and lower acquisition and implementation costs compared to a commercial system. Despite these advantages, many of the informants and site visit participants felt that widespread dissemination and use of open source software was restrained by a negative connotation attached to this type of software. In addition, a number of participants stated that a certain level of technical acumen is needed within the FQHC to make an open source EHR effective. An open source EHR provides advantages for FQHC that have limited resources to acquire and implement an EHR, but additional study is needed to evaluate its overall effectiveness.

  20. Open source electronic health records and chronic disease management

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). Methods and Materials The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Results Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status on homeless patients. Discussion The ability to modify the open source EHR to adapt to the CHC environment, and to leverage the ecosystem of providers and users to assist in this process, provided significant advantages in chronic care management. Improvements in diabetes management and hypertension control, and increases in tuberculosis vaccinations, were supported through the use of these open source systems. Conclusions The flexibility and adaptability of open source EHR demonstrated their utility and viability in the provision of necessary chronic disease care among populations served by CHC. PMID:23813566

  1. Product differentiation among health maintenance organizations: causes and consequences of offering open-ended products.

    PubMed

    Wholey, D R; Christianson, J B

    1994-01-01

    Open-ended products that allow an HMO enrollee to use providers who are not affiliated with the HMO have become an important component of the Clinton administration's health reform proposal, because these products maintain consumer freedom of choice of any provider. However, little is known about the consequences of offering an open-ended product from an organizational standpoint. This paper uses a theory of "spatial competition" to examine the decisions of health maintenance organizations to offer an open-ended product and the effect of offering an open-ended product on their enrollment.

  2. Smoke and mirrors: unanswered questions and misleading statements obscure the truth about organ sources in China.

    PubMed

    Rogers, Wendy A; Trey, Torsten; Fiatarone Singh, Maria; Bridgett, Madeleine; Bramstedt, Katrina A; Lavee, Jacob

    2016-08-01

    This response refutes the claim made in a recent article that organs for transplantation in China will no longer be sourced from executed prisoners. We identify ongoing ethical problems due to the lack of transparent data on current numbers of transplants in China; implausible and conflicting claims about voluntary donations; and obfuscation about who counts as a voluntary donor. The big unanswered question in Chinese transplant ethics is the source of organs, and until there is an open and independently audited system in China, legitimate concerns remain about organ harvesting from prisoners of conscience. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  3. X-Ray source populations in old open clusters: Collinder 261

    NASA Astrophysics Data System (ADS)

    Vats, Smriti; van den Berg, Maureen; Wijnands, Rudy

    2014-09-01

    We are carrying out an X-ray survey of old open clusters with the Chandra X-ray Observatory. Single old stars, being slow rotators, are very faint in X-rays (L_X < 1×10^27 erg/s). Hence, X-rays produced by mass transfer in cataclysmic variables (CVs) or by rapid rotation of the stars in tidally locked, detached binaries (active binaries; ABs) can be detected, without contamination from single stars. By comparing the properties of various types of interacting binaries in different environments (the Galactic field, old open clusters, globular clusters), we aim to study binary evolution and how it may be affected by dynamical encounters with other cluster stars. Stellar clusters are good targets to study binaries, as their age, distance, and chemical composition are well constrained. Collinder (Cr) 261 is an old open cluster (age ~ 7 Gyr) with one of the richest inferred populations of close binaries and blue stragglers of all open clusters, and is therefore an obvious target to study the products of close encounters in open clusters. We will present the first results of this study, detailing the low-luminosity X-ray population of Cr 261, in conjunction with other open clusters in our survey (NGC 188, Berkeley 17, NGC 6253, M67, NGC 6791) and in comparison with populations in globular clusters.

  4. OpenCFU, a new free and open-source software to count cell colonies and other circular objects.

    PubMed

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net.
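
    OpenCFU itself is a compiled application with shape and intensity filters; as a rough illustration of the underlying idea only (threshold the plate image, label connected blobs, filter by size), here is a minimal sketch in Python. All parameters and the synthetic image are invented for the example and do not reflect OpenCFU's actual algorithm.

```python
import numpy as np
from scipy import ndimage

def count_circular_objects(binary, min_area=20):
    """Count connected blobs above a minimum area in a binary image.

    A drastic simplification of colony enumeration: real tools add
    adaptive thresholding, roundness filters, and de-clumping.
    """
    labels, n = ndimage.label(binary)
    sizes = ndimage.sum(binary, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_area))

# Synthetic "plate": three disk-shaped colonies plus one speck of noise.
img = np.zeros((100, 100), dtype=bool)
yy, xx = np.ogrid[:100, :100]
for cy, cx, r in [(25, 25, 8), (60, 70, 6), (80, 20, 7)]:
    img |= (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
img[5, 95] = True  # single-pixel noise, below min_area

print(count_circular_objects(img))  # 3
```

    The size filter is what keeps the single-pixel speck from being counted as a colony.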

  5. Using Open Source Software in Visual Simulation Development

    DTIC Science & Technology

    2005-09-01

    increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a

  6. Open-Source Intelligence in the Czech Military: Knowledge System and Process Design

    DTIC Science & Technology

    2002-06-01

    in Open-Source Intelligence OSINT, as one of the intelligence disciplines, bears some of the general problems of the intelligence "business" OSINT...ADAPTING KNOWLEDGE MANAGEMENT THEORY TO THE CZECH MILITARY INTELLIGENCE Knowledge work is the core business of military intelligence. As...NAVAL POSTGRADUATE SCHOOL Monterey, California THESIS Approved for public release; distribution is unlimited OPEN-SOURCE INTELLIGENCE IN THE

  7. Writing in the Disciplines versus Corporate Workplaces: On the Importance of Conflicting Disciplinary Discourses in the Open Source Movement and the Value of Intellectual Property

    ERIC Educational Resources Information Center

    Ballentine, Brian D.

    2009-01-01

    Writing programs and more specifically, Writing in the Disciplines (WID) initiatives have begun to embrace the use of and the ideology inherent to, open source software. The Conference on College Composition and Communication has passed a resolution stating that whenever feasible educators and their institutions consider open source applications.…

  8. Anatomy of BioJS, an open source community for the life sciences.

    PubMed

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-07-08

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.

  9. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    ERIC Educational Resources Information Center

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…

  10. Altered [99mTc]Tc-MDP biodistribution from neutron activation sourced 99Mo.

    PubMed

    Demeter, Sandor; Szweda, Roman; Patterson, Judy; Grigoryan, Marine

    2018-01-01

    Given potential worldwide shortages of fission-sourced 99Mo/99mTc medical isotopes, there is increasing interest in alternate production strategies. A neutron-activated 99Mo source was utilized in a single-center phase III open-label study comparing 99mTc, as 99mTc methylene diphosphonate ([99mTc]Tc-MDP), obtained from solvent generator separation of neutron-activation-produced 99Mo, versus nuclear-reactor-produced 99Mo (i.e., fission-sourced) in oncology patients for whom a [99mTc]Tc-MDP bone scan would normally have been indicated. Despite the investigational [99mTc]Tc-MDP passing all standard, and above-standard-of-care, quality assurance tests, which would normally be sufficient to allow human administration, there was altered biodistribution that could lead to erroneous clinical interpretation. The cause of the altered biodistribution remains unknown and requires further research.

  11. A Framework for the Systematic Collection of Open Source Intelligence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pouchard, Line Catherine; Trien, Joseph P; Dobson, Jonathan D

    2009-01-01

    Following legislative directions, the Intelligence Community has been mandated to make greater use of Open Source Intelligence (OSINT). Efforts are underway to increase the use of OSINT but there are many obstacles. One of these obstacles is the lack of tools helping to manage the volume of available data and ascertain its credibility. We propose a unique system for selecting, collecting and storing Open Source data from the Web and the Open Source Center. Some data management tasks are automated, document source is retained, and metadata containing geographical coordinates are added to the documents. Analysts are thus empowered to search, view, store, and analyze Web data within a single tool. We present ORCAT I and ORCAT II, two implementations of the system.
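
    ORCAT's internal record layout is not published in this abstract, so the structure below is purely hypothetical; it only illustrates the idea the abstract describes of retaining document source and attaching geographic metadata at collection time.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class OpenSourceDoc:
    """One collected open-source document. Field names are illustrative
    inventions, not ORCAT's actual schema; they mirror the abstract's
    claim that document source is retained and geographic coordinates
    are added as metadata."""
    url: str    # provenance: where the document was collected from
    text: str   # the collected content itself
    collected_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
    lat: Optional[float] = None  # geographic coordinates extracted
    lon: Optional[float] = None  # from the document, if any

doc = OpenSourceDoc(
    url="https://example.org/report",
    text="Flooding reported near the harbor.",
    lat=45.65, lon=13.77)
print(doc.url, doc.lat, doc.lon)
```

    Keeping provenance and coordinates on every record is what lets a single tool support the search, view, and analysis tasks the abstract mentions.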

  12. Order Without Intellectual Property Law: Open Science in Influenza.

    PubMed

    Kapczynski, Amy

    Today, intellectual property (IP) scholars accept that IP as an approach to information production has serious limits. But what lies beyond IP? A new literature on "intellectual production without IP" (or "IP without IP") has emerged to explore this question, but its examples and explanations have yet to convince skeptics. This Article reorients this new literature via a study of a hard case: a global influenza virus-sharing network that has for decades produced critically important information goods, at significant expense, and in a loose-knit group--all without recourse to IP. I analyze the Network as an example of "open science," a mode of information production that differs strikingly from conventional IP, and yet that successfully produces important scientific goods in response to social need. The theory and example developed here refute the most powerful criticisms of the emerging "IP without IP" literature, and provide a stronger foundation for this important new field. Even where capital costs are high, creation without IP can be reasonably effective in social terms, if it can link sources of funding to reputational and evaluative feedback loops like those that characterize open science. It can also be sustained over time, even by loose-knit groups and where the stakes are high, because organizations and other forms of law can help to stabilize cooperation. I also show that contract law is well suited to modes of information production that rely upon a "supply side" rather than "demand side" model. In its most important instances, "order without IP" is not order without governance, nor order without law. Recognizing this can help us better ground this new field, and better study and support forms of knowledge production that deserve our attention, and that sometimes sustain our very lives.

  13. The open-source neutral-mass spectrometer on Atmosphere Explorer-C, -D, and -E.

    NASA Technical Reports Server (NTRS)

    Nier, A. O.; Potter, W. E.; Hickman, D. R.; Mauersberger, K.

    1973-01-01

    The open-source mass spectrometer will be used to obtain the number densities of the neutral atmospheric gases in the mass range 1 to 48 amu at the satellite location. The ion source has been designed to allow gas particles to enter the ionizing region with the minimum practicable number of prior collisions with surfaces. This design minimizes the loss of atomic oxygen and other reactive species due to reactions with the walls of the ion source. The principal features of the open-source spectrometer and the laboratory calibration system are discussed.

  14. Methyl bromide: ocean sources, ocean sinks, and climate sensitivity

    NASA Technical Reports Server (NTRS)

    Anbar, A. D.; Yung, Y. L.; Chavez, F. P.

    1996-01-01

    The oceans play an important role in the geochemical cycle of methyl bromide (CH3Br), the major carrier of O3-destroying bromine to the stratosphere. The quantity of CH3Br produced annually in seawater is comparable to the amount entering the atmosphere each year from natural and anthropogenic sources. The production mechanism is unknown but may be biological. Most of this CH3Br is consumed in situ by hydrolysis or reaction with chloride. The size of the fraction which escapes to the atmosphere is poorly constrained; measurements in seawater and the atmosphere have been used to justify both a large oceanic CH3Br flux to the atmosphere and a small net ocean sink. Since the consumption reactions are extremely temperature-sensitive, small temperature variations have large effects on the CH3Br concentration in seawater, and therefore on the exchange between the atmosphere and the ocean. The net CH3Br flux is also sensitive to variations in the rate of CH3Br production. We have quantified these effects using a simple steady state mass balance model. When CH3Br production rates are linearly scaled with seawater chlorophyll content, this model reproduces the latitudinal variations in marine CH3Br concentrations observed in the east Pacific Ocean by Singh et al. [1983] and by Lobert et al. [1995]. The apparent correlation of CH3Br production with primary production explains the discrepancies between the two observational studies, strengthening recent suggestions that the open ocean is a small net sink for atmospheric CH3Br, rather than a large net source. The Southern Ocean is implicated as a possible large net source of CH3Br to the atmosphere. 
Since our model indicates that both the direction and magnitude of CH3Br exchange between the atmosphere and ocean are extremely sensitive to temperature and marine productivity, and since the rate of CH3Br production in the oceans is comparable to the rate at which this compound is introduced to the atmosphere, even small perturbations to temperature or productivity can modify atmospheric CH3Br. Therefore atmospheric CH3Br should be sensitive to climate conditions. Our modeling indicates that climate-induced CH3Br variations can be larger than those resulting from small (+/- 25%) changes in the anthropogenic source, assuming that this source comprises less than half of all inputs. Future measurements of marine CH3Br, temperature, and primary production should be combined with such models to determine the relationship between marine biological activity and CH3Br production. Better understanding of the biological term is especially important to assess the importance of non-anthropogenic sources to stratospheric ozone loss and the sensitivity of these sources to global climate change.
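
    The steady-state box model described above can be sketched generically: the seawater concentration settles where chlorophyll-scaled production balances temperature-dependent chemical loss (hydrolysis plus chloride substitution) and gas exchange, and the net sea-to-air flux follows from the departure from atmospheric equilibrium. All rate constants below are made-up placeholders for illustration, not the values used by Anbar et al.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def seawater_ch3br_flux(production, chl, T, k_gas=0.5,
                        A=7.0e17, Ea=1.0e5, c_atm_eq=1.0):
    """Steady-state surface-ocean CH3Br concentration and net flux.

    Production is scaled linearly with chlorophyll; chemical loss
    follows an Arrhenius law, so it is strongly temperature-sensitive.
    Constants are illustrative placeholders only.
    """
    p = production * chl                  # chlorophyll-scaled production
    k_chem = A * math.exp(-Ea / (R * T))  # hydrolysis + Cl- substitution
    c_ss = p / (k_chem + k_gas)           # steady state: sources = sinks
    flux = k_gas * (c_ss - c_atm_eq)      # net sea-to-air exchange
    return c_ss, flux

# Warmer water -> faster in-situ destruction -> lower concentration,
# so the same patch of ocean can shift from net source toward net sink.
_, f_cold = seawater_ch3br_flux(10.0, 1.0, 278.0)
_, f_warm = seawater_ch3br_flux(10.0, 1.0, 298.0)
print(f_cold > f_warm)  # True
```

    This is the mechanism behind the abstract's climate-sensitivity claim: because the loss term is exponential in temperature, small warming moves the concentration, and hence the flux direction, disproportionately.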

  15. Methyl bromide: ocean sources, ocean sinks, and climate sensitivity.

    PubMed

    Anbar, A D; Yung, Y L; Chavez, F P

    1996-03-01

    The oceans play an important role in the geochemical cycle of methyl bromide (CH3Br), the major carrier of O3-destroying bromine to the stratosphere. The quantity of CH3Br produced annually in seawater is comparable to the amount entering the atmosphere each year from natural and anthropogenic sources. The production mechanism is unknown but may be biological. Most of this CH3Br is consumed in situ by hydrolysis or reaction with chloride. The size of the fraction which escapes to the atmosphere is poorly constrained; measurements in seawater and the atmosphere have been used to justify both a large oceanic CH3Br flux to the atmosphere and a small net ocean sink. Since the consumption reactions are extremely temperature-sensitive, small temperature variations have large effects on the CH3Br concentration in seawater, and therefore on the exchange between the atmosphere and the ocean. The net CH3Br flux is also sensitive to variations in the rate of CH3Br production. We have quantified these effects using a simple steady state mass balance model. When CH3Br production rates are linearly scaled with seawater chlorophyll content, this model reproduces the latitudinal variations in marine CH3Br concentrations observed in the east Pacific Ocean by Singh et al. [1983] and by Lobert et al. [1995]. The apparent correlation of CH3Br production with primary production explains the discrepancies between the two observational studies, strengthening recent suggestions that the open ocean is a small net sink for atmospheric CH3Br, rather than a large net source. The Southern Ocean is implicated as a possible large net source of CH3Br to the atmosphere. 
Since our model indicates that both the direction and magnitude of CH3Br exchange between the atmosphere and ocean are extremely sensitive to temperature and marine productivity, and since the rate of CH3Br production in the oceans is comparable to the rate at which this compound is introduced to the atmosphere, even small perturbations to temperature or productivity can modify atmospheric CH3Br. Therefore atmospheric CH3Br should be sensitive to climate conditions. Our modeling indicates that climate-induced CH3Br variations can be larger than those resulting from small (+/- 25%) changes in the anthropogenic source, assuming that this source comprises less than half of all inputs. Future measurements of marine CH3Br, temperature, and primary production should be combined with such models to determine the relationship between marine biological activity and CH3Br production. Better understanding of the biological term is especially important to assess the importance of non-anthropogenic sources to stratospheric ozone loss and the sensitivity of these sources to global climate change.

  16. Higgs bosons with large transverse momentum at the LHC

    NASA Astrophysics Data System (ADS)

    Kudashkin, Kirill; Lindert, Jonas M.; Melnikov, Kirill; Wever, Christopher

    2018-07-01

    We compute the next-to-leading order QCD corrections to the production of Higgs bosons with large transverse momentum p⊥ ≫ 2m_t at the LHC. To accomplish this, we combine the two-loop amplitudes for the processes gg → Hg, qg → Hq and qq̄ → Hg, recently computed in the approximation of nearly massless top quarks, with the numerical calculation of the squared one-loop amplitudes for the gg → Hgg, qg → Hqg and qq̄ → Hgg processes. The latter computation is performed with OpenLoops. We find that the QCD corrections to the Higgs transverse momentum distribution at very high p⊥ are large but quite similar to the QCD corrections obtained for a point-like Hgg coupling. Our result removes one of the largest sources of theoretical uncertainty in the description of high-p⊥ Higgs boson production and opens a way to use the high-p⊥ region to search for physics beyond the Standard Model.

  17. High abundances of oxalic, azelaic, and glyoxylic acids and methylglyoxal in the open ocean with high biological activity: Implication for secondary OA formation from isoprene

    NASA Astrophysics Data System (ADS)

    Bikkina, Srinivas; Kawamura, Kimitaka; Miyazaki, Yuzo; Fu, Pingqing

    2014-05-01

    Atmospheric dicarboxylic acids (DCA) are a ubiquitous water-soluble component of secondary organic aerosols (SOA), which can act as cloud condensation nuclei (CCN), affecting the Earth's climate. Despite the high abundances of oxalic acid and related compounds in the marine aerosols, there is no consensus on what controls their distributions over the open ocean. Marine biological productivity could play a role in the production of DCA, but there is no substantial evidence to support this hypothesis. Here we present latitudinal distributions of DCA, oxoacids and α-dicarbonyls in the marine aerosols from the remote Pacific. Their concentrations were found several times higher in more biologically influenced aerosols (MBA) than less biologically influenced aerosols. We propose isoprene and unsaturated fatty acids as sources of DCA as inferred from significantly higher abundances of isoprene-SOA tracers and azelaic acid in MBA. These results have implications toward the reassessment of climate forcing feedbacks of marine-derived SOA.

  18. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Michael T.; Safdari, Masoud; Kress, Jessica E.

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and places few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure; 2) production of example open-source multiphysics tools using IMPACT; and 3) identification and engagement of organizations interested in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub system to anyone interested in multiphysics code coupling. Many of the basic documents explaining the use and architecture of IMPACT are also attached as appendices to this document. 
Online HTML documentation is available through the GitHub site. There are over 100 unit tests provided that run through the Illinois Rocstar Application Development (IRAD) lightweight testing infrastructure, which is supplied along with IMPACT. The package as a whole provides an excellent base for developing high-quality multiphysics applications using modern software development practices. To facilitate understanding of how to use IMPACT effectively, two multiphysics systems have been developed and are available open source through GitHub. The simpler of the two systems, named ElmerFoamFSI in the repository, is a multiphysics, fluid-structure-interaction (FSI) coupling of the solid mechanics package Elmer with a fluid dynamics module from OpenFOAM. This coupling illustrates how to take software packages unrelated by either author or architecture and combine them into a robust, parallel multiphysics system. A more complex multiphysics tool is the Illinois Rocstar Rocstar Multiphysics code, which was rebuilt during the project around IMPACT. Rocstar Multiphysics was already an HPC multiphysics tool, but now that it has been rearchitected around IMPACT, it can be readily expanded to capture new and different physics in the future. In fact, during this project, the Elmer and OpenFOAM tools were also coupled into Rocstar Multiphysics and demonstrated. The full Rocstar Multiphysics codebase is also available on GitHub, licensed for any organization to use as they wish. Finally, the new IMPACT product is already being used in several multiphysics code-coupling projects for the Air Force, NASA, and the Missile Defense Agency, and initial work on expansion of the IMPACT-enabled Rocstar Multiphysics has begun in support of a commercial company. 
These initiatives promise to expand the interest and reach of IMPACT and Rocstar Multiphysics, ultimately leading to the envisioned standardization and consortium of users that was one of the goals of this project.

  19. A Clinician-Centered Evaluation of the Usability of AHLTA and Automated Clinical Practice Guidelines at TAMC

    DTIC Science & Technology

    2011-03-31

    evidence based medicine into clinical practice. It will decrease costs and enable multiple stakeholders to work in an open content/source environment to exchange clinical content, develop and test technology and explore processes in applied CDS. Design: Comparative study between the KMR infrastructure and capabilities developed as an open source, vendor agnostic solution for aCPG execution within AHLTA and the current DoD/MHS standard evaluating: H1: An open source, open standard KMR and Clinical Decision Support Engine can enable organizations to share domain

  20. Applications of open-path Fourier transform infrared for identification of volatile organic compound pollution sources and characterization of source emission behaviors.

    PubMed

    Lin, Chitsan; Liou, Naiwei; Sun, Endy

    2008-06-01

    An open-path Fourier transform infrared spectroscopy (OP-FTIR) system was set up for 3-day continuous line-averaged volatile organic compound (VOC) monitoring in a paint manufacturing plant. Seven VOCs (toluene, m-xylene, p-xylene, styrene, methanol, acetone, and 2-butanone) were identified in the ambient environment. Daytime-only batch operation mode was well explained by the time-series concentration plots. Major sources of methanol, m-xylene, acetone, and 2-butanone were identified in the southeast direction, where paint solvent manufacturing processes are located. However, an attempt to uncover sources of styrene was not successful because the method detection limit (MDL) of the OP-FTIR system was not sensitive enough to produce conclusive data. In the second scenario, the OP-FTIR system was set up in an industrial complex to distinguish the origins of several VOCs. Eight major VOCs were identified in the ambient environment. The pollutant-detection wind-rose percentage plots clearly showed that ethylene, propylene, 2-butanone, and toluene mainly originated from the tank storage area, whereas the source of n-butane was mainly the butadiene manufacturing processes of the refinery plant, and ammonia was identified as an accompanying reduction product in the gasoline desulfuration process. Advantages of OP-FTIR include its ability to simultaneously and continuously analyze many compounds, and its long path length monitoring has also shown advantages in obtaining more comprehensive data than the traditional multiple, single-point monitoring methods.
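
    The wind-rose percentage analysis can be sketched generically: for each wind-direction sector, compute the fraction of measurement intervals in which a compound was detected, so that high-percentage sectors point back toward the emission source. The sectoring scheme and toy data below are invented for illustration, not taken from the study.

```python
import numpy as np

def detection_wind_rose(wind_dir_deg, detected, n_sectors=8):
    """Percentage of samples with pollutant detections per wind sector."""
    wind_dir_deg = np.asarray(wind_dir_deg, dtype=float) % 360.0
    detected = np.asarray(detected, dtype=bool)
    width = 360.0 / n_sectors
    sector = (wind_dir_deg // width).astype(int)  # 0 = N..NE, etc.
    pct = np.zeros(n_sectors)
    for s in range(n_sectors):
        mask = sector == s
        if mask.any():
            pct[s] = 100.0 * detected[mask].mean()
    return pct

# Toy data: detections cluster when the wind blows from ~130-150 deg,
# mirroring the paint-solvent sources found to the southeast.
dirs = [10, 50, 130, 140, 150, 220, 300, 350]
hits = [0,  0,  1,   1,   1,   0,   0,   0]
print(detection_wind_rose(dirs, hits))
```

    In this toy case only the two southeast-facing sectors reach 100%, which is the pattern the paper uses to attribute compounds to source areas.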

  1. Nitrates in drinking water: relation with intensive livestock production.

    PubMed

    Giammarino, M; Quatto, P

    2015-01-01

    An excess of nitrates causes environmental pollution in receiving water bodies and a health risk for humans if contaminated water is used as a source of drinking water. Directive 91/676/CEE [1] aims to reduce the nitrogen pressure from agricultural sources in Europe and identifies the livestock population as one of the predominant sources of the nutrient surplus that can be released into water and air. The Directive covers cattle, sheep, pigs and poultry and their territorial loads, but it does not deal with fish farms, whose effluents may contain pollutants affecting ecosystem water quality. On the basis of multivariate statistical analysis, this paper aims to establish which types of farming affect the presence of nitrates in drinking water in the province of Cuneo, Piedmont, Italy. To this end, we used data from official sources on nitrates in drinking water and data from the Arvet database on the presence of intensive farming in the area considered. For model selection we employed an automatic variable selection algorithm. We identified fish farms as a major source of nitrogen released into the environment, while pollution from sheep and poultry appeared negligible. We would like to emphasize the need to include in the "Nitrate Vulnerable Zones" (as defined in Directive 91/676/CEE [1]) all areas with intensive open-system fish farming. In addition, open-system aquaculture should be equipped with adequate downstream filtering systems to remove nitrates from the wastewater.
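
    The paper does not specify which automatic variable-selection algorithm was used, so the sketch below substitutes a generic greedy forward selection on synthetic data; the predictor names and coefficients are illustrative inventions, not the Cuneo measurements.

```python
import numpy as np

def forward_select(X, y, names, max_vars=2):
    """Greedy forward selection: repeatedly add the predictor that most
    reduces the residual sum of squares of an ordinary least-squares fit."""
    chosen, remaining = [], list(range(X.shape[1]))
    while remaining and len(chosen) < max_vars:
        def rss(j):
            cols = X[:, chosen + [j]]
            A = np.column_stack([np.ones(len(y)), cols])  # add intercept
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            return float(np.sum((y - A @ beta) ** 2))
        best = min(remaining, key=rss)
        chosen.append(best)
        remaining.remove(best)
    return [names[j] for j in chosen]

# Synthetic farm-density predictors of nitrate concentration: only
# 'fish_farms' actually drives y here, so it should be picked first.
rng = np.random.default_rng(0)
X = rng.random((200, 4))
names = ["cattle", "pigs", "poultry", "fish_farms"]
y = 5.0 + 8.0 * X[:, 3] + 0.1 * rng.standard_normal(200)
print(forward_select(X, y, names))
```

    Real stepwise procedures add a stopping criterion (e.g. an F-test or an information criterion such as AIC) instead of a fixed variable count.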

  2. Development of a multi-ampere D− source for fusion applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacquot, C.; Belchenko, Y.; Bucalossi, J.

    1996-07-01

    Large-current and high-current-density deuterium negative ion sources are investigated on the MANTIS test bed with the objective of producing several amperes of D− beams, at an accelerated current density in the range 10–20 mA/cm², for possible application in future neutral beam injectors, e.g. ITER. As a first step, the DRAGON source, which was built by Culham Laboratory, was tested on the MANTIS test bed in order to test this large source using only "pure volume" production of negative ions. The accelerated negative ion current is found to be a strong function of the source operating pressure and the arc power, and a significant isotopic effect is observed. The maximum accelerated currents are 1.3 A of H− (3.3 mA/cm²) and 0.5 A (1.3 mA/cm²) at 110 kW of arc power. Cesium injection from a non-conventional dispenser, together with an improved extraction system, has significantly improved the D− current. A maximum of 14 mA/cm² of D− is accelerated at 30 kV, which corresponds, potentially, to more than 5 A for a full-aperture extraction with an arc power of 140 kW (2250 A of arc current). © 1996 American Institute of Physics.

  3. Deletion of the Saccharomyces cerevisiae ARO8 gene, encoding an aromatic amino acid transaminase, enhances phenylethanol production from glucose.

    PubMed

    Romagnoli, Gabriele; Knijnenburg, Theo A; Liti, Gianni; Louis, Edward J; Pronk, Jack T; Daran, Jean-Marc

    2015-01-01

    Phenylethanol has a characteristic rose-like aroma that makes it a popular ingredient in foods, beverages and cosmetics. Microbial production of phenylethanol currently relies on whole-cell bioconversion of phenylalanine with yeasts that harbour an Ehrlich pathway for phenylalanine catabolism. Complete biosynthesis of phenylethanol from a cheap carbon source, such as glucose, provides an economically attractive alternative to phenylalanine bioconversion. In this study, synthetic genetic array (SGA) screening was applied to identify genes involved in regulation of phenylethanol synthesis in Saccharomyces cerevisiae. The screen focused on transcriptional regulation of ARO10, which encodes the major decarboxylase involved in conversion of phenylpyruvate to phenylethanol. A deletion in ARO8, which encodes an aromatic amino acid transaminase, was found to underlie the transcriptional upregulation of ARO10 during growth, with ammonium sulphate as the sole nitrogen source. Physiological characterization revealed that the aro8Δ mutation led to substantial changes in the absolute and relative intracellular concentrations of amino acids. Moreover, deletion of ARO8 led to de novo production of phenylethanol during growth on a glucose synthetic medium with ammonium as the sole nitrogen source. The aro8Δ mutation also stimulated phenylethanol production when combined with other, previously documented, mutations that deregulate aromatic amino acid biosynthesis in S. cerevisiae. The resulting engineered S. cerevisiae strain produced >3 mM phenylethanol from glucose during growth on a simple synthetic medium. The strong impact of a transaminase deletion on intracellular amino acid concentrations opens new possibilities for yeast-based production of amino acid-derived products. Copyright © 2014 John Wiley & Sons, Ltd.

  4. Preparing a scientific manuscript in Linux: Today's possibilities and limitations

    PubMed Central

    2011-01-01

    Background Increasing numbers of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would make it possible to prepare a submission-ready scientific manuscript without the need for proprietary software. Findings Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, as well as discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux. PMID:22018246

  5. Open source bioimage informatics for cell biology.

    PubMed

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, describe some of the key attributes that make an open source imaging application successful, and point to opportunities for further interoperability that should greatly accelerate future cell biology discovery.

  6. Implementation, reliability, and feasibility test of an Open-Source PACS.

    PubMed

    Valeri, Gianluca; Zuccaccia, Matteo; Badaloni, Andrea; Ciriaci, Damiano; La Riccia, Luigi; Mazzoni, Giovanni; Maggi, Stefania; Giovagnoni, Andrea

    2015-12-01

    To implement a hardware and software system able to perform the major functions of an Open-Source PACS, and to analyze it in a simulated real-world environment. A small home network was implemented, and the open-source operating system Ubuntu 11.10 was installed on a laptop running the Dcm4chee suite together with the required software devices. The Open-Source PACS implemented is compatible with Linux, Microsoft Windows, and Mac OS X; furthermore, it was used with Android and iOS, operating systems that guarantee operation on portable devices (smartphones, tablets). An OSS PACS is useful for running tutorials and workshops on post-processing techniques for educational and training purposes.

  7. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data depends on the software used to manage it. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of this dependence. For example, in some cases, open source software can be deployed to avoid licensing restrictions on using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that maintains open source software can be a valuable source of help, providing an opportunity to collaborate on common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence have been identified at various stages of the data lifecycle. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and improve understanding of software dependency risks for scientific data and how they can be reduced during the data lifecycle.

  8. Utilization of organic matter by invertebrates along an estuarine gradient in an intermittently open estuary

    NASA Astrophysics Data System (ADS)

    Lautenschlager, Agnes D.; Matthews, Ty G.; Quinn, Gerry P.

    2014-08-01

    In intermittently open estuaries, the sources of organic matter sustaining benthic invertebrates are likely to vary seasonally, particularly between periods of connection and disconnection with the ocean and higher and lower freshwater flows. This study investigated the contribution of allochthonous and autochthonous primary production to the diet of representative invertebrate species using stable isotope analysis (SIA) during the austral summer and winter (2008, 2009) in an intermittently open estuary on the south-eastern coast of Australia. As the study was conducted towards the end of a prolonged period of drought, a reduced influence of freshwater/terrestrial organic matter was expected. Sampling was conducted along an estuarine gradient, including upper, middle and lower reaches and showed that the majority of assimilated organic matter was derived from autochthonous estuarine food sources. Additionally, there was an input of allochthonous organic matter, which varied along the length of the estuary, indicated by distinct longitudinal trends in carbon and nitrogen stable isotope signatures along the estuarine gradient. Marine seaweed contributed to invertebrate diets in the lower reaches of the estuary, while freshwater/terrestrial organic matter had increased influence in the upper reaches. Suspension-feeding invertebrates derived large parts of their diet from freshwater/terrestrial material, despite flows being greatly reduced in comparison with non-drought years.

  9. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model in order to minimize model error. Great strides have been made in assimilating traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated into hydrological models. Assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (in time and space) and poorly determined system uncertainty. It is therefore useful to build on a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break DA down into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models capable of all these tasks already exists: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach for interacting with hydrological models during simulation to exchange data at runtime, thus facilitating interactions between models and data sources. The interface is flexible enough that models can interact even if they are coded in different languages, represent processes from different domains or have different spatial and temporal resolutions. An open-source framework that bridges OpenMI and OpenDA is presented. The framework provides a generic and easy means for any OpenMI-compliant model to assimilate observations. A test case is presented using MIKE SHE, an OpenMI-compliant, fully coupled, integrated hydrological model that can accurately simulate the feedback dynamics of overland flow, the unsaturated zone and the saturated zone.
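    The building-block pattern the abstract describes (create a model instance, propagate it, get/set its state, free it when done) can be sketched with a toy scalar model and an ensemble Kalman filter analysis step. This is a minimal Python illustration of the idea, not the actual OpenDA or OpenMI API; the class and method names are hypothetical.

```python
import random

class ToyModel:
    """Stand-in for a model instance; method names are hypothetical,
    not the real OpenMI/OpenDA interface."""
    def __init__(self, state):
        self.state = state

    def propagate(self):
        # Linear storage: the state decays by 10% per time step.
        self.state *= 0.9

    def get_value(self):
        return self.state

    def set_value(self, value):
        self.state = value

def enkf_update(ensemble, observation, obs_error_var):
    """Ensemble Kalman filter analysis step for a scalar state."""
    states = [m.get_value() for m in ensemble]
    mean = sum(states) / len(states)
    var = sum((s - mean) ** 2 for s in states) / (len(states) - 1)
    gain = var / (var + obs_error_var)  # Kalman gain in [0, 1)
    for m in ensemble:
        # Perturbed observations keep the analysis ensemble spread consistent.
        perturbed = observation + random.gauss(0.0, obs_error_var ** 0.5)
        m.set_value(m.get_value() + gain * (perturbed - m.get_value()))
    return gain

random.seed(0)
ensemble = [ToyModel(10.0 + random.gauss(0.0, 2.0)) for _ in range(50)]
for m in ensemble:          # forecast step
    m.propagate()
gain = enkf_update(ensemble, observation=8.0, obs_error_var=0.25)
analysis_mean = sum(m.get_value() for m in ensemble) / len(ensemble)
```

    In a real OpenDA setup the analysis algorithm and its bookkeeping are supplied by the framework; the model wrapper only has to expose instance creation, propagation and state access, as above.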

  10. A cost analysis of microalgal biomass and biodiesel production in open raceways treating municipal wastewater and under optimum light wavelength.

    PubMed

    Kang, Zion; Kim, Byung-Hyuk; Ramanan, Rishiram; Choi, Jong-Eun; Yang, Ji-Won; Oh, Hee-Mock; Kim, Hee-Sik

    2015-01-01

    Open raceway ponds are cost-efficient for mass cultivation of microalgae compared with photobioreactors. Although low-cost options, such as using wastewater as a nutrient source, have been studied as ways past the commercialization threshold for biodiesel production from microalgae, a cost analysis of wastewater use and of other incremental increases in productivity has not been reported. We determined the effect of using wastewater and wavelength filters on microalgal productivity. Experimental results were then fitted into a model, and a cost analysis was performed in comparison with control raceways. Three microalgal strains, Chlorella vulgaris AG10032, Chlorella sp. JK2, and Scenedesmus sp. JK10, were tested for nutrient removal under different light wavelengths (blue, green, red, and white) using filters in batch cultivation. Blue wavelength showed an average of 27% higher nutrient removal and at least 42% higher chemical oxygen demand removal compared with white light. Accordingly, the specific growth rate of microalgae cultivated under blue wavelength was on average 10.8% higher than under white light. Similarly, lipid productivity was highest under blue wavelength, at least 46.8% higher than under white light, whereas FAME composition revealed a mild increase in oleic and palmitic acid levels. Cost analysis reveals that raceways treating wastewater under a monochromatic wavelength would decrease costs from 2.71 to 0.73 $/kg biomass. We show that increasing both biomass and lipid productivity is possible through cost-effective approaches, thereby accelerating the commercialization of low-value products from microalgae, such as biodiesel.

  11. Practices in NASA's EOSDIS to Promote Open Data and Research Integrity

    NASA Astrophysics Data System (ADS)

    Behnke, J.; Ramapriyan, H.

    2017-12-01

    The purpose of this paper is to highlight the key practices adopted by NASA in its Earth Observing System Data and Information System (EOSDIS) to promote and facilitate open data and research integrity. EOSDIS is the system that manages most of NASA's Earth science data from various sources - satellites, aircraft, field campaigns and some research projects. Since its inception in 1990 as a part of the Earth Observing System (EOS) Program, EOSDIS has been following NASA's free and open data and information policy, whereby data are shared with all users on a non-discriminatory basis and are provided at no cost. To ensure that the data are discoverable and accessible to the user community, NASA follows an evolutionary development approach, whereby the latest technologies that can be practically adopted are infused into EOSDIS. This results in continuous improvements in system capabilities, such that technologies users are accustomed to in other environments are brought to bear in their access to NASA's Earth observation data. Mechanisms have existed for ensuring that the data products offered by EOSDIS are vetted by the community before they are released. Information about data products, such as Algorithm Theoretical Basis Documents and quality assessments, is openly available with the products. The EOSDIS Distributed Active Archive Centers (DAACs) work with the science teams responsible for product generation to assist with proper use of metadata. The DAACs have knowledgeable staff to answer users' questions and have access to scientific experts as needed. Citation of data products in scientific papers is facilitated by the assignment of Digital Object Identifiers (DOIs); at present, over 50% of data products in EOSDIS have been assigned DOIs. NASA gathers and publishes citation metrics for the datasets offered by the DAACs. Through its Software and Services Citations Working Group, NASA is currently investigating broadening DOI assignment to promote greater provenance traceability. NASA has developed Preservation Content Specifications for Earth science data to ensure that provenance and context are captured and preserved for the future, and is applying them to data and information from its missions. All of these actions make information available in support of integrity in scientific research.

  12. Matching Livestock Production Systems and Environment

    NASA Astrophysics Data System (ADS)

    Becchetti, T.; Stackhouse, J.; Snell, L.; Lile, D.; George, H.; Harper, J. M.; Larson, S.; Mashiri, F.; Doran, M.; Barry, S.

    2015-12-01

    Livestock production systems vary greatly over the world. Producers try to match the resources they have with the demands of production, which can vary by species, class of animal, number of animals, production goals, and more. Using California's diversity in production systems as an example, we explored how livestock producers best utilize the forage and feed found in different ecosystems and available in different parts of the state. Livestock grazing, the predominant land use in California and in much of the world, makes efficient use of natural vegetation produced without additional water (irrigation) and with minimal inputs such as fertilizer, while often supporting a variety of conservation objectives including vegetation management, fire fuels management, and habitat and open space conservation. The numerous by-products of other sectors of California's agriculture and of the food industries, such as brewer's grain, cottonseed, and almond hulls, are utilized as feed sources for livestock. These by-products are not only an important feed source, especially in drought years, but are also diverted from the waste stream when utilized by livestock. The concept of matching available resources to livestock needs throughout the world is often overlooked, and production systems are often oversimplified in projects conducting life cycle analyses or developing carbon footprints for livestock production systems. This paper provides details on the various production systems found in California, the ecosystems they have adapted to, and how producers use science and ecological knowledge to match the biological requirements of the livestock and conservation objectives to feed and forage resources.

  13. Sharing Water-related Information to Tackle Changes in the Hydrosphere - for Operational Needs (SWITCH-ON)

    NASA Astrophysics Data System (ADS)

    Arheimer, Berit

    2014-05-01

    A recently started EU project (FP7 project No 603587) called SWITCH-ON will establish new infrastructure for water research in Europe. The overall goal of the project is to make use of open data and to add value to society by repurposing and refining data from various sources. SWITCH-ON will establish new forms of water research and facilitate the development of new products and services based on principles of sharing and community building. The basis for this work is a virtual water-science laboratory, which consists of open data, dedicated software tools and a set of protocols, hosted at the "SWITCH-ON water information" portal at http://water-switch-on.eu/. The laboratory will seamlessly integrate the open data with harmonised modelling tools and facilitate virtual experiments in comparative science. Comparative science is a new form of research that will advance science by contrasting water-related processes in different environments and help understand complex processes in a more holistic way than individual studies can. The SWITCH-ON objectives are to use open data for implementing: 1) an innovative spatial information platform with open data tailored for direct water assessments, 2) an entirely new form of collaborative research for water-related sciences, 3) fourteen new operational products and services dedicated to appointed end-users, and 4) new business and knowledge to inform individual and collective decisions in line with Europe's smart growth and environmental objectives. The SWITCH-ON project will be one trigger in a contemporary global movement to better address environmental and societal challenges through openness and collaboration. The poster will present the project's visions and achievements so far, and invite more research groups to use the virtual water-science laboratory.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.

    When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
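    All of the surveyed solvers address the same core problem: maximizing a linear objective over a polyhedron. A dependency-free sketch of that problem, solved by brute-force vertex enumeration with exact rational arithmetic, is shown below; real solvers such as CLP or GLPK use far more efficient simplex or interior-point methods, and the example instance here is invented for illustration.

```python
from fractions import Fraction
from itertools import combinations

def gauss_solve(M, v):
    """Solve M x = v exactly with fractions; return None if singular."""
    n = len(v)
    aug = [list(row) + [rhs] for row, rhs in zip(M, v)]
    for col in range(n):
        pivot = next((r for r in range(col, n) if aug[r][col] != 0), None)
        if pivot is None:
            return None
        aug[col], aug[pivot] = aug[pivot], aug[col]
        piv = aug[col][col]
        aug[col] = [a / piv for a in aug[col]]
        for r in range(n):
            if r != col and aug[r][col] != 0:
                f = aug[r][col]
                aug[r] = [a - f * p for a, p in zip(aug[r], aug[col])]
    return [aug[r][n] for r in range(n)]

def solve_lp(c, A, b):
    """Maximize c.x subject to A x <= b and x >= 0 by checking every
    intersection of n constraint boundaries (a basic solution)."""
    n = len(c)
    rows = [[Fraction(a) for a in row] + [Fraction(rhs)]
            for row, rhs in zip(A, b)]
    for i in range(n):  # encode x_i >= 0 as -x_i <= 0
        rows.append([Fraction(-int(i == j)) for j in range(n)] + [Fraction(0)])
    best_val, best_x = None, None
    for subset in combinations(rows, n):
        x = gauss_solve([r[:n] for r in subset], [r[n] for r in subset])
        if x is None:
            continue
        # Keep only feasible basic solutions, i.e. vertices of the polyhedron.
        if all(sum(r[j] * x[j] for j in range(n)) <= r[n] for r in rows):
            val = sum(ci * xi for ci, xi in zip(c, x))
            if best_val is None or val > best_val:
                best_val, best_x = val, x
    return best_val, best_x

# Maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x, y >= 0.
value, point = solve_lp([3, 2], [[1, 1], [1, 3]], [4, 6])
```

    Enumeration is exponential in the number of constraints, which is exactly why solver quality (and the speed differences the study measured) matters on real problem sizes.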

  15. Open source OCR framework using mobile devices

    NASA Astrophysics Data System (ADS)

    Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan

    2008-02-01

    Mobile phones have evolved from passive one-to-one communication devices into powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet and much more. Exciting new social applications are emerging on the mobile landscape, such as business card readers, sign detectors and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. With all these advancements, however, we find very little open source software available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community. This includes a popular open source OCR engine named Tesseract for text detection and recognition, and the Flite speech synthesis module for adding text-to-speech ability.

  16. Open-source colorimeter.

    PubMed

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-04-19

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.
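    The measurement principle behind such a colorimeter fits in a few lines: convert sensor readings to absorbance via the Beer-Lambert law, then map absorbance to concentration through a linear calibration curve. The sketch below uses invented calibration numbers purely for illustration; it is not code or data from the paper.

```python
import math

def absorbance(sample_intensity, blank_intensity):
    """Beer-Lambert absorbance: A = -log10(I_sample / I_blank)."""
    return -math.log10(sample_intensity / blank_intensity)

def fit_line(xs, ys):
    """Least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration: COD standards (mg/L) vs. measured absorbance.
standards = [0.0, 250.0, 500.0, 1000.0]
readings = [0.00, 0.09, 0.18, 0.36]
slope, intercept = fit_line(standards, readings)

# An unknown sample: raw sensor counts with and without the sample vial.
a = absorbance(sample_intensity=661.0, blank_intensity=1000.0)
cod_mg_per_l = (a - intercept) / slope
```

    In an open-source instrument the intensities would come from a low-cost light sensor read by a microcontroller; everything after that point is the arithmetic above.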

  17. Open-Source Colorimeter

    PubMed Central

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032

  18. OpenMebius: an open source software for isotopically nonstationary 13C-based metabolic flux analysis.

    PubMed

    Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi

    2014-01-01

    The in vivo measurement of metabolic flux by (13)C-based metabolic flux analysis ((13)C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a (13)C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas (13)C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary (13)C metabolic flux analysis (INST-(13)C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms in stationary phase). Here, the development of novel open source software for INST-(13)C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) provides functions for autogenerating metabolic models that simulate isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-(13)C-MFA. Confidence intervals determined by INST-(13)C-MFA were narrower than those determined by conventional methods, indicating the potential of INST-(13)C-MFA for precise metabolic flux analysis. OpenMebius is open source software for the general application of INST-(13)C-MFA.
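    The principle of INST-(13)C-MFA, fitting fluxes so that simulated labeling enrichment matches time-course measurements, can be illustrated with a deliberately tiny model: a single metabolite pool fed fully labeled substrate. This is a hypothetical one-flux sketch, not OpenMebius's model or API.

```python
import math

def labeling(t, flux, pool):
    """Enrichment of one pool fed fully labeled substrate:
    dL/dt = (flux / pool) * (1 - L)  =>  L(t) = 1 - exp(-flux * t / pool)."""
    return 1.0 - math.exp(-flux * t / pool)

def fit_flux(times, observed, pool, candidates):
    """Pick the candidate flux minimizing the residual sum of squares
    between simulated and observed enrichment (grid-search least squares)."""
    def rss(v):
        return sum((labeling(t, v, pool) - y) ** 2
                   for t, y in zip(times, observed))
    return min(candidates, key=rss)

pool = 2.0        # pool size, assumed known from metabolomics
true_flux = 1.5   # "true" flux used only to simulate the measurements
times = [0.5, 1.0, 2.0, 4.0, 8.0]
observed = [labeling(t, true_flux, pool) for t in times]  # noise-free here

candidates = [i / 100.0 for i in range(1, 501)]   # grid 0.01 .. 5.00
flux_hat = fit_flux(times, observed, pool, candidates)
```

    Real INST-(13)C-MFA fits many fluxes at once to mass isotopomer distributions by integrating the labeling ODEs of a full network, but the objective has the same least-squares shape as this toy fit.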

  19. Formation of a high intensity low energy positron string

    NASA Astrophysics Data System (ADS)

    Donets, E. D.; Donets, E. E.; Syresin, E. M.; Itahashi, T.; Dubinov, A. E.

    2004-05-01

    The possibility of producing a high-intensity, low-energy positron beam is discussed. The proposed Positron String Trap (PST) is based on the principles and technology of the Electron String Ion Source (ESIS) developed at JINR during the last decade. A linear version of ESIS has been used successfully for the production of intense, highly charged ion beams of various elements. The Tubular Electron String Ion Source (TESIS) concept is now under study and opens promising new possibilities in physics and technology. In this report, we discuss the application of the tubular-type trap to the storage of positrons cooled to cryogenic temperatures of 0.05 meV. It is intended that a positron flux at an energy of 1-5 eV, produced by an external source, is injected into the Tubular Positron Trap, which has a construction similar to that of the TESIS. The low-energy positrons are then captured in the PST Penning trap and cool down through synchrotron radiation in the strong (5-10 T) applied magnetic field. It is expected that the proposed PST should permit the storage and cooling to cryogenic temperature of up to 5×10^9 positrons. The accumulated cooled positrons can be used for various physics applications, for example, antihydrogen production.

  20. Simulation for Dynamic Situation Awareness and Prediction III

    DTIC Science & Technology

    2010-03-01

    source Java™ library for capturing and sending network packets; 4) Groovy – an open source, Java-based scripting language (version 1.6 or newer). Open...DMOTH Analyzer application. Groovy is an open source dynamic scripting language for the Java Virtual Machine. It is consistent with Java syntax...between temperature, pressure, wind and relative humidity, and 3) a precipitation editing algorithm. The Editor can be used to prepare scripted changes

  1. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  2. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative to implement an institutional repository using the DSpace open source software.
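    The pattern the paper describes, generating pages from a database query instead of maintaining static HTML, looks like this in outline. The original stack is MySQL with PHP; the sketch below uses Python with an in-memory SQLite database only so that it is self-contained, and the table and field names are invented.

```python
import sqlite3

# Populate a small catalogue table (illustrative schema and data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (title TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO records VALUES (?, ?)",
    [("Managing digital archives", 2006), ("Open source library tools", 2007)],
)

def render_page(conn):
    """Build an HTML listing from the database rather than static HTML,
    so the page updates whenever the data does."""
    rows = conn.execute(
        "SELECT title, year FROM records ORDER BY year").fetchall()
    items = "".join(f"<li>{title} ({year})</li>" for title, year in rows)
    return f"<ul>{items}</ul>"

html = render_page(conn)
```

    A PHP page against MySQL follows the same shape: connect, query, and interpolate the result rows into markup at request time.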

  3. Open source tools for fluorescent imaging.

    PubMed

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  5. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    PubMed

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

    There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).

  6. Effect of open air drying, LPG based drier and pretreatments on the quality of Indian gooseberry (aonla).

    PubMed

    Gudapaty, Pratibha; Indavarapu, Srinivas; Korwar, Girish R; Shankar, Arun Kumar; Adake, Ravi Kant V; Bandi, Venkateshwarlu; Kanchu, Srinivas Rao

    2010-10-01

    The aonla fruits (whole fruit, pricked, split, segmented) were subjected to pretreatments such as blanching and osmotic dehydration with salt (2%) and sugar (40%) in different experiments before drying, to obtain a product with better keeping quality. An LPG-based drier (CRIDA drier) with the capacity to dry 50 kg of fresh Indian gooseberry (aonla) was used. Nutritional quality and rehydration characteristics of CRIDA drier dried products were higher, and the products were free from contamination. Drying time was shortest for blanched and osmotically dehydrated segments dried in the CRIDA drier, and the product had better vitamin C retention, rehydration characteristics and sensory acceptability compared to sun-dried or cabinet drier dried product. The additional expenditure on gas in the CRIDA drier is offset by reduced labour cost and the higher price fetched by the better quality product. Alternative energy sources such as biogas and biomass can be used as fuel in the CRIDA drier.

  7. Heterotrophic cultivation of microalgae for pigment production: A review.

    PubMed

    Hu, Jianjun; Nagarajan, Dillirani; Zhang, Quanguo; Chang, Jo-Shu; Lee, Duu-Jong

    Pigments (mainly carotenoids) are important nutraceuticals known for their potent anti-oxidant activities and have been used extensively as high-end health supplements. Microalgae are the most promising sources of natural carotenoids and are devoid of the toxic effects associated with synthetic derivatives. Compared to photoautotrophic cultivation, heterotrophic cultivation of microalgae in well-controlled bioreactors for pigment production has attracted much attention for commercial applications because it overcomes the difficulties associated with supplying CO2 and light, avoids contamination problems, and eliminates the land requirements of open autotrophic culture systems. In this review, the heterotrophic metabolic potential of microalgae and its use in pigment production are comprehensively described. Strategies to enhance pigment production under heterotrophic conditions are critically discussed, and the challenges faced in heterotrophic pigment production are presented along with possible alternative solutions. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. The use of open source electronic health records within the federal safety net

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective To conduct a federally funded study that examines the acquisition, implementation and operation of open source electronic health records (EHR) within safety net medical settings, such as federally qualified health centers (FQHC). Methods and materials The study was conducted by the National Opinion Research Center (NORC) at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to West Virginia, California and Arizona FQHC that were currently using an open source EHR. Results Five of the six sites that were chosen as part of the study found a number of advantages in the use of their open source EHR system, such as utilizing a large community of users and developers to modify their EHR to fit the needs of their provider and patient communities, and lower acquisition and implementation costs as compared to a commercial system. Discussion Despite these advantages, many of the informants and site visit participants felt that widespread dissemination and use of open source was restrained due to a negative connotation regarding this type of software. In addition, a number of participants stated that there is a necessary level of technical acumen needed within the FQHC to make an open source EHR effective. Conclusions An open source EHR provides advantages for FQHC that have limited resources to acquire and implement an EHR, but additional study is needed to evaluate its overall effectiveness. PMID:23744787

  9. Integrating Clinical Trial Imaging Data Resources Using Service-Oriented Architecture and Grid Computing

    PubMed Central

    Cladé, Thierry; Snyder, Joshua C.

    2010-01-01

    Clinical trials which use imaging typically require data management and workflow integration across several parties. We identify opportunities for all parties involved to realize benefits with a modular interoperability model based on service-oriented architecture and grid computing principles. We discuss middleware products for implementation of this model, and propose caGrid as an ideal candidate due to its healthcare focus; free, open source license; and mature developer tools and support. PMID:20449775

  10. An automated, open-source pipeline for mass production of digital elevation models (DEMs) from very-high-resolution commercial stereo satellite imagery

    NASA Astrophysics Data System (ADS)

    Shean, David E.; Alexandrov, Oleg; Moratto, Zachary M.; Smith, Benjamin E.; Joughin, Ian R.; Porter, Claire; Morin, Paul

    2016-06-01

    We adapted the automated, open source NASA Ames Stereo Pipeline (ASP) to generate digital elevation models (DEMs) and orthoimages from very-high-resolution (VHR) commercial imagery of the Earth. These modifications include support for rigorous and rational polynomial coefficient (RPC) sensor models, sensor geometry correction, bundle adjustment, point cloud co-registration, and significant improvements to the ASP code base. We outline a processing workflow for ˜0.5 m ground sample distance (GSD) DigitalGlobe WorldView-1 and WorldView-2 along-track stereo image data, with an overview of ASP capabilities, an evaluation of ASP correlator options, benchmark test results, and two case studies of DEM accuracy. Output DEM products are posted at ˜2 m with direct geolocation accuracy of <5.0 m CE90/LE90. An automated iterative closest-point (ICP) co-registration tool reduces absolute vertical and horizontal error to <0.5 m where appropriate ground-control data are available, with observed standard deviation of ˜0.1-0.5 m for overlapping, co-registered DEMs (n = 14, 17). While ASP can be used to process individual stereo pairs on a local workstation, the methods presented here were developed for large-scale batch processing in a high-performance computing environment. We are leveraging these resources to produce dense time series and regional mosaics for the Earth's polar regions.
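    The co-registration step described above can be illustrated with a deliberately simplified sketch: rather than the full 3D iterative-closest-point alignment used in ASP, the toy function below (all names hypothetical) removes only the median vertical bias between two overlapping DEMs sampled on the same grid, which is the essence of reducing absolute vertical error against ground control.

    ```python
    # Simplified illustration (not the ASP implementation): remove the
    # median vertical bias between two overlapping DEMs on the same grid.
    # NODATA cells are represented as None.

    def coregister_vertical(dem_ref, dem_src, nodata=None):
        """Return dem_src shifted so that its median elevation difference
        against dem_ref over valid overlapping cells is zero."""
        diffs = [r - s
                 for row_r, row_s in zip(dem_ref, dem_src)
                 for r, s in zip(row_r, row_s)
                 if r is not nodata and s is not nodata]
        if not diffs:
            raise ValueError("no overlapping valid cells")
        diffs.sort()
        n = len(diffs)
        bias = diffs[n // 2] if n % 2 else 0.5 * (diffs[n // 2 - 1] + diffs[n // 2])
        shifted = [[(v + bias) if v is not nodata else v for v in row]
                   for row in dem_src]
        return shifted, bias

    dem_a = [[10.0, 11.0], [12.0, 13.0]]   # reference DEM (m)
    dem_b = [[9.5, 10.5], [11.5, None]]    # source DEM with a -0.5 m bias
    shifted, bias = coregister_vertical(dem_a, dem_b)
    print(bias)  # 0.5
    ```

    In practice a full ICP solves for a 3D translation (and optionally rotation) over point clouds, but the median-offset idea above is why co-registered DEM pairs can agree to the ~0.1-0.5 m level quoted in the abstract.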

  11. APPLYING OPEN-PATH OPTICAL SPECTROSCOPY TO HEAVY-DUTY DIESEL EMISSIONS

    EPA Science Inventory

    Non-dispersive infrared absorption has been used to measure gaseous emissions for both stationary and mobile sources. Fourier transform infrared spectroscopy has been used for stationary sources as both extractive and open-path methods. We have applied the open-path method for bo...

  12. Bootstrap inversion technique for atmospheric trace gas source detection and quantification using long open-path laser measurements

    NASA Astrophysics Data System (ADS)

    Alden, Caroline B.; Ghosh, Subhomoy; Coburn, Sean; Sweeney, Colm; Karion, Anna; Wright, Robert; Coddington, Ian; Rieker, Gregory B.; Prasad, Kuldeep

    2018-03-01

    Advances in natural gas extraction technology have led to increased activity in the production and transport sectors in the United States and, as a consequence, an increased need for reliable monitoring of methane leaks to the atmosphere. We present a statistical methodology in combination with an observing system for the detection and attribution of fugitive emissions of methane from distributed potential source location landscapes such as natural gas production sites. We measure long (> 500 m), integrated open-path concentrations of atmospheric methane using a dual frequency comb spectrometer and combine measurements with an atmospheric transport model to infer leak locations and strengths using a novel statistical method, the non-zero minimum bootstrap (NZMB). The new statistical method allows us to determine whether the empirical distribution of possible source strengths for a given location excludes zero. Using this information, we identify leaking source locations (i.e., natural gas wells) through rejection of the null hypothesis that the source is not leaking. The method is tested with a series of synthetic data inversions with varying measurement density and varying levels of model-data mismatch. It is also tested with field observations of (1) a non-leaking source location and (2) a source location where a controlled emission of 3.1 × 10-5 kg s-1 of methane gas is released over a period of several hours. This series of synthetic data tests and outdoor field observations using a controlled methane release demonstrates the viability of the approach for the detection and sizing of very small leaks of methane across large distances (4+ km2 in synthetic tests). The field tests demonstrate the ability to attribute small atmospheric enhancements of 17 ppb to the emitting source location against a background of combined atmospheric (e.g., background methane variability) and measurement uncertainty of 5 ppb (1σ), when measurements are averaged over 2 min. 
The results of the synthetic and field data testing show that the new observing system and statistical approach greatly decreases the incidence of false alarms (that is, wrongly identifying a well site to be leaking) compared with the same tests that do not use the NZMB approach and therefore offers increased leak detection and sizing capabilities.
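    The core of the NZMB idea — testing whether the empirical distribution of possible source strengths for a location excludes zero — can be sketched with an ordinary bootstrap. This is a simplification of the method in the paper, and all numbers below are illustrative, not field data:

    ```python
    import random

    def bootstrap_excludes_zero(estimates, n_boot=2000, alpha=0.05, seed=0):
        """Resample source-strength estimates with replacement; reject the
        'not leaking' null hypothesis if the lower alpha-quantile of the
        bootstrap means is greater than zero."""
        rng = random.Random(seed)
        means = []
        for _ in range(n_boot):
            sample = [rng.choice(estimates) for _ in estimates]
            means.append(sum(sample) / len(sample))
        means.sort()
        return means[int(alpha * n_boot)] > 0.0

    # Toy inversion output: source strengths (kg/s) for a leaking site
    # (mean 3.1e-5, matching the controlled release) and a quiet site.
    leaking = [3.1e-5 + (-1) ** i * 5e-6 for i in range(50)]
    quiet = [(-1) ** i * 5e-6 for i in range(50)]
    print(bootstrap_excludes_zero(leaking))  # True  -> flag as leaking
    print(bootstrap_excludes_zero(quiet))    # False -> cannot reject null
    ```

    Requiring the whole lower tail to clear zero, rather than comparing point estimates, is what suppresses false alarms at non-leaking wells.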

  13. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.

    PubMed

    Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  14. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy

    NASA Astrophysics Data System (ADS)

    Barabas, Federico M.; Masullo, Luciano A.; Stefani, Fernando D.

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  15. Observational insights into aerosol formation from isoprene.

    PubMed

    Worton, David R; Surratt, Jason D; Lafranchi, Brian W; Chan, Arthur W H; Zhao, Yunliang; Weber, Robin J; Park, Jeong-Hoo; Gilman, Jessica B; de Gouw, Joost; Park, Changhyoun; Schade, Gunnar; Beaver, Melinda; Clair, Jason M St; Crounse, John; Wennberg, Paul; Wolfe, Glenn M; Harrold, Sara; Thornton, Joel A; Farmer, Delphine K; Docherty, Kenneth S; Cubison, Michael J; Jimenez, Jose-Luis; Frossard, Amanda A; Russell, Lynn M; Kristensen, Kasper; Glasius, Marianne; Mao, Jingqiu; Ren, Xinrong; Brune, William; Browne, Eleanor C; Pusede, Sally E; Cohen, Ronald C; Seinfeld, John H; Goldstein, Allen H

    2013-10-15

    Atmospheric photooxidation of isoprene is an important source of secondary organic aerosol (SOA) and there is increasing evidence that anthropogenic oxidant emissions can enhance this SOA formation. In this work, we use ambient observations of organosulfates formed from isoprene epoxydiols (IEPOX) and methacrylic acid epoxide (MAE) and a broad suite of chemical measurements to investigate the relative importance of nitrogen oxide (NO/NO2) and hydroperoxyl (HO2) SOA formation pathways from isoprene at a forested site in California. In contrast to IEPOX, the calculated production rate of MAE was observed to be independent of temperature. This is the result of the very fast thermolysis of MPAN at high temperatures that affects the distribution of the MPAN reservoir (MPAN / MPA radical) reducing the fraction that can react with OH to form MAE and subsequently SOA (F(MAE formation)). The strong temperature dependence of F(MAE formation) helps to explain our observations of similar concentrations of IEPOX-derived organosulfates (IEPOX-OS; ~1 ng m(-3)) and MAE-derived organosulfates (MAE-OS; ~1 ng m(-3)) under cooler conditions (lower isoprene concentrations) and much higher IEPOX-OS (~20 ng m(-3)) relative to MAE-OS (<0.0005 ng m(-3)) at higher temperatures (higher isoprene concentrations). A kinetic model of IEPOX and MAE loss showed that MAE forms 10-100 times more ring-opening products than IEPOX and that both are strongly dependent on aerosol water content when aerosol pH is constant. However, the higher fraction of MAE ring opening products does not compensate for the lower MAE production under warmer conditions (higher isoprene concentrations) resulting in lower formation of MAE-derived products relative to IEPOX at the surface. In regions of high NOx, high isoprene emissions and strong vertical mixing the slower MPAN thermolysis rate aloft could increase the fraction of MPAN that forms MAE resulting in a vertically varying isoprene SOA source.

  16. OpenCFU, a New Free and Open-Source Software to Count Cell Colonies and Other Circular Objects

    PubMed Central

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net. PMID:23457446
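    OpenCFU's actual pipeline involves thresholding, filtering and shape scoring; the counting step itself reduces to labelling connected foreground regions in a binarized image, which can be sketched in a few lines (the binary mask and function name here are illustrative, not OpenCFU code):

    ```python
    from collections import deque

    def count_blobs(mask):
        """Count 4-connected foreground components in a binary grid —
        a bare-bones analogue of counting colonies on a thresholded plate."""
        rows, cols = len(mask), len(mask[0])
        seen = [[False] * cols for _ in range(rows)]
        count = 0
        for r in range(rows):
            for c in range(cols):
                if mask[r][c] and not seen[r][c]:
                    count += 1                      # new component found
                    q = deque([(r, c)])
                    seen[r][c] = True
                    while q:                        # flood-fill the component
                        y, x = q.popleft()
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if 0 <= ny < rows and 0 <= nx < cols \
                                    and mask[ny][nx] and not seen[ny][nx]:
                                seen[ny][nx] = True
                                q.append((ny, nx))
        return count

    plate = [
        [0, 1, 1, 0, 0, 0],
        [0, 1, 1, 0, 1, 0],
        [0, 0, 0, 0, 1, 0],
        [1, 0, 0, 0, 0, 0],
    ]
    print(count_blobs(plate))  # 3
    ```

    Real colony counters additionally split touching colonies and score circularity, which is where tools like OpenCFU earn their robustness over this naive labelling.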

  17. Evolution of a chemically reacting plume in a ventilated room

    NASA Astrophysics Data System (ADS)

    Conroy, D. T.; Smith, Stefan G. Llewellyn; Caulfield, C. P.

    2005-08-01

    The dynamics of a second-order chemical reaction in an enclosed space driven by the mixing produced by a turbulent buoyant plume are studied theoretically, numerically and experimentally. An isolated turbulent buoyant plume source is located in an enclosure with a single external opening. Both the source and the opening are located at the bottom of the enclosure. The enclosure is filled with a fluid of a given density with a fixed initial concentration of a chemical. The source supplies a constant volume flux of fluid of different density containing a different chemical of known and constant concentration. These two chemicals undergo a second-order non-reversible reaction, leading to the creation of a third product chemical. For simplicity, we restrict attention to the situation where the reaction process does not affect the density of the fluids involved. Because of the natural constraint of volume conservation, fluid from the enclosure is continually vented. We study the evolution of the various chemical species as they are advected by the developing ventilated filling box process within the room that is driven by the plume dynamics. In particular, we study both the mean and vertical distributions of the chemical species as a function of time within the room. We compare the results of analogue laboratory experiments with theoretical predictions derived from reduced numerical models, and find excellent agreement. Important parameters for the behaviour of the system are associated with the source volume flux and specific momentum flux relative to the source specific buoyancy flux, the ratio of the initial concentrations of the reacting chemical input in the plume and the reacting chemical in the enclosed space, the reaction rate of the chemicals and the aspect ratio of the room. 
Although the behaviour of the system depends on all these parameters in a non-trivial way, in general the concentration within the room of the chemical input at the isolated source passes through three distinct phases. Initially, as the source fluid flows into the room, the mean concentration of the input chemical increases due to the inflow, with some loss due to the reaction with the chemical initially within the room. After a finite time, the layer of fluid contaminated by the inflow reaches the opening to the exterior at the base of the room. During an ensuing intermediate phase, the rate of increase in the concentration of the input chemical then drops non-trivially, due to the extra sink for the input chemical of the outflow through the opening. During this intermediate stage, the concentration of the input chemical continues to rise, but at a rate that is reduced due to the reaction with the fluid in the room. Ultimately, all the fluid (and hence the chemical) that was originally within the room is lost, both through reaction and outflow through the opening, and the room approaches its final steady state, being filled completely with source fluid.
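    The three-phase behaviour described above emerges from a filling-box model with vertical structure. As a much cruder illustration, a zero-dimensional well-mixed analogue of a ventilated room with a second-order reaction A + B -> P can be integrated directly; all parameter values below are assumptions for the sketch, not the paper's:

    ```python
    # Well-mixed box: chemical A enters with the inflow at concentration
    # A_in, chemical B is initially in the room, and all species are
    # vented at rate Q/V while A + B -> P proceeds at rate k*A*B.

    def simulate(A_in=1.0, Q_over_V=0.1, k=0.5, dt=0.01, t_end=200.0):
        """Forward-Euler integration of the ventilated-box reaction model."""
        A, B, P = 0.0, 1.0, 0.0
        t = 0.0
        while t < t_end:
            r = k * A * B                        # second-order reaction rate
            dA = Q_over_V * (A_in - A) - r       # inflow/outflow minus reaction
            dB = -Q_over_V * B - r               # outflow minus reaction
            dP = -Q_over_V * P + r               # outflow plus production
            A, B, P = A + dA * dt, B + dB * dt, P + dP * dt
            t += dt
        return A, B, P

    A, B, P = simulate()
    print(round(A, 3), round(B, 6), round(P, 6))  # 1.0 0.0 0.0
    ```

    The printed steady state matches the final phase in the abstract: the fluid (and chemical) originally in the room is lost through reaction and outflow, and the room ends up filled with source fluid. The intermediate loss of input chemical through the opening shows up here as the -Q/V terms.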

  18. Utilization of open source electronic health record around the world: A systematic review.

    PubMed

    Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam

    2014-01-01

    Many projects on developing Electronic Health Record (EHR) systems have been carried out in many countries. The current study was conducted to review the published data on the utilization of open source EHR systems in different countries all over the world. Using free-text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in successive stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. This creates opportunities to improve national healthcare, especially in developing countries with minimal financial resources. Open source technology is a solution to overcome the problems of high cost and inflexibility associated with proprietary health information systems.

  19. Bioavailable iron in the Southern Ocean: the significance of the iceberg conveyor belt.

    PubMed

    Raiswell, Rob; Benning, Liane G; Tranter, Martyn; Tulaczyk, Slawek

    2008-05-30

    Productivity in the Southern Oceans is iron-limited, and the supply of iron dissolved from aeolian dust is believed to be the main source from outside the marine reservoir. Glacial sediment sources of iron have rarely been considered, as the iron has been assumed to be inert and non-bioavailable. This study demonstrates the presence of potentially bioavailable Fe as ferrihydrite and goethite in nanoparticulate clusters, in sediments collected from icebergs in the Southern Ocean and glaciers on the Antarctic landmass. Nanoparticles in ice can be transported by icebergs away from coastal regions in the Southern Ocean, enabling melting to release bioavailable Fe to the open ocean. The abundance of nanoparticulate iron has been measured by an ascorbate extraction. This data indicates that the fluxes of bioavailable iron supplied to the Southern Ocean from aeolian dust (0.01-0.13 Tg yr(-1)) and icebergs (0.06-0.12 Tg yr(-1)) are comparable. Increases in iceberg production thus have the capacity to increase productivity and this newly identified negative feedback may help to mitigate fossil fuel emissions.

  20. Numerical simulations of highly buoyant flows in the Castel Giorgio - Torre Alfina deep geothermal reservoir

    NASA Astrophysics Data System (ADS)

    Volpi, Giorgio; Crosta, Giovanni B.; Colucci, Francesca; Fischer, Thomas; Magri, Fabien

    2017-04-01

    Geothermal heat is a viable source of energy and its environmental impact in terms of CO2 emissions is significantly lower than that of conventional fossil fuels. However, its present utilization is inconsistent with the enormous amount of energy available underneath the surface of the earth. This is mainly due to the uncertainties associated with it, such as the lack of appropriate computational tools necessary to perform effective analyses. The aim of the present study is to build an accurate 3D numerical model to simulate the exploitation process of the deep geothermal reservoir of Castel Giorgio - Torre Alfina (central Italy), and to compare the results and performance of parallel simulations performed with TOUGH2 (Pruess et al. 1999), FEFLOW (Diersch 2014) and the open source software OpenGeoSys (Kolditz et al. 2012). Detailed geological, structural and hydrogeological data, available for the selected area since the early 1970s, show that Castel Giorgio - Torre Alfina is a potential geothermal reservoir with high thermal characteristics (120-150 °C) and fluids such as pressurized water and gas, mainly CO2, hosted in a carbonate formation. Our two-step simulations first recreate the undisturbed natural state of the system and then perform a predictive analysis of the industrial exploitation process. All three codes showed strong numerical accuracy, which was verified by comparing the simulated and measured temperature and pressure values of the geothermal wells in the area. The results of our simulations demonstrate the sustainability of the investigated geothermal field for the development of a 5 MW pilot plant with total fluid reinjection into the original formation. From the thermal point of view, a very efficient buoyant circulation inside the geothermal system has been observed, allowing the reservoir to support the hypothesis of a 50-year production period with a flow rate of 1050 t/h. 
Furthermore, with the modeled distances our simulations showed no interference effects between the production and re-injection wells. Besides providing valuable guidelines for future exploitation of the Castel Giorgio - Torre Alfina deep geothermal reservoir, this example also highlights the large applicability and the high performance of the OpenGeoSys open-source code in handling coupled hydro-thermal simulations. REFERENCES Diersch, H. J. (2014). FEFLOW Finite Element Modeling of Flow, Mass and Heat Transport in Porous and Fractured Media, Springer-Verlag Berlin Heidelberg, ISBN 978-3-642-38738-8. Kolditz, O., Bauer, S., Bilke, L., Böttcher, N., Delfs, J. O., Fischer, T., U. J. Görke, T. Kalbacher, G. Kosakowski, McDermott, C. I., Park, C. H., Radu, F., Rink, K., Shao, H., Shao, H.B., Sun, F., Sun, Y., Sun, A., Singh, K., Taron, J., Walther, M., Wang,W., Watanabe, N., Wu, Y., Xie, M., Xu, W., Zehner, B. (2012). OpenGeoSys: an open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THM/C) processes in porous media. Environmental Earth Sciences, 67(2), 589-599. Pruess, K., Oldenburg, C. M., & Moridis, G. J. (1999). TOUGH2 user's guide version 2. Lawrence Berkeley National Laboratory.

  1. A product lifecycle management framework to support the exchange of prototyping and testing information

    NASA Astrophysics Data System (ADS)

    Toche Fumchum, Luc Boris

    2011-12-01

    The modern perspective on the product life cycle and the rapid evolution of Information and Communication Technologies have opened a new era in product representation and product information sharing between participants, both inside and outside the enterprise and throughout the product life. In particular, the Product Development Process relies on cross-functional activities involving different domains of expertise, each with its own dedicated tools. This has generated new challenges in terms of collaboration and dissemination of information between companies, and even within the same organization. Within this context, the work reported herein focuses on a specific stakeholder within product development activities - the prototyping and testing department. Its business is typically related to the planning and building of prototypes in order to perform specific tests on the future product or one of its sub-assemblies. The research project aims at investigating an appropriate framework that leverages configured engineering product information, based on complementary information structures, to share and exchange prototyping and testing information from a Product Lifecycle Management (PLM) perspective. As a first step, a case study based on the retrofit of an aircraft engine is deployed to implement a scenario demonstrating the functionalities to be available within the intended framework. For this purpose, complementary and configurable structures are simulated within the project's PLM system. The second step considers software interoperability issues, which affect not only Design - Testing interactions but many other interfaces, whether within the company - due to its silo arrangement - or across consortiums with partners, in which case the PLM platforms could simply be incompatible. 
A study based on an open source initiative and relying on an improved model of communication is described to show how two natively disparate PLM tools can dialogue to merge information in a central environment. The principles applied in both steps are therefore transposed to introduce the Open Exchange Nest as a generic PLM-driven and web-based concept to support the collaborative work in the aforementioned context.

  2. Open Source Hbim for Cultural Heritage: a Project Proposal

    NASA Astrophysics Data System (ADS)

    Diara, F.; Rinaudo, F.

    2018-05-01

    Current technologies are changing the ways Cultural Heritage is researched, analysed, conserved and developed, allowing new and innovative approaches. The possibility of integrating Cultural Heritage data, such as archaeological information, inside a three-dimensional environment system (like Building Information Modelling) brings huge benefits for its management, monitoring and valorisation. Nowadays there are many commercial BIM solutions. However, these tools are conceived and developed mostly for architectural design or technical installations. A better solution could be a dynamic and open platform that treats Cultural Heritage needs as a priority. Better and more complete data usability and accessibility could be guaranteed by open source protocols. This choice would allow adapting the software to Cultural Heritage needs and not the opposite, thus avoiding methodological stretches. This work focuses on the analysis of, and experiments with, the specific characteristics of these kinds of open source software (DBMS, CAD, servers) applied to a Cultural Heritage example, in order to verify their flexibility and reliability and then to create a dynamic open source HBIM prototype. Indeed, it could be a starting point for the future creation of a complete open source HBIM solution that could be adapted to other Cultural Heritage research and analyses.

  3. Open Source Clinical NLP - More than Any Single System.

    PubMed

    Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang

    2014-01-01

    The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.

  4. The Use of Open Source Software in the Global Land Ice Measurements From Space (GLIMS) Project, and the Relevance to Institutional Cooperation

    Treesearch

    Christopher W. Helm

    2006-01-01

    GLIMS is a NASA funded project that utilizes Open-Source Software to achieve its goal of creating a globally complete inventory of glaciers. The participation of many international institutions and the development of on-line mapping applications to provide access to glacial data have both been enhanced by Open-Source GIS capabilities and play a crucial role in the...

  5. Meteorological Error Budget Using Open Source Data

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7831, SEP 2016, US Army Research Laboratory. Meteorological Error Budget Using Open-Source Data, by J Cogan, J Smith, and P Haines.

  6. Open source bioimage informatics for cell biology

    PubMed Central

    Swedlow, Jason R.; Eliceiri, Kevin W.

    2009-01-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what make an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery. PMID:19833518

  7. Numerical Simulation of Dispersion from Urban Greenhouse Gas Sources

    NASA Astrophysics Data System (ADS)

    Nottrott, Anders; Tan, Sze; He, Yonggang; Winkler, Renato

    2017-04-01

    Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban-scale emissions estimates known as the Grey Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale-separation gap between source-level dynamics, local measurements, and urban-scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is open source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low-cost computer-aided drawing and GIS, OpenFOAM generates a detailed 3D representation of urban wind fields. OpenFOAM was applied to model scalar emissions from various components of the natural gas distribution system, to study the impact of urban meteorology on mobile greenhouse gas measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in the vertical dispersion of plumes, due to building wake effects. The OpenFOAM flow fields were combined with an inverse stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments. The Boussinesq approximation was applied to investigate the effects of canopy-layer temperature gradients and convection on sensor footprints.
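    For contrast with the CFD approach, the classical analytic baseline that building-resolving simulations improve upon is the Gaussian plume. A minimal sketch follows, with assumed dispersion coefficients (in practice sigma_y and sigma_z grow with downwind distance according to a stability-class table, which this sketch omits):

    ```python
    import math

    def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
        """Steady-state Gaussian plume concentration (kg/m^3) at crosswind
        offset y and height z, for a point source of strength q (kg/s) at
        height h in wind speed u (m/s), with ground reflection."""
        coef = q / (2.0 * math.pi * u * sigma_y * sigma_z)
        lateral = math.exp(-y * y / (2.0 * sigma_y ** 2))
        vertical = (math.exp(-((z - h) ** 2) / (2.0 * sigma_z ** 2))
                    + math.exp(-((z + h) ** 2) / (2.0 * sigma_z ** 2)))
        return coef * lateral * vertical

    # A small CH4 leak (1e-4 kg/s) observed at ground level on the plume
    # centerline; the sigma values are illustrative assumptions.
    c = gaussian_plume(q=1e-4, u=2.0, y=0.0, z=0.0, h=1.0,
                       sigma_y=8.0, sigma_z=5.0)
    print(f"{c:.2e} kg/m^3")  # 3.90e-07 kg/m^3
    ```

    A uniform-wind plume like this cannot reproduce the building-wake differences between sources 5-10 m apart reported in the abstract, which is precisely the motivation for the OpenFOAM simulations.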

  8. Building a Snow Data System on the Apache OODT Open Technology Stack

    NASA Astrophysics Data System (ADS)

    Goodale, C. E.; Painter, T. H.; Mattmann, C. A.; Hart, A. F.; Ramirez, P.; Zimdars, P.; Bryant, A. C.; Snow Data System Team

    2011-12-01

    Snow cover and its melt dominate regional climate and hydrology in many of the world's mountainous regions. One-sixth of Earth's population depends on snow- or glacier-melt for water resources. Operationally, seasonal forecasts of snowmelt-generated streamflow are leveraged through empirical relations based on past snowmelt periods. These historical data show that climate is changing, but the changes reduce the reliability of the empirical relations. Therefore optimal future management of snowmelt derived water resources will require explicit physical models driven by remotely sensed snow property data. Toward this goal, the Snow Optics Laboratory at the Jet Propulsion Laboratory has initiated a near real-time processing pipeline to generate and publish post-processed snow data products within a few hours of satellite acquisition. To solve this challenge, a Scientific Data Management and Processing System was required and the JPL Team leveraged an open-source project called Object Oriented Data Technology (OODT). OODT was developed within NASA's Jet Propulsion Laboratory across the last 10 years. OODT has supported various scientific data management and processing projects, providing solutions in the Earth, Planetary, and Medical science fields. It became apparent that the project needed to be opened to a larger audience to foster and promote growth and adoption. OODT was open-sourced at the Apache Software Foundation in November 2010 and has a growing community of users and committers that are constantly improving the software. Leveraging OODT, the JPL Snow Data System (SnowDS) Team was able to install and configure a core Data Management System (DMS) that would download MODIS raw data files and archive the products in a local repository for post processing. The team has since built an online data portal, and an algorithm-processing pipeline using the Apache OODT software as the foundation. 
We will present the working SnowDS system with its core remote sensing components: the MODIS Snow Covered Area and Grain size model (MODSCAG) and the MODIS Dust Radiative Forcing in Snow (MOD-DRFS). These products will be delivered in near real time to water managers and the broader cryosphere and climate community beginning in Winter 2012. We will then present the challenges and opportunities we see in the future as the SnowDS matures and contributions are made back to the OODT project.

  9. Analysis of vehicular traffic flow in the major areas of Kuala Lumpur utilizing open-traffic

    NASA Astrophysics Data System (ADS)

    Manogaran, Saargunawathy; Ali, Muhammad; Yusof, Kamaludin Mohamad; Suhaili, Ramdhan

    2017-09-01

Vehicular traffic congestion occurs when too many drivers crowd the road and traffic flow no longer runs smoothly. Congestion causes chaos on the road and interrupts users' daily activities. Time lost on the road has many negative effects on productivity, social behavior, the environment, and the economy. Congestion worsens and leads to havoc during emergencies such as floods, accidents, and road maintenance, when the behavior of traffic flow is unpredictable and uncontrollable. Real-time and historical traffic data are critical inputs for most traffic flow analysis applications. Researchers attempt to predict traffic using simulations, as no exact model of traffic flow exists due to its high complexity. Open Traffic is an open source platform for traffic data analysis linked to OpenStreetMap (OSM). This research aimed to study and understand the Open Traffic platform. The real-time traffic flow pattern in the Kuala Lumpur area was successfully extracted and analyzed using Open Traffic. Congestion was observed on every major road in Kuala Lumpur, mostly around offices and economic and commercial centers during rush hours. On some roads congestion occurs at night due to tourism activities.

  10. Openness to Using Non-cigarette Tobacco Products Among U.S. Young Adults

    PubMed Central

    Mays, Darren; Arrazola, René A.; Tworek, Cindy; Rolle, Italia V.; Neff, Linda J.; Portnoy, David B.

    2017-01-01

Introduction: National data indicate that the prevalence of non-cigarette tobacco product use is highest among young adults; however, little is known about their openness to use these products in the future and associated risk factors. This study sought to characterize openness to using non-cigarette tobacco products and associated factors among U.S. young adults. Methods: In 2014, National Adult Tobacco Survey data (2012–2013) were analyzed to characterize openness to using the following tobacco products among all young adults aged 18–29 years (N=5,985): cigars; electronic cigarettes (“e-cigarettes”); hookah; pipe tobacco; chew, snuff, or dip; snus; and dissolvables. Among those who were not current users of each product, multivariable logistic regression was used to examine associations between demographics, cigarette smoking status, lifetime use of other non-cigarette products, perceived harm and addictiveness of smoking, and receipt of tobacco industry promotions and openness to using each product. Results: Among all young adults, openness to using non-cigarette tobacco products was greatest for hookah (28.2%); e-cigarettes (25.5%); and cigars (19.1%). In multivariable analyses, which included non-current users of each product, non-current ever, current, and former smokers were more likely than never smokers to be open to using most examined products, as were men and adults aged 18–24 years. Receipt of tobacco industry promotions was associated with openness to using e-cigarettes; chew, snuff, or dip; and snus. Conclusions: There is substantial openness to trying non-cigarette tobacco products among U.S. young adults. Young adults are an important population to consider for interventions targeting non-cigarette tobacco product use. PMID:26549502
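The multivariable logistic regression used here models the log-odds of being open to a product as a linear function of covariates such as smoking status. A minimal sketch of the idea on hypothetical toy data (not the survey data, and far simpler than the weighted, multi-covariate analysis the authors ran):

```python
import math

def fit_logistic(xs, ys, lr=0.5, steps=20000):
    """Fit P(open=1) = 1/(1+exp(-(a + b*x))) by gradient ascent on the
    log-likelihood; returns (intercept a, slope b)."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad_a = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            grad_a += (y - p) / n
            grad_b += (y - p) * x / n
        a += lr * grad_a
        b += lr * grad_b
    return a, b

# Hypothetical respondents: x = 1 if ever-smoker, y = 1 if open to the product.
x = [0, 0, 0, 0, 1, 1, 1, 1]
y = [0, 0, 1, 0, 1, 1, 0, 1]
a, b = fit_logistic(x, y)
odds_ratio = math.exp(b)  # > 1 means ever-smokers more likely to be open
```

With this toy data the fitted odds ratio exceeds 1, mirroring in miniature the paper's finding that ever-smokers are more likely than never smokers to be open to most products.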

  11. Openness to Using Non-cigarette Tobacco Products Among U.S. Young Adults.

    PubMed

    Mays, Darren; Arrazola, René A; Tworek, Cindy; Rolle, Italia V; Neff, Linda J; Portnoy, David B

    2016-04-01

    National data indicate that the prevalence of non-cigarette tobacco product use is highest among young adults; however, little is known about their openness to use these products in the future and associated risk factors. This study sought to characterize openness to using non-cigarette tobacco products and associated factors among U.S. young adults. In 2014, National Adult Tobacco Survey data (2012-2013) were analyzed to characterize openness to using the following tobacco products among all young adults aged 18-29 years (N=5,985): cigars; electronic cigarettes ("e-cigarettes"); hookah; pipe tobacco; chew, snuff, or dip; snus; and dissolvables. Among those who were not current users of each product, multivariable logistic regression was used to examine associations between demographics, cigarette smoking status, lifetime use of other non-cigarette products, perceived harm and addictiveness of smoking, and receipt of tobacco industry promotions and openness to using each product. Among all young adults, openness to using non-cigarette tobacco products was greatest for hookah (28.2%); e-cigarettes (25.5%); and cigars (19.1%). In multivariable analyses, which included non-current users of each product, non-current ever, current, and former smokers were more likely than never smokers to be open to using most examined products, as were men and adults aged 18-24 years. Receipt of tobacco industry promotions was associated with openness to using e-cigarettes; chew, snuff, or dip; and snus. There is substantial openness to trying non-cigarette tobacco products among U.S. young adults. Young adults are an important population to consider for interventions targeting non-cigarette tobacco product use. Published by Elsevier Inc.

  12. Practical guide: Tools and methodologies for an oil and gas industry emission inventory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, C.C.; Killian, T.L.

    1996-12-31

During the preparation of Title V Permit applications, the quantification and speciation of emission sources from oil and gas facilities were reevaluated to determine the "potential-to-emit." The existing emissions were primarily based on EPA emission factors such as AP-42, for tanks, combustion sources, and fugitive emissions from component leaks. Emissions from insignificant activities and routine operations that are associated with maintenance, startups and shutdowns, and releases to control devices also required quantification. To reconcile EPA emission factors with test data, process knowledge, and manufacturer's data, a careful review of other estimation options was performed. This paper represents the results of this analysis of emission sources at oil and gas facilities, including exploration and production, compressor stations and gas plants.

  13. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  14. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  15. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  16. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  17. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...

  18. Soil and nutrient retention in winter-flooded ricefields with implications for watershed management

    USGS Publications Warehouse

    Manley, S.W.; Kaminski, R.M.; Rodrigue, P.B.; Dewey, J.C.; Schoenholtz, S.H.; Gerard, P.D.; Reinecke, K.J.

    2009-01-01

The ability of water resources to support aquatic life and human needs depends, in part, on reducing nonpoint source pollution amid contemporary agricultural practices. Winter retention of shallow water on rice and other agricultural fields is an accepted management practice for wildlife conservation; however, soil and water conservation benefits are not well documented. We evaluated the ability of four post-harvest ricefield treatment combinations (stubble-flooded, stubble-open, disked-flooded and disked-open) to abate nonpoint source exports into watersheds of the Mississippi Alluvial Valley. Total suspended solid exports were 1,121 kg ha-1 (1,000 lb ac-1) from disked-open fields where rice stubble was disked after harvest and fields were allowed to drain, compared with 35 kg ha-1 (31 lb ac-1) from stubble-flooded fields where stubble was left standing after harvest and fields captured rainfall from November 1 to March 1. Estimates of total suspended solid exports from ricefields based on Landsat imagery and USDA crop data are 0.43 and 0.40 Mg km-2 day-1 in the Big Sunflower and L'Anguille watersheds, respectively. Estimated reductions in total suspended solid exports from ricefields into the Big Sunflower and L'Anguille watersheds range from 26% to 64% under hypothetical scenarios in which 65% to 100% of the rice production area is managed to capture winter rainfall. Winter ricefield management reduced nonpoint source export by decreasing concentrations of solids and nutrients in, and reducing runoff volume from, ricefields in the Mississippi Alluvial Valley.
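The watershed-scale scenario estimates combine per-hectare exports under each treatment with the fraction of rice area enrolled. A simplified version of that bookkeeping, using the per-field exports reported above (the paper's actual scenarios account for a mix of baseline treatments, so its percentages differ):

```python
def export_reduction(adoption, e_open=1121.0, e_flooded=35.0):
    """Fractional reduction in total suspended solid export when a given
    fraction of the rice area switches from disked-open management
    (e_open kg/ha) to stubble-flooded management (e_flooded kg/ha).
    Simplified bookkeeping, not the paper's scenario model."""
    baseline = e_open
    managed = adoption * e_flooded + (1.0 - adoption) * e_open
    return 1.0 - managed / baseline

full = export_reduction(1.0)      # full adoption: export cut by ~97%
partial = export_reduction(0.65)  # 65% adoption
```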

  19. Sharing Lessons-Learned on Effective Open Data, Open-Source Practices from OpenAQ, a Global Open Air Quality Community.

    NASA Astrophysics Data System (ADS)

    Hasenkopf, C. A.

    2017-12-01

    Increasingly, open data, open-source projects are unearthing rich datasets and tools, previously impossible for more traditional avenues to generate. These projects are possible, in part, because of the emergence of online collaborative and code-sharing tools, decreasing costs of cloud-based services to fetch, store, and serve data, and increasing interest of individuals to contribute their time and skills to 'open projects.' While such projects have generated palpable enthusiasm from many sectors, many of these projects face uncharted paths for sustainability, visibility, and acceptance. Our project, OpenAQ, is an example of an open-source, open data community that is currently forging its own uncharted path. OpenAQ is an open air quality data platform that aggregates and universally formats government and research-grade air quality data from 50 countries across the world. To date, we make available more than 76 million air quality (PM2.5, PM10, SO2, NO2, O3, CO and black carbon) data points through an open Application Programming Interface (API) and a user-customizable download interface at https://openaq.org. The goal of the platform is to enable an ecosystem of users to advance air pollution efforts from science to policy to the private sector. The platform is also an open-source project (https://github.com/openaq) and has only been made possible through the coding and data contributions of individuals around the world. In our first two years of existence, we have seen requests for data to our API skyrocket to more than 6 million datapoints per month, and use-cases as varied as ingesting data aggregated from our system into real-time models of wildfires to building open-source statistical packages (e.g. ropenaq and py-openaq) on top of the platform to creating public-friendly apps and chatbots. 
We will give a whirlwind tour of our evolution and the many lessons learned so far related to platform structure, community engagement, organizational model, and sustainability.
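The aggregation step that "universally formats" heterogeneous government feeds can be pictured as a per-source adapter that maps each feed onto one schema. The field names below are hypothetical illustrations, not the actual OpenAQ schema:

```python
def normalize(raw: dict, source: str) -> dict:
    """Coerce one source-specific air-quality record into a single
    universal shape (hypothetical field names, not the real OpenAQ API)."""
    if source == "agency_a":
        # e.g. {"pm25": 81, "time": "2017-12-01T06:00:00Z"}
        return {"parameter": "pm25",
                "value": float(raw["pm25"]),
                "unit": "ug/m3",
                "utc": raw["time"]}
    if source == "agency_b":
        # e.g. {"pollutant": "PM2.5", "conc_ugm3": "81.0", "measured_at": ...}
        names = {"PM2.5": "pm25", "PM10": "pm10", "O3": "o3"}
        return {"parameter": names[raw["pollutant"]],
                "value": float(raw["conc_ugm3"]),
                "unit": "ug/m3",
                "utc": raw["measured_at"]}
    raise ValueError(f"unknown source: {source}")
```

Once every feed passes through such an adapter, a single API and download interface can serve all 50 countries' data uniformly.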

  20. Comprehensive Routing Security Development and Deployment for the Internet

    DTIC Science & Technology

    2015-02-01

feature enhancement and bug fixes. • MySQL : MySQL is a widely used and popular open source database package. It was chosen for database support in the...RPSTIR depends on several other open source packages. • MySQL : MySQL is used for the local RPKI database cache. • OpenSSL: OpenSSL is used for...cryptographic libraries for X.509 certificates. • ODBC mySql Connector: ODBC (Open Database Connectivity) is a standard programming interface (API) for

  1. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. 
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  2. Multi-INT fusion to support port and harbor security and general maritime awareness

    NASA Astrophysics Data System (ADS)

    Von Kahle, Louis; Alexander, Robert

    2006-05-01

The international community's focus on deterring terrorism has identified many vulnerabilities to a country's borders. These vulnerabilities include not only airports and rail lines but also the ports, harbors and miles of coastline which many countries must protect. In seeking to address this challenge, many technologies, processes and procedures have been identified that utilize single point or single source INTs (i.e., sources of intelligence - signals: SIGINT, imagery: IMINT, and open-source: INTERNET). These single source data sets include the information gleaned from shipping lines, port arrival and departure information and information from shipboard based electronic systems like the Automatic Identification System (AIS). Typically these are evaluated and incorporated into products or decisions in a singular manner and not with any reference or relationship to each other. In this work, an identification and analysis of these data sets will be performed in order to determine:
• Any commonality between these data sets,
• The ability to fuse information between these data sets,
• The ability to determine relationships between these data sets, and
• The ability to present any fused information or relationships in a timely manner.
In summary, the work served as a means for determining the data sets that were of the highest value and for determining the fusion method for producing a product of value. More work can be done to define the data sets that have the most commonality and thus will help to produce a fused product in the most timely and efficient manner.
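One concrete form of the fusion step above is record linkage on a shared vessel identifier (AIS reports carry an MMSI number). A minimal sketch with hypothetical record layouts:

```python
def fuse(ais_tracks, port_calls, key="mmsi"):
    """Join AIS position reports with port arrival/departure records on a
    shared vessel identifier; vessels seen in only one source are flagged
    for follow-up rather than silently dropped."""
    by_id = {}
    for rec in ais_tracks:
        by_id.setdefault(rec[key], {})["ais"] = rec
    for rec in port_calls:
        by_id.setdefault(rec[key], {})["port"] = rec
    fused = {k: v for k, v in by_id.items() if "ais" in v and "port" in v}
    unmatched = sorted(k for k, v in by_id.items() if len(v) == 1)
    return fused, unmatched
```

Real maritime fusion must also handle spoofed or missing identifiers and fuzzy matches on position and time, which is where the relationship analysis described above comes in.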

  3. Light use efficiency for vegetables production in protected and indoor environments

    NASA Astrophysics Data System (ADS)

    Cocetta, Giacomo; Casciani, Daria; Bulgari, Roberta; Musante, Fulvio; Kołton, Anna; Rossi, Maurizio; Ferrante, Antonio

    2017-01-01

In recent years, there has been growing interest in vegetable production indoors or in disadvantaged climatic zones using greenhouses. The main problem of growing crops indoors or in environments with limited light availability is the correct choice of light source and the quality of the lighting spectrum. In greenhouse and indoor cultivation, plant density is higher than in the open field and plants have to compete for light and nutrients. Nowadays, advanced systems for indoor horticulture use light-emitting diodes (LEDs) to improve crop growth, enhance plant productivity and favour the formation of the best nutritional quality. In closed environments, such as indoor growing modules, the lighting system is the only source of light, and its features are fundamental for obtaining the best lighting performance for plants and the most efficient solution. LED lighting engines are more efficient than the lighting sources traditionally used in horticulture and allow modulation of light spectrum and intensity to enhance the light use efficiency of plants. The lighting distribution and digital controls are fundamental for tailoring the spectral distribution to each plant at specific moments of its growth, and play an important role in optimizing growth and producing high-quality vegetables. LED lights can increase plant growth and yield, but also nutraceutical quality, since some light intensities increase pigment biosynthesis and enhance the antioxidant content of leaves or fruits; in this regard, the selection of LED primary light sources in relation to the peaks of the absorbance curve of the plants is important.

  4. GIS-Based Noise Simulation Open Source Software: N-GNOIS

    NASA Astrophysics Data System (ADS)

    Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh

    2015-12-01

Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate the noise scenario due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These have been addressed in the software through attenuation modules for atmosphere, vegetation and barriers. N-GNOIS is a user-friendly, platform-independent and Open Geospatial Consortium (OGC)-compliant software. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features such as the cumulative impact of point and mobile sources, building structures, and honking due to traffic. Honking is a common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise, and acts as a decision-making tool for planning and management of the noise component in environmental impact assessment (EIA) studies.
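A point-source module of the kind described typically starts from spherical spreading and then subtracts attenuation terms. A minimal sketch of that first step (the linear absorption coefficient here is a hypothetical stand-in for the software's atmosphere/vegetation/barrier modules):

```python
import math

def spl_at_distance(lw_db, r_m, alpha_db_per_m=0.005):
    """Sound pressure level (dB) at distance r_m from a point source with
    sound power level lw_db: spherical spreading (20*log10(r) + 11 dB)
    minus a simple linear atmospheric-absorption term."""
    if r_m <= 0:
        raise ValueError("distance must be positive")
    return lw_db - 20.0 * math.log10(r_m) - 11.0 - alpha_db_per_m * r_m
```

Before absorption, each doubling of distance costs about 6 dB; a GIS layer then evaluates this at every receptor cell and sums contributions from all point and mobile sources.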

  5. Utilization of open source electronic health record around the world: A systematic review

    PubMed Central

    Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam

    2014-01-01

Many projects on developing Electronic Health Record (EHR) systems have been carried out in many countries. The current study was conducted to review the published data on the utilization of open source EHR systems in different countries around the world. Using free-text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in a series of stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. This creates opportunities to improve national healthcare, especially in developing countries with minimal financial resources. Open source technology is a solution to overcome the problems of high cost and inflexibility associated with proprietary health information systems. PMID:24672566

  6. Characterization of emission factors related to source activity for trichloroethylene degreasing and chrome plating processes.

    PubMed

    Wadden, R A; Hawkins, J L; Scheff, P A; Franke, J E

    1991-09-01

A study at an automotive parts fabrication plant evaluated four metal surface treatment processes during production conditions. The evaluation provides examples of how to estimate process emission factors from activity and air concentration data. The processes were open tank and enclosed tank degreasing with trichloroethylene (TCE), chromium conversion coating, and chromium electroplating. Area concentrations of TCE and chromium (Cr) were monitored for 1-hr periods at three distances from each process. Source activities at each process were recorded during each sampling interval. Emission rates were determined by applying appropriate mass balance models to the concentration patterns around each source. The emission factors obtained from regression analysis of the emission rate and activity data were 16.9 g TCE/basket of parts for the open-top degreaser; 1.0 g TCE/1000 parts for the enclosed degreaser; 1.48-1.64 mg Cr/1000 parts processed in the hot CrO3/HNO3 tank for the chrome conversion coating; and 5.35-9.17 mg Cr/rack of parts for chrome electroplating. The factors were also used to determine the efficiency of collection for the local exhaust systems serving each process. Although the number of observations was limited, these factors may be useful for providing initial estimates of emissions from similar processes in other settings.
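The regression step, relating measured emission rates to recorded source activity, reduces in the simplest case to a through-the-origin least-squares slope. The observations below are hypothetical, chosen only to land near the 16.9 g/basket factor reported for the open-top degreaser:

```python
def emission_factor(activities, rates):
    """Least-squares slope through the origin of emission rate (g/hr)
    against activity (baskets/hr): the per-basket emission factor.
    Toy illustration; the study's factors came from its monitored data."""
    num = sum(a * r for a, r in zip(activities, rates))
    den = sum(a * a for a in activities)
    return num / den

# hypothetical 1-hr observations: baskets degreased vs. measured TCE release (g)
baskets = [2, 4, 5, 3]
grams = [34.1, 66.9, 85.2, 50.5]
ef = emission_factor(baskets, grams)  # ~16.9 g TCE per basket
```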

  7. The HYPE Open Source Community

    NASA Astrophysics Data System (ADS)

    Strömbäck, L.; Pers, C.; Isberg, K.; Nyström, K.; Arheimer, B.

    2013-12-01

    The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model. It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided in up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes) considering turn-over and transformation on the way towards the sea. HYPE has been successfully used in many hydrological applications at SMHI. For Europe, we currently have three different models; The S-HYPE model for Sweden; The BALT-HYPE model for the Baltic Sea; and the E-HYPE model for the whole Europe. These models simulate hydrological conditions and nutrients for their respective areas and are used for characterization, forecasts, and scenario analyses. Model data can be downloaded from hypeweb.smhi.se. In addition, we provide models for the Arctic region, the Arab (Middle East and Northern Africa) region, India, the Niger River basin, the La Plata Basin. This demonstrates the applicability of the HYPE model for large scale modeling in different regions of the world. An important goal with our work is to make our data and tools available as open data and services. For this aim we created the HYPE Open Source Community (OSC) that makes the source code of HYPE available for anyone interested in further development of HYPE. The HYPE OSC (hype.sourceforge.net) is an open source initiative under the Lesser GNU Public License taken by SMHI to strengthen international collaboration in hydrological modeling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. 
The code is transparent and can be changed and learnt from. New versions of the main code are delivered frequently. HYPE OSC is open to everyone interested in hydrology, hydrological modeling and code development - e.g. scientists, authorities, and consultancies. By joining the HYPE OSC you get access to a state-of-the-art operational hydrological model. The HYPE source code is designed to efficiently handle large scale modeling for forecast, hindcast and climate applications. The code is under constant development to improve the hydrological processes, efficiency and readability. In the beginning of 2013 we released a version with new and better modularization based on hydrological processes. This will make the code easier to understand and further develop for a new user. An important challenge in this process is to produce code that is easy for anyone to understand and work with, but still maintain the properties that make the code efficient enough for large scale applications. Input from the HYPE Open Source Community is an important source for future improvements of the HYPE model. Therefore, by joining the community you become an active part of the development, get access to the latest features and can influence future versions of the model.
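The layered soil storages that HYPE routes water through can be caricatured as stacked buckets; the single-bucket step below shows the basic fill-spill-recede bookkeeping for one time step. It is a toy analogue with hypothetical parameters, not actual HYPE (Fortran) code:

```python
def soil_bucket_step(storage, precip, capacity, recession=0.1):
    """One daily step of a single soil storage (all values in mm):
    add precipitation, spill any saturation excess above capacity,
    then release a linear-reservoir outflow. Returns (new storage,
    total runoff generated this step)."""
    storage += precip
    excess = max(0.0, storage - capacity)  # saturation excess spills
    storage -= excess
    outflow = recession * storage          # linear-reservoir drainage
    storage -= outflow
    return storage, excess + outflow
```

In a semi-distributed model this step runs per land class and soil layer, with the outflow routed on through groundwater, streams and lakes as described above.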

  8. Bioclipse: an open source workbench for chemo- and bioinformatics.

    PubMed

    Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S

    2007-02-22

    There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is totally open for both open source plugins as well as commercial ones. Bioclipse is freely available at http://www.bioclipse.net.

  9. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  10. Novel approach for extinguishing large-scale coal fires using gas-liquid foams in open pit mines.

    PubMed

    Lu, Xinxiao; Wang, Deming; Qin, Botao; Tian, Fuchao; Shi, Guangyi; Dong, Shuaijun

    2015-12-01

    Coal fires are a serious threat to worker safety and safe production in open pit mines. Coal fire sources are hidden and numerous, and large cavities are prevalent in the coal seam after the coal has burned, making conventional extinguishing technologies difficult to apply. Foams are considered an efficient means of fire extinguishment in these large-scale workplaces. A novel foam preparation method is introduced, and an original cavitation jet device is designed to add foaming agent stably. Jet cavitation occurs when the water flow rate and pressure ratio reach specified values. Using a self-built foaming system, high-performance foams are produced and then infused into the blast drilling holes at a high flow rate. Requiring no complicated operation, this system is found to be very suitable for extinguishing large-scale coal fires. Field application shows that foam generated with the proposed key technology achieves a good extinguishing effect. The temperature reduction using foams is 6-7 times greater than with water, and the CO concentration in the drilling hole is reduced from 9.43 to 0.092‰. The coal fires are controlled successfully in open pit mines, ensuring normal production as well as the security of personnel and equipment.

  11. Life cycle of PCBs and contamination of the environment and of food products from animal origin.

    PubMed

    Weber, Roland; Herold, Christine; Hollert, Henner; Kamphues, Josef; Ungemach, Linda; Blepp, Markus; Ballschmiter, Karlheinz

    2018-06-01

    This report summarises the historic use, former management and current release of polychlorinated biphenyls (PCBs) in Germany and assesses the impact of the PCB life cycle on the contamination of the environment and of food products of animal origin. In Germany, 60,000 t of PCBs were used in transformers, capacitors or as hydraulic oils. The use of PCB oils in these "closed applications" was banned in Germany in 2000. Thirty to 50% of these PCBs were not appropriately managed. In West Germany, 24,000 t of PCBs were used in open applications, mainly as additives (plasticisers, flame retardants) in sealants and paints in buildings and other constructions. The continued use in open applications has not been banned, and in 2013 an estimated 12,000 t or more of PCBs were still present in buildings and other constructions. These open PCB applications continuously emit PCBs into the environment, with an estimated release of 7-12 t per year. This amount is in agreement with deposition measurements (estimated at 18 t) and emission estimates for Switzerland. The atmospheric PCB releases still have a relevant impact on vegetation and livestock feed. In addition, PCBs in open applications on farms are still a source of contamination for farmed animals. Furthermore, the historic production, use, recycling and disposal of PCBs have contaminated soils along the life cycle. This legacy of contaminated soils and contaminated feed, individually or collectively, can lead to exceedance of maximum levels in food products from animals. For beef and chicken, soil levels of 5 ng PCB-TEQ/kg, and for chickens with high soil exposure even 2 ng PCB-TEQ/kg, can lead to exceedance of EU limits in meat and eggs. Areas at and around industries that produced, used or managed PCBs, as well as facilities and areas where PCBs were disposed of, need to be assessed with respect to potential contamination of food-producing animals. For a large share of impacted land, management measures applicable at farm level might be sufficient to continue food production. Open PCB applications need to be inventoried and better managed. Other persistent and toxic chemicals used as alternatives to PCBs, e.g. short-chain chlorinated paraffins (SCCPs), should also be assessed across their life cycle for exposure of food-producing animals and humans.

  12. Improving the Product Documentation Process of a Small Software Company

    NASA Astrophysics Data System (ADS)

    Valtanen, Anu; Ahonen, Jarmo J.; Savolainen, Paula

    Documentation is an important part of the software process, even though it is often neglected in software companies. The eternal question is how much documentation is enough. In this article, we present a practical implementation of a lightweight product documentation process resulting from SPI efforts in a small company. Small companies’ financial and human resources are often limited. The documentation process described here offers a template for creating adequate documentation while consuming a minimal amount of resources. The key element of the documentation process is an open source web-based bug-tracking system that was customized for use as a documentation tool. The use of the tool enables iterative and well-structured documentation. The solution best serves the needs of a small company that offers off-the-shelf software products and is striving for SPI.
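    The paper customises an existing open source bug tracker rather than building one, but the underlying data model — each documentation unit tracked as an issue, with comments recording iterative revisions and an export step assembling the current state — can be sketched in a few lines. The class and field names here are illustrative assumptions, not the paper's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class DocItem:
    """One 'issue' in the tracker = one documentation unit."""
    item_id: int
    title: str
    revisions: list = field(default_factory=list)  # iterative updates, oldest first
    status: str = "open"                           # open -> closed (reviewed)

class DocTracker:
    """Minimal bug-tracker-as-documentation-tool sketch."""

    def __init__(self):
        self._items = {}
        self._next_id = 1

    def open_item(self, title, text):
        # Opening an issue creates a documentation unit with a first draft.
        item = DocItem(self._next_id, title, [text])
        self._items[item.item_id] = item
        self._next_id += 1
        return item.item_id

    def revise(self, item_id, text):
        # Each comment/revision is appended, preserving the full history.
        self._items[item_id].revisions.append(text)

    def close(self, item_id):
        # Closing an issue marks the documentation unit as reviewed.
        self._items[item_id].status = "closed"

    def export(self):
        # Assemble the latest revision of every closed item into a document.
        return "\n".join(
            "## %s\n%s" % (i.title, i.revisions[-1])
            for i in self._items.values() if i.status == "closed"
        )
```

    The design mirrors what makes a tracker attractive for documentation: revisions are cheap and auditable, and "done" has an explicit workflow state.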

  13. A user-friendly means to scale from the biochemistry of photosynthesis to whole crop canopies and production in time and space - development of Java WIMOVAC.

    PubMed

    Song, Qingfeng; Chen, Dairui; Long, Stephen P; Zhu, Xin-Guang

    2017-01-01

    Windows Intuitive Model of Vegetation response to Atmosphere and Climate Change (WIMOVAC) has been widely used as a generic, modular, mechanistically rich model of plant production. It can predict the responses of leaf and canopy carbon balance, as well as production, under different environmental conditions, in particular those relevant to global change. Here, we introduce an open source, user-friendly Java version of WIMOVAC. This software is platform independent and can be easily downloaded to a laptop and used without any prior programming skills. In this article, we describe its structure and equations, provide a user guide, and illustrate some potential applications of WIMOVAC. © 2016 The Authors. Plant, Cell & Environment published by John Wiley & Sons Ltd.
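    WIMOVAC's actual equation set is not reproduced in this abstract. As an illustration of the kind of leaf-to-canopy scaling such a model performs, the sketch below uses a standard non-rectangular hyperbola light response for leaf net assimilation and Beer-Lambert light attenuation for the canopy. All parameter values are generic textbook-style assumptions, not WIMOVAC's calibrated values.

```python
import math

def net_assimilation(ppfd, phi=0.05, a_max=25.0, theta=0.7, r_d=1.5):
    """Leaf net CO2 assimilation (umol m-2 s-1) from a standard
    non-rectangular hyperbola light response (illustrative parameters):
      ppfd   incident photosynthetic photon flux density (umol m-2 s-1)
      phi    apparent quantum yield
      a_max  light-saturated gross assimilation
      theta  curvature factor (0 < theta <= 1)
      r_d    dark respiration
    """
    s = phi * ppfd + a_max
    # Smaller root of: theta*A^2 - s*A + phi*ppfd*a_max = 0
    a_gross = (s - math.sqrt(s * s - 4.0 * theta * phi * ppfd * a_max)) / (2.0 * theta)
    return a_gross - r_d

def canopy_assimilation(ppfd_top, lai=3.0, k=0.5, layers=50):
    """Scale leaf assimilation to the canopy: Beer-Lambert attenuation
    of light with canopy depth, summed over increments of leaf area index."""
    d_lai = lai / layers
    total = 0.0
    for i in range(layers):
        depth = (i + 0.5) * d_lai
        ppfd = ppfd_top * math.exp(-k * depth)  # light at this canopy depth
        total += net_assimilation(ppfd) * d_lai
    return total
```

    At zero light the leaf respires (`-r_d`); assimilation rises with light and saturates below `a_max`, and the canopy total integrates this response over shaded lower layers.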

  14. The yeast stands alone: the future of protein biologic production.

    PubMed

    Love, Kerry R; Dalvie, Neil C; Love, J Christopher

    2017-12-22

    Yeasts are promising alternative hosts for manufacturing recombinant protein therapeutics because they simply and efficiently meet the needs of both platform and small-market drugs. Fast accumulation of biomass and low-cost media reduce the cost of goods when using yeast, which in turn can enable agile, small-volume manufacturing facilities. Small, tractable yeast genomes are amenable to rapid process development, facilitating strain and product quality by design. Specifically, Pichia pastoris is becoming a widely accepted yeast for biopharmaceutical manufacturing in much of the world, owing to a clean secreted product and a rapidly expanding understanding of its cell biology as a host organism. We advocate a near-term partnership spanning industry and academia to promote the open source, timely development of yeast hosts. Copyright © 2017. Published by Elsevier Ltd.

  15. Defending the Amazon: Conservation, Development and Security in Brazil

    DTIC Science & Technology

    2009-03-01

    Nelson Jobim, interview by Empresa Brasil de Comunicação Radio, trans. Open Source Center, February 6, 2009, available from http://www.ebc.com.br (accessed February 23, 2009).

  16. Open-Source web-based geographical information system for health exposure assessment

    PubMed Central

    2012-01-01

    This paper presents the design and development of an open source web-based Geographical Information System that allows users to visualise, customise and interact with spatial data within their web browser. The developed application shows that, using solely open source software, it was possible to build a customisable web-based GIS application that provides the functions necessary to convey health and environmental data to experts and non-experts alike, without requiring proprietary software. PMID:22233606
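    The paper does not specify its data interchange format, but a common way for an open source web GIS to deliver point data to the browser is GeoJSON, which open source clients such as OpenLayers or Leaflet render directly. The sketch below builds a GeoJSON FeatureCollection from exposure measurements; the attribute names (`pollutant`, `value`) are illustrative assumptions, not the paper's actual schema.

```python
import json

def exposure_feature_collection(points):
    """Serialise exposure measurements as a GeoJSON FeatureCollection.
    `points` is a list of (lon, lat, pollutant, value) tuples.
    Coordinates follow GeoJSON order: [longitude, latitude]."""
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"pollutant": pollutant, "value": value},
        }
        for lon, lat, pollutant, value in points
    ]
    return json.dumps({"type": "FeatureCollection", "features": features})
```

    A web map layer can then fetch this JSON and style each point by its `value` property, which is one way non-experts can explore exposure data without proprietary desktop GIS software.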

  17. Analysis of greenhouse gas emissions from 10 biogas plants within the agricultural sector.

    PubMed

    Liebetrau, J; Reinelt, T; Clemens, J; Hafermann, C; Friehe, J; Weiland, P

    2013-01-01

    With the increasing number of biogas plants in Germany, an exact determination of their actual effect on greenhouse gas emissions from energy production is gaining importance. Hitherto, life cycle assessments have been based on estimations of emissions from biogas plants. The lack of actual emission measurements has been addressed within a project, selected results of which are presented here. The data were obtained in a survey in which 10 biogas plants were each analysed over two measurement periods. The major methane emission sources identified were the open storage of digestates, ranging from 0.22 to 11.2% of the methane utilized, and the exhaust of the co-generation units, ranging from 0.40 to 3.28%. Relevant ammonia emissions were detected from the open digestate storage. The main source of nitrous oxide emissions was the co-generation unit. Regarding the potential of measures to reduce emissions, it is highly recommended to focus on the digestate storage and the exhaust of the co-generation units.

  18. CFHT data processing and calibration ESPaDOnS pipeline: Upena and OPERA (optical spectropolarimetry)

    NASA Astrophysics Data System (ADS)

    Martioli, Eder; Teeple, D.; Manset, Nadine

    2011-03-01

    CFHT is responsible for processing raw ESPaDOnS images, removing instrument-related artifacts, and delivering science-ready data to the PIs. Here we describe the Upena pipeline, the software used to reduce the echelle spectro-polarimetric data obtained with the ESPaDOnS instrument. Upena is an automated pipeline that performs calibration and reduction of raw images. It can perform both real-time, image-by-image reduction and a complete reduction after the observing night. Upena produces polarization and intensity spectra in FITS format. The pipeline is designed to perform parallel computing for improved speed, which ensures that final products are delivered to the PIs before noon HST after each night of observations. We also present the OPERA project, an open-source pipeline for reducing ESPaDOnS data that will be developed as a collaboration between CFHT and the scientific community. OPERA will match the core capabilities of Upena while being open-source, flexible and extensible.
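    The abstract does not expose Upena's internals, but the parallelisation it mentions follows a common pattern: each raw frame is calibrated independently (e.g. bias subtraction and flat-field division), so frames can be reduced concurrently. The sketch below is a generic, hypothetical illustration of that pattern, not Upena's implementation; frames are plain lists rather than FITS arrays.

```python
from concurrent.futures import ThreadPoolExecutor

def reduce_frame(frame, bias, flat):
    """Calibrate one raw frame: subtract the master bias, then divide
    by the normalised master flat, pixel by pixel."""
    return [(p - b) / f for p, b, f in zip(frame, bias, flat)]

def reduce_night(frames, bias, flat, workers=4):
    """Reduce all frames of an observing night concurrently. Each frame
    is independent, so the work parallelises trivially; a production
    pipeline would use processes (or array libraries releasing the GIL)
    for genuinely CPU-bound reduction rather than a thread pool."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda f: reduce_frame(f, bias, flat), frames))
```

    Because `pool.map` preserves input order, the reduced products come back in the same order as the raw frames, which keeps downstream bookkeeping (one spectrum per exposure) simple.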

  19. Beyond imperviousness: A statistical approach to identifying functional differences between development morphologies on variable source area-type response in urbanized watersheds

    NASA Astrophysics Data System (ADS)

    Lim, T. C.

    2016-12-01

    Empirical evidence has shown linkages between urbanization, hydrological regime change, and degradation of water quality and aquatic habitat. Percent imperviousness has long been suggested as the dominant source of these negative changes. However, recent research identifying alternative pathways of runoff production at the watershed scale has called into question percent impervious surface area's primacy in urban runoff production compared to other aspects of urbanization, including change in vegetative cover, imported water and water leakages, and the presence of drainage infrastructure. In this research I show how a robust statistical methodology can detect evidence of variable source area (VSA)-type hydrologic response associated with incremental hydraulic connectivity in watersheds. I then use logistic regression to explore how evidence of VSA-type response relates to the physical and meteorological characteristics of the watershed. I find that impervious surface area is highly correlated with development, but does not add significant explanatory power beyond percent developed in predicting VSA-type response. Other aspects of development morphology, including percent developed open space and type of drainage infrastructure, also do not add to the explanatory power of undeveloped land in predicting VSA-type response. Within developed areas only, the effect of developed open space was found to be more similar to that of total impervious area than to undeveloped land. These findings were consistent when tested across a national cross-section of urbanized watersheds, a higher-resolution dataset of Baltimore Metropolitan Area watersheds, and a subsample of watersheds confirmed not to be served by combined sewer systems. These findings suggest that land development policies focused on lot coverage should be revisited, and more focus should be placed on preserving native vegetation and soil conditions alongside development.
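    The question the study asks — does percent impervious add explanatory power beyond percent developed — is the kind of comparison a likelihood-ratio test between nested logistic regression models answers. The sketch below computes that test for one added predictor, given the two fitted models' log-likelihoods; the specific log-likelihood values in the usage example are hypothetical, not the paper's results.

```python
import math

def lr_test_1df(loglik_reduced, loglik_full):
    """Likelihood-ratio test for one added predictor (1 degree of
    freedom), e.g. adding percent impervious to a logistic model of
    VSA-type response that already contains percent developed.
    Returns (statistic, p_value).

    The statistic 2*(LL_full - LL_reduced) is asymptotically
    chi-square with 1 df; since a chi-square(1) variable is the
    square of a standard normal, its survival function is
    P(X > x) = erfc(sqrt(x / 2)).
    """
    stat = 2.0 * (loglik_full - loglik_reduced)
    p = math.erfc(math.sqrt(stat / 2.0)) if stat > 0 else 1.0
    return stat, p
```

    With hypothetical fitted log-likelihoods of -120.4 (percent developed only) and -120.1 (adding percent impervious), the statistic is 0.6 with p well above 0.05, i.e. no significant improvement, mirroring the paper's qualitative finding that imperviousness adds little beyond development itself.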

  20. From the epipelagic zone to the abyss: Trophic structure at two seamounts in the subtropical and tropical Eastern Atlantic - Part I zooplankton and micronekton

    NASA Astrophysics Data System (ADS)

    Denda, Anneke; Stefanowitsch, Benjamin; Christiansen, Bernd

    2017-12-01

    The specific mechanisms driving trophic interactions within the pelagic community may be highly variable in different seamount systems. This study investigated the trophic structure of zooplankton and micronekton above and around Ampère and Senghor, two shallow seamounts in the subtropical and tropical Eastern Atlantic, and over the adjacent abyssal plains. Stable isotope ratios (δ13C and δ15N) were used to identify food sources and trophic positions. δ13C ranged from -24.7‰ to -15.0‰, and δ15N covered a total range of 0.9-15.9‰. Based on epipelagic particulate organic matter, zooplankton and micronekton usually occupied the 1st-3rd trophic levels, including herbivorous, omnivorous and carnivorous taxa. δ13C and δ15N values were generally lower in zooplankton and micronekton of the subtropical waters than in the tropical region, owing to differing nutrient availability and phytoplankton communities. Correlations between the δ13C and δ15N values of particulate organic matter, zooplankton, micronekton and benthopelagic fishes suggest a linear food chain based on a single energy source from primary production at Ampère Seamount, but no evidence was found for an autochthonous seamount production compared with the open ocean reference site. Between Senghor Seamount and the open ocean, δ13C signatures indicate that hydrodynamic effects at seamounts may modify the energy supply at times, but evidence for a seamount effect on the trophic structure of the pelagic communities was weak, supporting the assumption that seamount communities rely to a large extent on advected food sources.
