Sample records for open source distribution

  1. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features and provides a simple analytics module upload…

  2. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    PubMed

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open source JOELib package. As an application, the agreement of log P and TPSA between the two packages was compared for this set of compounds. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source code as well as calculated properties, along with links to PubChem resources, are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
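
    A minimal sketch of the descriptor-agreement check described in the abstract, using the open source RDKit toolkit as a stand-in for the Marvin and JOELib packages compared in the paper (the SMILES strings are a toy placeholder for the PubChem set):

    ```python
    # Compute log P and TPSA for a few molecules; running the same loop with a
    # second toolkit and diffing the outputs reproduces the agreement check.
    from rdkit import Chem
    from rdkit.Chem import Crippen, Descriptors

    smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]  # toy stand-in for PubChem
    for smi in smiles:
        mol = Chem.MolFromSmiles(smi)
        logp = Crippen.MolLogP(mol)   # Wildman-Crippen log P
        tpsa = Descriptors.TPSA(mol)  # topological polar surface area
        print(f"{smi}: logP={logp:.2f} TPSA={tpsa:.1f}")
    ```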

  3. Free for All: Open Source Software

    ERIC Educational Resources Information Center

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  4. Flows and Stratification of an Enclosure Containing Both Localised and Vertically Distributed Sources of Buoyancy

    NASA Astrophysics Data System (ADS)

    Partridge, Jamie; Linden, Paul

    2013-11-01

    We examine the flows and stratification established in a naturally ventilated enclosure containing both a localised and a vertically distributed source of buoyancy. The enclosure is ventilated through upper and lower openings which connect the space to an external ambient. Small scale laboratory experiments were carried out with water as the working medium and buoyancy driven directly by temperature differences. A point source plume gave localised heating, while the distributed source was driven by a controllable heater mat located in the side wall of the enclosure. The transient temperatures, as well as steady state temperature profiles, were recorded and are reported here. The temperature profiles inside the enclosure were found to be dependent on the effective opening area A*, a combination of the upper and lower openings, and the ratio of buoyancy fluxes from the distributed and localised sources, Ψ = Bw/Bp. Industrial CASE award with ARUP.

  5. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses a Monte-Carlo ray-tracing methodology. With the release of the SolTrace open source project, the software has adopted…

  6. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  7. Open-source hardware for medical devices.

    PubMed

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  8. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open source co-simulation framework “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
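
    A conceptual sketch of the decoupled time-stepping loop that middleware like FNCS coordinates; the two solver functions below are trivial placeholders, not actual transmission and distribution simulators:

    ```python
    # Each step, the transmission side solves with the distribution side's last
    # reported load, then the distribution side solves with the new boundary
    # voltage; the middleware's role is to manage this exchange and keep the
    # two simulators' clocks synchronized.
    def transmission_step(load_kw):
        return 1.0 - 1e-5 * load_kw        # placeholder: voltage sags with load

    def distribution_step(voltage_pu):
        return 5000.0 * voltage_pu ** 2    # placeholder: voltage-dependent load

    voltage, load = 1.0, 5000.0
    for t in range(10):
        voltage = transmission_step(load)
        load = distribution_step(voltage)
        print(f"t={t}s: V={voltage:.4f} pu, load={load:.1f} kW")
    ```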

  9. Cyberscience and the Knowledge-Based Economy. Open Access and Trade Publishing: From Contradiction to Compatibility with Non-Exclusive Copyright Licensing

    ERIC Educational Resources Information Center

    Armbruster, Chris

    2008-01-01

    Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge…

  10. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    PubMed

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  11. Open-Source Intelligence in the Czech Military: Knowledge System and Process Design

    DTIC Science & Technology

    2002-06-01

    OSINT, as one of the intelligence disciplines, bears some of the general problems of the intelligence "business"… Knowledge work is the core business of military intelligence… Naval Postgraduate School thesis, Monterey, California; approved for public release, distribution is unlimited.

  12. Developing open-source codes for electromagnetic geophysics using industry support

    NASA Astrophysics Data System (ADS)

    Key, K.

    2017-12-01

    Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.

  13. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data with a simple method such as HTTP directory listing for long-term preservation, while DARTS tries to provide rich web applications for ease of access with modern web technologies based on open source software. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS). As a WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data. The main purpose of this application is public outreach. The NASA World Wind Java SDK is used for development. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations. It uses Highcharts to draw graphs on web browsers. Flow is a tool to simulate the Field-Of-View of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS, and the license is the BSD 3-Clause License. The SPICE Toolkit is essential to compile Flow. The SPICE Toolkit is also open source software developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool to integrate DARTS services.
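
    A minimal sketch of how a client pulls imagery from a WMS endpoint such as the MapServer instance behind KADIAS, using the open source OWSLib package; the URL and layer name are hypothetical placeholders, not the actual DARTS service:

    ```python
    from owslib.wms import WebMapService

    # hypothetical endpoint and layer, for illustration only
    wms = WebMapService("https://example.org/cgi-bin/mapserv?map=moon.map", version="1.1.1")
    img = wms.getmap(
        layers=["kaguya_dem"],
        srs="EPSG:4326",
        bbox=(-180, -90, 180, 90),
        size=(1024, 512),
        format="image/png",
    )
    with open("moon.png", "wb") as f:
        f.write(img.read())
    ```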

  14. "WWW.MDTF.ORG": a World Wide Web forum for developing open-architecture, freely distributed, digital teaching file software by participant consensus.

    PubMed

    Katzman, G L; Morris, D; Lauman, J; Cochella, C; Goede, P; Harnsberger, H R

    2001-06-01

    To foster a community-supported evaluation process for open-source digital teaching file (DTF) development and maintenance. The mechanisms used to support this process will include standard web browsers, web servers, forum software, and custom additions to the forum software to potentially enable a mediated voting protocol. The web server will also serve as a focal point for beta and release software distribution, which is the desired end-goal of this process. We foresee that www.mdtf.org will provide for widespread distribution of open source DTF software that will include function and interface design decisions from community participation on the website forums.

  15. OpenMC In Situ Source Convergence Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldrich, Garrett Allen; Dutta, Soumya; Woodring, Jonathan Lee

    2016-05-07

    We designed and implemented an in situ version of particle source convergence detection for the OpenMC particle transport simulator. OpenMC is a Monte Carlo-based particle simulator for neutron criticality calculations. For the transport simulation to be accurate, source particles must converge on a spatial distribution. Typically, convergence is obtained by iterating the simulation for a user-settable, fixed number of steps, after which it is assumed that convergence has been achieved. We instead implement a method to detect convergence, using a stochastic oscillator to identify convergence of source particles based on their accumulated Shannon entropy. Using our in situ convergence detection, we are able to detect and begin tallying results for the full simulation once the proper source distribution has been confirmed. Our method ensures that the simulation is not started too early, by a user setting too optimistic parameters, or too late, by setting too conservative a parameter.
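
    A minimal sketch of Shannon-entropy source-convergence detection; a simple moving-window slope test stands in for the stochastic-oscillator detector the authors implement, and synthetic source sites stand in for OpenMC batches:

    ```python
    import numpy as np

    def shannon_entropy(counts):
        p = counts / counts.sum()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    def converged(entropies, window=20, tol=1e-3):
        # declare convergence once the entropy trend flattens out
        if len(entropies) < window:
            return False
        slope = np.polyfit(range(window), entropies[-window:], 1)[0]
        return abs(slope) < tol

    rng = np.random.default_rng(0)
    entropies = []
    for batch in range(200):
        # stand-in for one batch: sample source sites, bin them on a 10x10x10 mesh
        sites = rng.normal(scale=1.0 / (1 + 0.05 * batch), size=(10000, 3))
        counts, _ = np.histogramdd(sites, bins=(10, 10, 10), range=[(-3, 3)] * 3)
        entropies.append(shannon_entropy(counts.ravel()))
        if converged(entropies):
            print(f"source converged at batch {batch}; start tallying")
            break
    ```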

  16. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.

  17. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  18. EPRI and Schneider Electric Demonstrate Distributed Resource Communications

    Science.gov Websites

    The Electric Power Research Institute (EPRI) is designing, building, and testing flexible, open-source distributed resource communications spanning the Schneider Electric ADMS, open software platforms, and an open-platform home energy management system.

  19. Open source 3D printers: an appropriate technology for building low cost optics labs for the developing communities

    NASA Astrophysics Data System (ADS)

    Gwamuri, J.; Pearce, Joshua M.

    2017-08-01

    The recent introduction of RepRap (self-replicating rapid prototyper) 3-D printers and the resultant open source technological improvements have made 3-D printing affordable, enabling low-cost distributed manufacturing for individuals. This development and others, such as the rise of open-source appropriate technology (OSAT) and solar powered 3-D printing, are moving 3-D printing from an industry-based technology to one that could be used in the developing world for sustainable development. In this paper, we explore some specific technological improvements and how distributed manufacturing with open-source 3-D printing can provide open-source 3-D printable optics components for developing world communities through the ability to print less expensive and customized products. This paper presents an open-source, low-cost optical equipment library of relatively easily adapted, customizable designs that has the potential to change the way optics is taught in resource-constrained communities. The study shows that this method of scientific hardware development has the potential to enable a much broader audience to participate in optical experimentation, both as research and teaching platforms. Conclusions on the technical viability of 3-D printing to assist in development and recommendations on how developing communities can fully exploit this technology to improve the learning of optics through hands-on methods are outlined.

  20. A Dozen Years after Open Source's 1998 Birth, It's Time for "OpenTechComm"

    ERIC Educational Resources Information Center

    Still, Brian

    2010-01-01

    2008 marked the 10-year anniversary of the Open Source movement, which has had a substantial impact not only on software production and adoption, but also on the sharing and distribution of information. Technical communication as a discipline has taken some advantage of the movement or its derivative software, but this article argues not as much…

  1. Open Source Service Agent (OSSA) in the intelligence community's Open Source Architecture

    NASA Technical Reports Server (NTRS)

    Fiene, Bruce F.

    1994-01-01

    The Community Open Source Program Office (COSPO) has developed an architecture for the intelligence community's new Open Source Information System (OSIS). The architecture is a multi-phased program featuring connectivity, interoperability, and functionality. OSIS is based on a distributed architecture concept. The system is designed to function as a virtual entity. OSIS will be a restricted (non-public), user configured network employing Internet communications. Privacy and authentication will be provided through firewall protection. Connection to OSIS can be made through any server on the Internet or through dial-up modems provided the appropriate firewall authentication system is installed on the client.

  2. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Treesearch

    Rafael Moreno-Sanchez

    2006-01-01

    The aim of this paper is to provide a conceptual framework for the session: “The role of web-based Geographic Information Systems in supporting sustainable management.” The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  3. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package, WavePy, intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully-featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer some possibility for extensive improvement in terms of efficiency compared to a fully featured workstation.
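
    A minimal sketch of this kind of backend comparison: timing the 2-D FFT at the heart of split-step wave-optics propagation under two implementations, with NumPy and SciPy standing in for the MKL, OpenCV and GPU backends compared in the paper:

    ```python
    import time
    import numpy as np
    import scipy.fft

    # a 2048x2048 complex field, typical of a wave-optics phase screen
    rng = np.random.default_rng(0)
    field = (rng.random((2048, 2048)) + 1j * rng.random((2048, 2048))).astype(np.complex64)

    for name, fft2 in [("numpy", np.fft.fft2), ("scipy", scipy.fft.fft2)]:
        t0 = time.perf_counter()
        for _ in range(5):
            fft2(field)
        print(f"{name}: {(time.perf_counter() - t0) / 5:.3f} s per FFT")
    ```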

  4. Numerical Simulation of Dispersion from Urban Greenhouse Gas Sources

    NASA Astrophysics Data System (ADS)

    Nottrott, Anders; Tan, Sze; He, Yonggang; Winkler, Renato

    2017-04-01

    Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source level dynamics, local measurements, and urban scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is open source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low cost computer aided drawing and GIS, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model scalar emissions from various components of the natural gas distribution system, to study the impact of urban meteorology on mobile greenhouse gas measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of plumes, due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments. The Boussinesq approximation was applied to investigate the effects of canopy layer temperature gradients and convection on sensor footprints.
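
    For orientation, a sketch of the textbook Gaussian-plume estimate that such CFD studies improve upon; a flat-terrain model like this cannot capture the building-wake effects described above, and the power-law dispersion coefficients are illustrative assumptions rather than fitted values:

    ```python
    import numpy as np

    def plume(Q, u, x, y, z, H):
        """Gaussian plume: Q source rate (g/s), u wind speed (m/s), H source height (m)."""
        sigma_y = 0.08 * x ** 0.9    # assumed power-law dispersion, ~neutral stability
        sigma_z = 0.06 * x ** 0.85
        lateral = np.exp(-y ** 2 / (2 * sigma_y ** 2))
        vertical = (np.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
                    + np.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))  # ground reflection
        return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # concentration 50 m downwind, at 1.5 m height, for a small near-ground CH4 leak
    print(plume(Q=1.0, u=3.0, x=50.0, y=0.0, z=1.5, H=0.5), "g/m^3")
    ```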

  5. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    NASA Technical Reports Server (NTRS)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  6. Empirical tests of Zipf's law mechanism in open source Linux distribution.

    PubMed

    Maillart, T; Sornette, D; Spaeth, S; von Krogh, G

    2008-11-21

    Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
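
    A minimal sketch of the standard rank-size test for Zipf's law used in studies like this one: sort sizes in descending order and fit log(size) against log(rank); a slope near -1 over several decades indicates Zipf's law. The heavy-tailed sample below is synthetic, not the Linux package data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    sizes = rng.lognormal(mean=0.0, sigma=2.5, size=50_000)  # synthetic heavy tail
    ranked = np.sort(sizes)[::-1]
    ranks = np.arange(1, ranked.size + 1)

    sel = slice(10, 5000)  # fit the upper tail, where a power law would hold
    slope, _ = np.polyfit(np.log(ranks[sel]), np.log(ranked[sel]), 1)
    print(f"rank-size slope: {slope:.2f} (Zipf's law corresponds to -1)")
    ```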

  7. Measuring trace gas emission from multi-distributed sources using vertical radial plume mapping (VRPM) and backward Lagrangian stochastic (bLS) techniques

    USDA-ARS?s Scientific Manuscript database

    Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The accuracy of the vertical radial plume mapping (VRPM) and the backward Lagrangian (bLS) techniques with an open-path optical spectrosco...

  8. A New Architecture for Visualization: Open Mission Control Technologies

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2017-01-01

    Open Mission Control Technologies (MCT) is a new architecture for visualisation of mission data. Driven by requirements for new mission capabilities, including distributed mission operations, access to data anywhere, customization by users, synthesis of multiple data sources, and flexibility for multi-mission adaptation, Open MCT provides users with an integrated customizable environment. Developed at NASA's Ames Research Center (ARC), in collaboration with NASA's Advanced Multimission Operations System (AMMOS) and NASA's Jet Propulsion Laboratory (JPL), Open MCT is getting its first mission use on the Jason 3 Mission, and is also available in the testbed for the Mars 2020 Rover and for development use for NASA's Resource Prospector Lunar Rover. The open source nature of the project provides for use outside of space missions, including open source contributions from a community of users. The defining features of Open MCT for mission users are data integration, end user composition and multiple views. Data integration provides access to mission data across domains in one place, making data such as activities, timelines, telemetry, imagery, event timers and procedures available in one place, without application switching. End user composition provides users with layouts, which act as a canvas to assemble visualisations. Multiple views provide the capability to view the same data in different ways, with live switching of data views in place. Open MCT is browser based, and works on the desktop as well as tablets and phones, providing access to data anywhere. An early use case for mobile data access took place on the Resource Prospector (RP) Mission Distributed Operations Test, in which rover engineers in the field were able to view telemetry on their phones. We envision this capability providing decision support to on-console operators from off-duty personnel. The plug-in architecture also allows for adaptation for different mission capabilities. Different data types and capabilities may be added or removed using plugins. An API provides a means to write new capabilities and to create data adaptors. Data plugins exist for mission data sources for NASA missions. Adaptors have been written by international and commercial users. Open MCT is open source. Open source enables collaborative development across organizations and also makes the product available outside of the space community, providing a potential source of usage and ideas to drive product design and development. The combination of open source with an Apache 2 license, and distribution on GitHub, has enabled an active community of users and contributors. The spectrum of users for Open MCT is, to our knowledge, unprecedented for mission software. In addition to our NASA users, we have, through open source, had users and inquiries on projects ranging from the Internet of Things, to radio hobbyists, to farming projects. We have an active community of contributors, enabling a flow of ideas inside and outside of the space community.

  9. Open Source Live Distributions for Computer Forensics

    NASA Astrophysics Data System (ADS)

    Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele

    Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment) that contains a collection of tools wrapped up into a user friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that drives digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.

  10. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines, developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture that makes it possible to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and landscape object connectivity, and iii) run and explore simulations in many ways: using the OpenFLUID software interfaces for users (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, which is based on an open source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models licensed under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, modelling of surface-subsurface water exchanges, … At the LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, which is a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID web site: http://www.openfluid-project.org

  11. Characterizing open and non-uniform vertical heat sources: towards the identification of real vertical cracks in vibrothermography experiments

    NASA Astrophysics Data System (ADS)

    Castelo, A.; Mendioroz, A.; Celorrio, R.; Salazar, A.; López de Uralde, P.; Gorosmendi, I.; Gorostegui-Colinas, E.

    2017-05-01

    Lock-in vibrothermography is used to characterize vertical kissing and open cracks in metals. In this technique the crack heats up during ultrasound excitation, due mainly to friction between the defect's faces. We have solved the inverse problem of determining the heat source distribution produced at cracks under amplitude-modulated ultrasound excitation, which is an ill-posed problem; as a consequence, the minimization of the residual is unstable. We have stabilized the algorithm by introducing a penalty term based on the Total Variation functional. In the inversion, we combine amplitude and phase surface temperature data obtained at several modulation frequencies. Inversions of synthetic data with added noise indicate that compact heat sources are characterized accurately and that their upper contours can be retrieved for shallow heat sources. The overall shape of open and homogeneous semicircular strip-shaped heat sources representing open half-penny cracks can also be retrieved, but the reconstruction of the deeper end of the heat source loses contrast. Angle-, radius- and depth-dependent inhomogeneous heat flux distributions within these semicircular strips can also be qualitatively characterized. Reconstructions of experimental data taken on samples containing calibrated heat sources confirm the predictions from reconstructions of synthetic data. We also present inversions of experimental data obtained from a real welded Inconel 718 specimen. The results are in good qualitative agreement with the results of liquid penetrant testing.
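
    A minimal 1-D sketch of that stabilization idea: recover a compact source q from blurred, noisy data d = A q + n by gradient descent on a least-squares residual plus a smoothed total-variation penalty. The forward operator A here is a made-up Gaussian blur, not the thermal model from the paper:

    ```python
    import numpy as np

    n = 100
    x = np.linspace(0, 1, n)
    A = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.01)  # toy blurring forward operator
    A /= A.sum(axis=1, keepdims=True)
    q_true = ((x > 0.4) & (x < 0.6)).astype(float)      # compact "heat source"
    d = A @ q_true + 0.01 * np.random.default_rng(2).normal(size=n)

    lam, eps, step = 1e-3, 1e-6, 0.5
    q = np.zeros(n)
    for _ in range(2000):
        grad_data = A.T @ (A @ q - d)              # gradient of 0.5*||Aq - d||^2
        w = np.diff(q) / np.sqrt(np.diff(q) ** 2 + eps)
        tv_grad = np.zeros(n)                      # gradient of smoothed TV penalty
        tv_grad[:-1] -= w
        tv_grad[1:] += w
        q -= step * (grad_data + lam * tv_grad)

    support = x[q > 0.5 * q.max()]
    print(f"recovered source support: [{support.min():.2f}, {support.max():.2f}]")
    ```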

  12. Open-path spectroscopic methane detection using a broadband monolithic distributed feedback-quantum cascade laser array.

    PubMed

    Michel, Anna P M; Kapit, Jason; Witinski, Mark F; Blanchard, Romain

    2017-04-10

    Methane is a powerful greenhouse gas that has both natural and anthropogenic sources. The ability to measure methane with an integrated-path approach, such as an open/long-path sensor, would be beneficial for examining anthropogenic and natural sources in several environments, including tundra landscapes, rivers, lakes, landfills, estuaries, fracking sites, pipelines, and agricultural sites. Here a broadband monolithic distributed feedback-quantum cascade laser array was utilized as the source for an open-path methane sensor. Two telescopes were utilized for the launch (laser source) and receiver (detector) in a bistatic configuration for methane sensing across a 50 m path length. Direct-absorption spectroscopy was utilized with intrapulse tuning. Ambient methane levels were detectable, and an instrument precision of 70 ppb with 100 s averaging and 90 ppb with 10 s averaging was achieved. The sensor system was designed to work "off the grid" and utilizes batteries that are rechargeable with solar panels and wind turbines.
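
    A minimal sketch of the Beer-Lambert retrieval underlying a direct-absorption open-path measurement over a 50 m path; the effective absorption cross-section below is a placeholder, not a HITRAN parameter for the methane line the instrument uses:

    ```python
    import numpy as np

    L = 50.0         # path length, m
    sigma = 1.6e-23  # effective cross-section, m^2/molecule (placeholder value)
    n_air = 2.5e25   # air number density near the surface, molecules/m^3

    def ch4_ppm(I, I0):
        """Infer CH4 mole fraction from transmitted (I) and reference (I0) intensity."""
        absorbance = -np.log(I / I0)   # Beer-Lambert: absorbance = sigma * N * L
        N = absorbance / (sigma * L)   # absorber number density
        return N / n_air * 1e6

    print(ch4_ppm(I=0.962, I0=1.0))    # ~1.9 ppm, near ambient, for this sigma
    ```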

  13. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment of the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multi-centre clinical trial. Using the design science research approach, we developed a web-based, multi-centre clinical trial management system on Drupal, open source software distributed under the terms of the General Public License. The system was evaluated by user testing, has successfully supported several completed and ongoing clinical trials, and is available for free download. Open source clinical trial management systems are capable of supporting multi-centre clinical trials by enhancing efficiency, quality of data management and collaboration.

  14. E-Standards For Mass Properties Engineering

    NASA Technical Reports Server (NTRS)

    Cerro, Jeffrey A.

    2008-01-01

    A proposal is put forth to promote the concept of a voluntary consensus standard for mass properties engineering developed by the Society of Allied Weight Engineers (SAWE). This standard would be an e-standard, and would encompass data, data manipulation, and reporting functionality. The standard would be implemented via an open-source SAWE distribution site with full SAWE member body access. Engineering societies and global standards initiatives are progressing toward modern engineering standards, which become functioning deliverable data sets. These data sets, if properly standardized, will integrate easily between supplier and customer, enabling technically precise mass properties data exchange. The concepts of object-oriented programming support all of these requirements, and the use of a Java™-based open-source development initiative is proposed. Results are reported for activity sponsored by the NASA Langley Research Center Innovation Institute to scope out requirements for developing a mass properties engineering e-standard. An initial software distribution is proposed. Upon completion, an open-source application programming interface will be available to SAWE members for the development of more specific programming requirements that are tailored to company and project requirements. A fully functioning application programming interface will permit code extension via company proprietary techniques, as well as through continued open-source initiatives.

  15. A Web-based open-source database for the distribution of hyperspectral signatures

    NASA Astrophysics Data System (ADS)

    Ferwerda, J. G.; Jones, S. D.; Du, Pei-Jun

    2006-10-01

    With the coming of age of field spectroscopy as a non-destructive means to collect information on the physiology of vegetation, there is a need for storage of signatures and, more importantly, their metadata. Without the proper organisation of metadata, the signatures themselves are of limited use. In order to facilitate re-distribution of data, a database for the storage and distribution of hyperspectral signatures and their metadata was designed. The database was built using open-source software, and can be used by the hyperspectral community to share their data. Data is uploaded through a simple web-based interface. The database recognizes major file formats by ASD, GER and International Spectronics. The database source code is available for download through the hyperspectral.info web domain, and we invite suggestions for additions and modifications to the database through the online forums on the same website.

  16. Method for image reconstruction of moving radionuclide source distribution

    DOEpatents

    Stolin, Alexander V.; McKisson, John E.; Lee, Seung Joon; Smith, Mark Frederick

    2012-12-18

    A method for image reconstruction of moving radionuclide distributions. Its particular embodiment is for single photon emission computed tomography (SPECT) imaging of awake animals, though its techniques are general enough to be applied to other moving radionuclide distributions as well. The invention eliminates motion and blurring artifacts for image reconstructions of moving source distributions. This opens new avenues in the area of small animal brain imaging with radiotracers, which can now be performed without the perturbing influences of anesthesia or physical restraint on the biological system.

  17. OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing

    NASA Astrophysics Data System (ADS)

    Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping

    2017-02-01

    The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.
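
    A language-agnostic sketch of the manager/worker pattern that frameworks like OpenCluster generalize across nodes, shown here with Python's multiprocessing pool rather than OpenCluster's own API:

    ```python
    from multiprocessing import Pool

    def process_frame(frame_id):
        # stand-in for one independent unit of telescope data processing
        return frame_id, sum(i * i for i in range(10_000))

    if __name__ == "__main__":
        with Pool(processes=4) as pool:
            # workers pull tasks as they become free; results arrive unordered
            for frame_id, result in pool.imap_unordered(process_frame, range(20)):
                print(f"frame {frame_id} done: {result}")
    ```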

  18. Shipping Science Worldwide with Open Source Containers

    NASA Astrophysics Data System (ADS)

    Molineaux, J. P.; McLaughlin, B. D.; Pilone, D.; Plofchan, P. G.; Murphy, K. J.

    2014-12-01

    Scientific applications often present difficult web-hosting needs. Their compute- and data-intensive nature, as well as an increasing need for high availability and distribution, combine to create a challenging set of hosting requirements. In the past year, advancements in container-based virtualization and related tooling have offered new lightweight and flexible ways to accommodate diverse applications with all the isolation and portability benefits of traditional virtualization. This session will introduce and demonstrate an open-source, single-interface Platform-as-a-Service (PaaS) that empowers application developers to seamlessly leverage geographically distributed, public and private compute resources to achieve highly-available, performant hosting for scientific applications.

  19. Open Source Tools for Numerical Simulation of Urban Greenhouse Gas Emissions

    NASA Astrophysics Data System (ADS)

    Nottrott, A.; Tan, S. M.; He, Y.

    2016-12-01

    There is a global movement toward urbanization. Approximately 7% of the global population lives in just 28 megacities, occupying less than 0.1% of the total land area used by human activity worldwide. These cities contribute a significant fraction of the global budget of anthropogenic primary pollutants and greenhouse gases. The 27 largest cities consume 9.9%, 9.3%, 6.7% and 3.0% of global gasoline, electricity, energy and water use, respectively. This impact motivates novel approaches to quantify and mitigate the growing contribution of megacity emissions to global climate change. Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source level dynamics, local measurements, and urban scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is open source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low cost computer aided drawing and GIS, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model methane (CH4) emissions from various components of the natural gas distribution system, to investigate the impact of urban meteorology on mobile CH4 measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of the plume due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments.

  20. The directivity of the sound radiation from panels and openings.

    PubMed

    Davy, John L

    2009-06-01

    This paper presents a method for calculating the directivity of the radiation of sound from a panel or opening, whose vibration is forced by the incidence of sound from the other side. The directivity of the radiation depends on the angular distribution of the incident sound energy in the room or duct in whose wall or end the panel or opening occurs. The angular distribution of the incident sound energy is predicted using a model which depends on the sound absorption coefficient of the room or duct surfaces. If the sound source is situated in the room or duct, the sound absorption coefficient model is used in conjunction with a model for the directivity of the sound source. For angles of radiation approaching 90 degrees to the normal to the panel or opening, the effect of the diffraction by the panel or opening, or by the finite baffle in which the panel or opening is mounted, is included. A simple empirical model is developed to predict the diffraction of sound into the shadow zone when the angle of radiation is greater than 90 degrees to the normal to the panel or opening. The method is compared with published experimental results.

  1. ImTK: an open source multi-center information management toolkit

    NASA Astrophysics Data System (ADS)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development, while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  2. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    NASA Astrophysics Data System (ADS)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems. Geospatial capabilities for mapping, visualization, and spatial analysis will be important components of this new generation of Web-based interoperable information systems in the water domain. The purpose of this presentation is to increase the awareness of scientists, IT personnel and agency managers of the advantages offered by the combined use of Open Data, Open Specifications for geospatial and water-related data collection, storage and sharing, as well as mature FOSS projects for the creation of interoperable Web-based information systems in the water domain. A case study is used to illustrate how these principles and technologies can be integrated to create a system with the previously mentioned characteristics for managing and responding to flood events.

  3. Open Source Next Generation Visualization Software for Interplanetary Missions

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  4. Composition, spatial distribution and sources of macro-marine litter on the Gulf of Alicante seafloor (Spanish Mediterranean).

    PubMed

    García-Rivera, Santiago; Lizaso, Jose Luis Sánchez; Millán, Jose María Bellido

    2017-08-15

    The composition, spatial distribution and sources of marine litter in the Spanish Southeast Mediterranean were assessed. The data come from a marine litter retention programme implemented by commercial trawlers and were analysed using GIS. By weight, 75.9% was plastic, metal and glass. Glass and plastics were mainly found close to the coast. A high concentration of metal was observed in some isolated zones of both open and coastal waters. Fishing activity was the source of 29.16% of the macro-marine litter, almost 68.1% of the plastics, and 25.1% of the metal. The source of the other 60.84% could not be directly identified, revealing the high degree of uncertainty regarding its specific origin. Indirectly, however, a qualitative analysis of marine traffic shows that the likely sources were merchant ships, mainly in open waters, and recreational and fishing vessels in coastal waters. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Open source tools for large-scale neuroscience.

    PubMed

    Freeman, Jeremy

    2015-06-01

    New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  6. Support of Multidimensional Parallelism in the OpenMP Programming Model

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Jost, Gabriele

    2003-01-01

    OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations which often affect the scalability of applications. Examples of these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model which allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test with benchmark codes and a cloud modeling code.

  7. Predatory Publishing, Questionable Peer Review, and Fraudulent Conferences

    PubMed Central

    2014-01-01

    Open-access is a model for publishing scholarly, peer-reviewed journals on the Internet that relies on sources of funding other than subscription fees. Some publishers and editors have exploited the author-pays model of open-access, publishing for their own profit. Submissions are encouraged through widely distributed e-mails on behalf of a growing number of journals that may accept many or all submissions and subject them to little, if any, peer review or editorial oversight. Bogus conference invitations are distributed in a similar fashion. The results of these less than ethical practices might include loss of faculty member time and money, inappropriate article inclusions in curriculum vitae, and costs to the college or funding source. PMID:25657363

  8. Open source system OpenVPN in a function of Virtual Private Network

    NASA Astrophysics Data System (ADS)

    Skendzic, A.; Kovacic, B.

    2017-05-01

    The use of Virtual Private Networks (VPNs) can establish a high level of security in network communication. VPN technology enables highly secure networking over distributed or public network infrastructure. A VPN applies its own security and management rules inside networks, and can be set up over different communication channels, such as the Internet or a separate ISP communication infrastructure. A VPN creates a secure communication channel over a public network between two endpoints (computers). OpenVPN is an open source software product under the GNU General Public License (GPL) that can be used to establish VPN communication between two computers inside a business local network over public communication infrastructure. It uses special security protocols and 256-bit encryption, and it is capable of traversing network address translators (NATs) and firewalls. It allows computers to authenticate each other using a pre-shared secret key, certificates, or a username and password. This work gives a review of VPN technology with a special accent on OpenVPN. The paper also gives a comparison and the financial benefits of using open source VPN software in a business environment.

  9. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.

  10. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE PAGES

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    2017-01-16

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.
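
    DistributedFBA.jl itself is written in Julia; as a hedged illustration of the underlying computation only, the following is a minimal flux balance analysis in Python using the separate cobrapy package (not the tool described above). The SBML file name is a placeholder for any model you have locally.

```python
# A minimal FBA sketch with cobrapy (not DistributedFBA.jl, which is Julia).
import cobra
from cobra.flux_analysis import flux_variability_analysis

model = cobra.io.read_sbml_model("e_coli_core.xml")  # placeholder model file

# Single FBA: maximize the model's objective (typically biomass production).
solution = model.optimize()
print("Objective value:", solution.objective_value)

# Many FBAs at once: min/max flux for each reaction, the kind of batch
# DistributedFBA.jl spreads over threads/nodes, here run serially on a subset.
fva = flux_variability_analysis(model, reaction_list=model.reactions[:10])
print(fva)
```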

  11. Ricardo Oliveira | NREL

    Science.gov Websites

    Member of the System Modeling & Geospatial Data Science Group in the Strategic Energy Analysis Center. Publications: Oliveira, R. and Moreno, R. 2016. Harvesting, Integrating and Distributing Large Open Geospatial Datasets Using Free and Open-Source Software. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLI-B7.

  12. OpenMebius: an open source software for isotopically nonstationary 13C-based metabolic flux analysis.

    PubMed

    Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi

    2014-01-01

    The in vivo measurement of metabolic flux by (13)C-based metabolic flux analysis ((13)C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a (13)C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas (13)C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary (13)C metabolic flux analysis (INST-(13)C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms in stationary phase). Here, the development of novel open source software for INST-(13)C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) provides the function of automatically generating metabolic models for simulating isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-(13)C-MFA. Confidence intervals determined by INST-(13)C-MFA were narrower than those determined by conventional methods, indicating the potential of INST-(13)C-MFA for precise metabolic flux analysis. OpenMebius is an open source software platform for the general application of INST-(13)C-MFA.

  13. Open Source Software Compliance within the Government

    DTIC Science & Technology

    2016-12-01

    The exception to this rule is the various General Public Licenses (GPLs), which consider all distributions to contractors as outside distribution...is developed by a contractor at the government’s expense or for the government’s exclusive use. The third condition that must be met is that ERDC...executables and source code can only be offered by an authorized delivering entity to an authorized receiving entity. This means that contractors, with

  14. Automatic System for Producing and Distributing Lecture Recordings and Livestreams Using Opencast Matterhorn

    ERIC Educational Resources Information Center

    Jonach, Rafael; Ebner, Martin; Grigoriadis, Ypatios

    2015-01-01

    Lectures of courses at universities are increasingly being recorded and offered through various distribution channels to support students' learning activities. This research work aims to create an automatic system for producing and distributing high quality lecture recordings. Opencast Matterhorn is an open source platform for automated video…

  15. LSST communications middleware implementation

    NASA Astrophysics Data System (ADS)

    Mills, Dave; Schumacher, German; Lotz, Paul

    2016-07-01

    The LSST communications middleware is based on a set of software abstractions that provide standard interfaces for common communications services. The observatory requires communication between diverse subsystems, implemented by different contractors, and comprehensive archiving of subsystem status data. The Service Abstraction Layer (SAL) is implemented using open source packages that implement open standards: DDS (Data Distribution Service) for data communication and SQL (Structured Query Language) for database access. For every subsystem, abstractions for each of the telemetry datastreams, along with Command/Response and Events, have been agreed with the appropriate component vendor (such as Dome, TMA, Hexapod) and captured in ICDs (Interface Control Documents). The OpenSplice (PrismTech) Community Edition of DDS provides an LGPL-licensed distribution which may be freely redistributed. The availability of the full source code provides assurance that the project will be able to maintain it over the full 10-year survey, independent of the fortunes of the original providers.

  16. Squid - a simple bioinformatics grid.

    PubMed

    Carvalho, Paulo C; Glória, Rafael V; de Miranda, Antonio B; Degrave, Wim M

    2005-08-03

    BLAST is a widely used genetic research tool for analysis of similarity between nucleotide and protein sequences. This paper presents a software application entitled "Squid" that makes use of grid technology. The current version, as an example, is configured for BLAST applications, but adaptation for other computing-intensive repetitive tasks can be easily accomplished in the open source version. This enables the allocation of remote resources to perform distributed computing, making large BLAST queries viable without the need for high-end computers. Most distributed computing/grid solutions have complex installation procedures requiring a computer specialist, or have limitations regarding operating systems. Squid is a multi-platform, open-source program designed to "keep things simple" while offering high-end computing power for large scale applications. Squid also has an efficient fault tolerance and crash recovery system against data loss, being able to re-route jobs upon node failure and recover even if the master machine fails. Our results show that a Squid application, working with N nodes and proper network resources, can process BLAST queries almost N times faster than if working with only one computer. Squid offers high-end computing, even for the non-specialist, and is freely available at the project web site. Its open-source and binary Windows distributions contain detailed instructions and a "plug-n-play" installation containing a pre-configured example.

  17. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data life cycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence that arise at various stages of the data life cycle have been identified. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  18. Open-Source as a strategy for operational software - the case of Enki

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur; Bruland, Oddbjørn

    2014-05-01

    Since 2002, SINTEF Energy has been developing what is now known as the Enki modelling system. This development has been financed by Norway's largest hydropower producer Statkraft, motivated by a desire for distributed hydrological models in operational use. As the owner of the source code, Statkraft has recently decided on Open Source as a strategy for further development, and for migration from an R&D context to operational use. A cooperation project is currently being carried out between SINTEF Energy, 7 large Norwegian hydropower producers including Statkraft, three universities and one software company. Of course, the most immediate task is that of software maturing. A more important challenge, however, is one of gaining experience within the operational hydropower industry. A transition from lumped to distributed models is likely to also require revision of measurement programs, calibration strategy, and the use of GIS and modern data sources like weather radar and satellite imagery. On the other hand, map-based visualisations enable a richer information exchange between hydrologic forecasters and power market traders. The operating context of a distributed hydrology model within hydropower planning is far from settled. Being both a modelling framework and a library of plug-in routines to build models from, Enki supports the flexibility needed in this situation. Recent development has separated the core from the user interface, paving the way for a scripting API, cross-platform compilation, and front-end programs serving different degrees of flexibility, robustness and security. The open source strategy invites anyone to use Enki and to develop and contribute new modules. Once tested, the same modules are available for the operational versions of the program. A core challenge is to offer rigid testing procedures and mechanisms to reject routines in an operational setting, without limiting experimentation with new modules. The Open Source strategy also has implications for building and maintaining competence around the source code and the advanced hydrological and statistical routines in Enki. Originally developed by hydrologists, the Enki code is now approaching a state where maintenance requires a background in professional software development. Without the advantage of proprietary source code, both hydrologic improvements and software maintenance depend on donations or development support on a case-to-case basis, a situation well known within the open source community. It remains to be seen whether these mechanisms suffice to keep Enki at the maintenance level required by the hydropower sector. Enki is available from www.opensource-enki.org.

  19. Sunlamp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nutaro, James J

    The purpose of this model was to facilitate the design of a control system that uses fine-grained control of residential and small commercial HVAC loads to counterbalance voltage swings caused by intermittent solar power sources (e.g., rooftop panels) installed in the distribution circuit. Included are the source code and a pre-compiled 64-bit DLL for adding building HVAC loads to an OpenDSS distribution circuit. As written, the Makefile assumes you are using the Microsoft C++ development tools.

  20. Swan: A tool for porting CUDA programs to OpenCL

    NASA Astrophysics Data System (ADS)

    Harvey, M. J.; De Fabritiis, G.

    2011-04-01

    The use of modern, high-performance graphical processing units (GPUs) for acceleration of scientific computation has been widely reported. The majority of this work has used the CUDA programming model supported exclusively by GPUs manufactured by NVIDIA. An industry standardisation effort has recently produced the OpenCL specification for GPU programming. This offers the benefits of hardware independence and reduced dependence on proprietary tool-chains. Here we describe a source-to-source translation tool, "Swan", for facilitating the conversion of an existing CUDA code to use the OpenCL model, as a means to aid programmers experienced with CUDA in evaluating OpenCL and alternative hardware. While the performance of equivalent OpenCL and CUDA code on fixed hardware should be comparable, we find that a real-world CUDA application ported to OpenCL exhibits an overall 50% increase in runtime, a reduction in performance attributable to the immaturity of contemporary compilers. The ported application is shown to have platform independence, running on both NVIDIA and AMD GPUs without modification. We conclude that OpenCL is a viable platform for developing portable GPU applications but that the more mature CUDA tools continue to provide best performance. Program summary: Program title: Swan. Catalogue identifier: AEIH_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIH_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU Public License version 2. No. of lines in distributed program, including test data, etc.: 17 736. No. of bytes in distributed program, including test data, etc.: 131 177. Distribution format: tar.gz. Programming language: C. Computer: PC. Operating system: Linux. RAM: 256 Mbytes. Classification: 6.5. External routines: NVIDIA CUDA, OpenCL. Nature of problem: Graphical Processing Units (GPUs) from NVIDIA are preferentially programmed with the proprietary CUDA programming toolkit. An alternative programming model promoted as an industry standard, OpenCL, provides similar capabilities to CUDA and is also supported on non-NVIDIA hardware (including multicore x86 CPUs, AMD GPUs and IBM Cell processors). The adaptation of a program from CUDA to OpenCL is relatively straightforward but laborious. The Swan tool facilitates this conversion. Solution method: Swan performs a translation of CUDA kernel source code into an OpenCL equivalent. It also generates the C source code for entry point functions, simplifying kernel invocation from the host program. A concise host-side API abstracts the CUDA and OpenCL APIs. A program adapted to use Swan has no dependency on the CUDA compiler for the host-side program. The converted program may be built for either CUDA or OpenCL, with the selection made at compile time. Restrictions: No support for CUDA C++ features. Running time: Nominal.
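
    To illustrate the flavour of the source-to-source mapping a tool like Swan automates, the following toy Python sketch rewrites a few well-known CUDA tokens into their OpenCL counterparts. It is not Swan's implementation (Swan is written in C), and a real translator must also handle address-space qualifiers on kernel arguments, launch syntax, and much more.

```python
# Toy CUDA-to-OpenCL token rewriting; the mappings are standard
# correspondences between the two models, but this is not Swan's code.
import re

TOKEN_MAP = {
    r"\b__global__\b": "__kernel",
    r"\b__shared__\b": "__local",
    r"\bthreadIdx\.x\b": "get_local_id(0)",
    r"\bblockIdx\.x\b": "get_group_id(0)",
    r"\bblockDim\.x\b": "get_local_size(0)",
    r"\b__syncthreads\(\)": "barrier(CLK_LOCAL_MEM_FENCE)",
}

def cuda_to_opencl(src: str) -> str:
    # Apply each regex substitution over the kernel source text.
    for pattern, repl in TOKEN_MAP.items():
        src = re.sub(pattern, repl, src)
    return src

cuda = ("__global__ void add(float *a) { "
        "int i = blockIdx.x * blockDim.x + threadIdx.x; a[i] += 1.0f; }")
print(cuda_to_opencl(cuda))
```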

  1. Status, upgrades, and advances of RTS2: the open source astronomical observatory manager

    NASA Astrophysics Data System (ADS)

    Kubánek, Petr

    2016-07-01

    RTS2 is an open source observatory control system. In development since early 2000, it has continued to receive new features over the last two years. RTS2 is a modular, network-based distributed control system, featuring telescope drivers with advanced tracking and pointing capabilities, fast camera drivers, and high-level modules for the "business logic" of the observatory, connected to an SQL database. Running on all continents of the planet, it has accumulated extensive experience controlling partial and complete observatory setups.

  2. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  3. Using WNTR to Model Water Distribution System Resilience

    EPA Science Inventory

    The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of di...
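
    As a hedged illustration of typical WNTR usage (based on the package's commonly documented API; the .inp file name is a placeholder for any EPANET input file you have):

```python
# Minimal WNTR sketch: load an EPANET model, run a hydraulic simulation,
# and inspect node pressures. "Net3.inp" is a placeholder file name.
import wntr

wn = wntr.network.WaterNetworkModel("Net3.inp")   # build the network model
sim = wntr.sim.EpanetSimulator(wn)                # EPANET-based simulator
results = sim.run_sim()                           # time-series results

pressure = results.node["pressure"]               # pandas DataFrame
print(pressure.iloc[:3, :3])                      # first timesteps and nodes
```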

  4. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  5. The eGo grid model: An open-source and open-data based synthetic medium-voltage grid model for distribution power supply systems

    NASA Astrophysics Data System (ADS)

    Amme, J.; Pleßmann, G.; Bühler, J.; Hülk, L.; Kötter, E.; Schwaegerl, P.

    2018-02-01

    The increasing integration of renewable energy into the electricity supply system creates new challenges for distribution grids. The planning and operation of distribution systems requires appropriate grid models that consider the heterogeneity of existing grids. In this paper, we describe a novel method to generate synthetic medium-voltage (MV) grids, which we applied in our DIstribution Network GeneratOr (DINGO). DINGO is open-source software and uses freely available data. Medium-voltage grid topologies are synthesized based on location and electricity demand in defined demand areas. For this purpose, we use GIS data containing demand areas with high-resolution spatial data on physical properties, land use, energy, and demography. The grid topology is treated as a capacitated vehicle routing problem (CVRP) combined with a local search metaheuristic. We also consider the current planning principles for MV distribution networks, paying special attention to line congestion and voltage limit violations. In the modelling process, we included power flow calculations for validation. The resulting grid model datasets contain 3608 synthetic MV grids in high resolution, covering all of Germany and taking local characteristics into account. We compared the modelled networks with real network data. In terms of the number of transformers and total cable length, we conclude that the method presented in this paper generates realistic grids that could be used to implement a cost-optimised electrical energy system.
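
    The following toy Python sketch shows the flavour of capacitated routing that such a synthesis step builds on: greedy nearest-neighbour route construction under a capacity limit. It is purely illustrative and is not DINGO's code, which combines a CVRP solver with local search.

```python
# Toy capacitated routing: grow each route by repeatedly visiting the
# nearest node whose demand still fits; assumes every demand <= capacity.
import math

def nearest_neighbor_cvrp(depot, nodes, demand, capacity):
    """depot: (x, y); nodes: {name: (x, y)}; demand: {name: float}."""
    unvisited, routes = set(nodes), []
    while unvisited:
        route, load, pos = [], 0.0, depot
        while True:
            feasible = [n for n in unvisited if load + demand[n] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda n: math.dist(pos, nodes[n]))
            route.append(nxt)
            load += demand[nxt]
            pos = nodes[nxt]
            unvisited.discard(nxt)
        routes.append(route)
    return routes

nodes = {"a": (0, 1), "b": (2, 0), "c": (1, 1), "d": (3, 2)}
demand = {"a": 1, "b": 2, "c": 1, "d": 2}
print(nearest_neighbor_cvrp((0, 0), nodes, demand, capacity=3))
```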

  6. An informatics model for guiding assembly of telemicrobiology workstations for malaria collaborative diagnostics using commodity products and open-source software.

    PubMed

    Suhanic, West; Crandall, Ian; Pennefather, Peter

    2009-07-17

    Deficits in clinical microbiology infrastructure exacerbate global infectious disease burdens. This paper examines how commodity computation, communication, and measurement products combined with open-source analysis and communication applications can be incorporated into laboratory medicine microbiology protocols. Those commodity components are all now sourceable globally. An informatics model is presented for guiding the use of low-cost commodity components and free software in the assembly of clinically useful and usable telemicrobiology workstations. The model incorporates two general principles: 1) collaborative diagnostics, where free and open communication and networking applications are used to link distributed collaborators for reciprocal assistance in organizing and interpreting digital diagnostic data; and 2) commodity engineering, which leverages globally available consumer electronics and open-source informatics applications to build generic open systems that measure needed information in ways substantially equivalent to more complex proprietary systems. Routine microscopic examination of Giemsa and fluorescently stained blood smears for diagnosing malaria is used as an example to validate the model. The model is used as a constraint-based guide for the design, assembly, and testing of a functioning, open, and commoditized telemicroscopy system that supports distributed acquisition, exploration, analysis, interpretation, and reporting of digital microscopy images of stained malarial blood smears while also supporting remote diagnostic tracking, quality assessment and diagnostic process development. The open telemicroscopy workstation design and use-process described here can address clinical microbiology infrastructure deficits in an economically sound and sustainable manner. It can boost capacity to deal with comprehensive measurement of disease and care outcomes in individuals and groups in a distributed and collaborative fashion. The workstation enables local control over the creation and use of diagnostic data, while allowing for remote collaborative support of diagnostic data interpretation and tracking. It can enable global pooling of malaria disease information and the development of open, participatory, and adaptable laboratory medicine practices. The informatics model highlights how the larger issue of access to generic commoditized measurement, information processing, and communication technology in both high- and low-income countries can enable diagnostic services that are much less expensive, but substantially equivalent to those currently in use in high-income countries.

  7. Source Distributions of Substorm Ions Observed in the Near-Earth Magnetotail

    NASA Technical Reports Server (NTRS)

    Ashour-Abdalla, M.; El-Alaoui, M.; Peroomian, V.; Walker, R. J.; Raeder, J.; Frank, L. A.; Paterson, W. R.

    1999-01-01

    This study employs Geotail plasma observations and numerical modeling to determine sources of the ions observed in the near-Earth magnetotail near midnight during a substorm. The growth phase has the low-latitude boundary layer as its most important source of ions at Geotail, but during the expansion phase the plasma mantle is dominant. The mantle distribution shows evidence of two distinct entry mechanisms: entry through a high latitude reconnection region resulting in an accelerated component, and entry through open field lines traditionally identified with the mantle source. The two entry mechanisms are separated in time, with the high-latitude reconnection region disappearing prior to substorm onset.

  8. Software LS-MIDA for efficient mass isotopomer distribution analysis in metabolic modelling.

    PubMed

    Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eisenreich, Wolfgang; Dandekar, Thomas

    2013-07-09

    The knowledge of metabolic pathways and fluxes is important to understand the adaptation of organisms to their biotic and abiotic environment. The specific distribution of stable isotope labelled precursors into metabolic products can be taken as a fingerprint of the metabolic events and dynamics through the metabolic networks. Open-source software is required that easily and rapidly calculates, from the mass spectra of labelled metabolites, derivatives and their fragments, the global isotope excess and isotopomer distribution. The open-source software "Least Square Mass Isotopomer Analyzer" (LS-MIDA) is presented, which processes experimental mass spectrometry (MS) data on the basis of metabolite information such as the number of atoms in the compound, the mass-to-charge ratio (m/e or m/z) values of the compounds and fragments under study, and the experimental relative MS intensities reflecting the enrichments of isotopomers in 13C- or 15N-labelled compounds, in comparison to the natural abundances in the unlabelled molecules. The software uses Brauman's least square method of linear regression. As a result, global isotope enrichments of the metabolite or fragment under study and the molar abundances of each isotopomer are obtained and displayed. The new software provides an open-source platform that easily and rapidly converts experimental MS patterns of labelled metabolites into isotopomer enrichments that are the basis for subsequent observation-driven analysis of pathways and fluxes, as well as for model-driven metabolic flux calculations.
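
    A minimal sketch of this kind of least-squares step in Python: measured MS intensities are modelled as a linear mix of isotopomer mass patterns, and abundances are recovered by non-negative least squares. The matrices below are invented for illustration; LS-MIDA derives its own from the metabolite and fragment definitions.

```python
# Recover isotopomer abundances from a measured mass distribution by
# non-negative least squares; all numbers here are illustrative only.
import numpy as np
from scipy.optimize import nnls

# Columns: expected mass pattern (M, M+1, M+2) of each candidate labelling
# state; rows: measured m/z channels.
A = np.array([
    [0.95, 0.02, 0.01],
    [0.04, 0.93, 0.05],
    [0.01, 0.05, 0.94],
])
b = np.array([0.50, 0.30, 0.20])    # normalised measured intensities

abundances, residual = nnls(A, b)   # constrained to be non-negative
abundances /= abundances.sum()      # report as molar fractions
print(abundances, residual)
```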

  9. Poster - Thur Eve - 06: Comparison of an open source genetic algorithm to the commercially used IPSA for generation of seed distributions in LDR prostate brachytherapy.

    PubMed

    McGeachy, P; Khan, R

    2012-07-01

    In early stage prostate cancer, low dose rate (LDR) prostate brachytherapy is a favorable treatment modality, where small radioactive seeds are permanently implanted throughout the prostate. Treatment centres currently rely on a commercial optimization algorithm, IPSA, to generate seed distributions for treatment plans. However, commercial software does not allow the user access to the source code, thus reducing the flexibility for treatment planning and impeding the implementation of new and, perhaps, improved clinical techniques. An open source genetic algorithm (GA) has been encoded in MATLAB to generate seed distributions for a simplified prostate and urethra model. To assess the quality of the seed distributions created by the GA, both the GA and IPSA were used to generate seed distributions for two clinically relevant scenarios, and the quality of the GA distributions relative to IPSA distributions and clinically accepted standards for seed distributions was investigated. The first clinically relevant scenario involved generating seed distributions for three different prostate volumes (19.2 cc, 32.4 cc, and 54.7 cc). The second scenario involved generating distributions for three separate seed activities (0.397 mCi, 0.455 mCi, and 0.5 mCi). Both the GA and IPSA met the clinically accepted criteria for the two scenarios, where distributions produced by the GA were comparable to those from IPSA in terms of full coverage of the prostate by the prescribed dose and minimized dose to the urethra, which passes straight through the prostate. Further, the GA offered improved reduction of high dose regions (i.e., hot spots) within the planned target volume. © 2012 American Association of Physicists in Medicine.
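
    As a toy illustration of the general approach only (the authors' implementation is in MATLAB, and their real fitness function is dosimetric), a minimal genetic algorithm over a binary seed mask might look like this in Python:

```python
# Toy GA: candidate plans are binary masks over possible seed positions;
# the fitness here is an arbitrary stand-in, not a dosimetric score.
import random

N_POS, POP, GENS, TARGET = 40, 30, 60, 18   # hypothetical sizes

def fitness(mask):
    # Stand-in objective: prefer plans using close to TARGET seeds.
    return -abs(sum(mask) - TARGET)

def evolve():
    pop = [[random.randint(0, 1) for _ in range(N_POS)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: POP // 2]                  # keep the better half
        children = []
        while len(children) < POP - len(elite):
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, N_POS)
            child = p1[:cut] + p2[cut:]          # one-point crossover
            i = random.randrange(N_POS)
            child[i] ^= random.random() < 0.05   # occasional bit-flip
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

print(sum(evolve()))   # number of seeds in the best plan found
```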

  10. Open-Source Wax RepRap 3-D Printer for Rapid Prototyping Paper-Based Microfluidics.

    PubMed

    Pearce, J M; Anzalone, N C; Heldt, C L

    2016-08-01

    The open-source release of self-replicating rapid prototypers (RepRaps) has created a rich opportunity for low-cost distributed digital fabrication of complex 3-D objects such as scientific equipment. For example, 3-D printable reactionware devices offer the opportunity to combine open hardware microfluidic handling with lab-on-a-chip reactionware to radically reduce costs and increase the number and complexity of microfluidic applications. To further drive down the cost while improving the performance of lab-on-a-chip paper-based microfluidic prototyping, this study reports on the development of a RepRap upgrade capable of converting a Prusa Mendel RepRap into a wax 3-D printer for paper-based microfluidic applications. An open-source hardware approach is used to demonstrate a 3-D printable upgrade for the 3-D printer, which combines a heated syringe pump with the RepRap/Arduino 3-D control. The bill of materials, designs, basic assembly, and use instructions are provided, along with a completely free and open-source software tool chain. The open-source hardware device described here accelerates the potential of the nascent field of electrochemical detection combined with paper-based microfluidics by dropping the marginal cost of prototyping to nearly zero while accelerating the turnaround between paper-based microfluidic designs. © 2016 Society for Laboratory Automation and Screening.

  11. Design, implementation and practice of JBEI-ICE: an open source biological part registry platform and tools.

    PubMed

    Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D

    2012-10-01

    The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICE) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allow automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. Through its web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.
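
    A hedged sketch of what programmatic access through such a web API could look like in Python; the REST path, query parameter, token header and JSON shape below are illustrative placeholders, not ICE's documented API.

```python
# Hypothetical REST access to a parts registry; endpoint and headers are
# placeholders -- consult the ICE documentation for the real interface.
import requests

BASE = "https://public-registry.jbei.org/rest"   # hypothetical base path
headers = {"X-ICE-API-Token": "YOUR_TOKEN"}      # placeholder credential

resp = requests.get(f"{BASE}/parts", params={"limit": 10}, headers=headers)
resp.raise_for_status()
for part in resp.json():                         # assumed JSON list shape
    print(part.get("name"), part.get("partId"))
```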

  12. Computational studies for a multiple-frequency electron cyclotron resonance ion source (abstract)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alton, G.D.

    1996-03-01

    The number density of electrons, the energy (electron temperature), and the energy distribution are three of the fundamental properties which govern the performance of electron cyclotron resonance (ECR) ion sources in terms of their capability to produce high charge state ions. The maximum electron energy is affected by several processes including the ability of the plasma to absorb power. In principle, the performance of an ECR ion source can be enhanced by increasing the physical size of the ECR zone in relation to the total plasma volume. The ECR zones can be increased either in the spatial or frequency domains in any ECR ion source based on B-minimum plasma confinement principles. The former technique requires the design of a carefully tailored magnetic field geometry so that the central region of the plasma volume is a large, uniformly distributed plasma volume which surrounds the axis of symmetry, as proposed in Ref. . Present art forms of the ECR source utilize single-frequency microwave power supplies to maintain the plasma discharge; because the magnetic field distribution continually changes in this source design, the ECR zones are relegated to thin "surfaces" which surround the axis of symmetry. As a consequence of the small ECR zone in relation to the total plasma volume, the probability for stochastic heating of the electrons is quite low, thereby compromising the source performance. This handicap can be overcome by use of broadband, multiple-frequency microwave power, as evidenced by the enhanced performances of the CAPRICE and AECR ion sources when two-frequency microwave power was utilized. We have used particle-in-cell codes to simulate the magnetic field distributions in these sources and to demonstrate the advantages of using multiple, discrete frequencies over single frequencies to power conventional ECR ion sources. (Abstract Truncated)

  13. Implementation of Open-Source Web Mapping Technologies to Support Monitoring of Governmental Schemes

    NASA Astrophysics Data System (ADS)

    Pulsani, B. R.

    2015-10-01

    Several schemes are undertaken by the government to uplift the social and economic condition of people. The monitoring of these schemes is done through information technology, where the involvement of Geographic Information Systems (GIS) is lacking. To demonstrate the benefits of thematic mapping as a tool for assisting officials in making decisions, a web mapping application for three government programs, the Mother and Child Tracking System (MCTS), Telangana State Housing Corporation Limited (TSHCL) and Ground Water Quality Mapping (GWQM), has been built. Indeed, the three applications depicted the distribution of various parameters thematically and helped in identifying areas with higher and weaker distributions. Based on the three applications, the study finds that many government schemes share requirements suited to thematic mapping, and hence concludes that this kind of approach can be implemented for other schemes as well. These applications have been developed using the SharpMap C# library, which is a free and open source mapping library for developing geospatial applications. The study highlights the cost benefits of SharpMap, brings out the advantage of this library over proprietary vendors, and further discusses its advantages over other open source libraries as well.

  14. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
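
    A hedged sketch of the extraction half of such a conversion in Python, using the OpenDSSDirect.py bindings (assumed available; the feeder file name is a placeholder), with the ePHASORSIM output side omitted:

```python
# Load an OpenDSS feeder and walk its line elements, collecting the
# topology a converter would translate into the target model format.
import opendssdirect as dss

dss.run_command("Redirect IEEE13Nodeckt.dss")   # placeholder feeder model

lines = []
idx = dss.Lines.First()                         # 0 when no lines exist
while idx > 0:
    lines.append((dss.Lines.Name(), dss.Lines.Bus1(), dss.Lines.Bus2(),
                  dss.Lines.Length()))
    idx = dss.Lines.Next()                      # 0 when iteration is done

print(f"{len(lines)} line segments extracted")
```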

  15. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  16. Opendf - An Implementation of the Dual Fermion Method for Strongly Correlated Systems

    NASA Astrophysics Data System (ADS)

    Antipov, Andrey E.; LeBlanc, James P. F.; Gull, Emanuel

    The dual fermion method is a multiscale approach for solving lattice problems of interacting strongly correlated systems. In this paper, we present the opendf code, an open-source implementation of the dual fermion method applicable to fermionic single-orbital lattice models in dimensions D = 1, 2, 3 and 4. The method is built on a dynamical mean field starting point, which neglects all non-local correlations, and perturbatively adds spatial correlations. Our code is distributed as an open-source package under the GNU General Public License version 2.

  17. Steady hydromagnetic flows in open magnetic fields. II - Global flows with static zones

    NASA Technical Reports Server (NTRS)

    Tsinganos, K.; Low, B. C.

    1989-01-01

    A theoretical study of an axisymmetric steady stellar wind with a static zone is presented, with emphasis on the situation where the global magnetic field is symmetric about the stellar equator and is partially open. In this scenario, the wind escapes along open magnetic flux originating from a region at the stellar pole, outside an equatorial belt of closed magnetic field in static equilibrium. The two-dimensional balance of the pressure gradient and the inertial, gravitational, and Lorentz forces in different parts of the flow is studied, along with the interplay between external sources of energy (heating and/or cooling) distributed in the flow and the pressure distribution.

  18. BioContainers: an open-source and community-driven framework for software standardization.

    PubMed

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt frameworks, which allow software to be installed and executed under an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.
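
    As a hedged illustration of running a containerized tool from Python via the Docker SDK (docker-py); the image name and tag are examples and should be checked against the BioContainers registry before use:

```python
# Run a containerized bioinformatics tool in an isolated environment and
# capture its output; the container is removed after the run completes.
import docker

client = docker.from_env()
output = client.containers.run(
    "biocontainers/blast:v2.2.31_cv2",   # example image tag; verify it
    "blastp -version",
    remove=True,
)
print(output.decode())
```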

  19. BioContainers: an open-source and community-driven framework for software standardization

    PubMed Central

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Motivation: BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt frameworks, which allow software to be installed and executed under an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation: The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. PMID:28379341

  20. Building Energy Management Open Source Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This is the repository for Building Energy Management Open Source Software (BEMOSS), which is an open source operating system that is engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. BEMOSS offers the following key features: (1) Open source, open architecture – BEMOSS is an open source operating system that is built upon VOLTTRON – a distributed agent platform developed by Pacific Northwest National Laboratory (PNNL). BEMOSS was designed to make it easy for hardware manufacturers to seamlessly interface their devices with BEMOSS. Software developers can also contribute to adding additional BEMOSS functionalities and applications. (2) Plug & play – BEMOSS was designed to automatically discover supported load controllers (including smart thermostats, VAV/RTUs, lighting load controllers and plug load controllers) in commercial buildings. (3) Interoperability – BEMOSS was designed to work with load control devices from different manufacturers that operate on different communication technologies and data exchange protocols. (4) Cost effectiveness – Implementation of BEMOSS is deemed to be cost-effective as it was built upon a robust open source platform that can operate on a low-cost single-board computer, such as Odroid. This feature could contribute to its rapid deployment in small- or medium-sized commercial buildings. (5) Scalability and ease of deployment – With its multi-node architecture, BEMOSS provides a distributed architecture where load controllers in a multi-floor and high occupancy building could be monitored and controlled by multiple single-board computers hosting BEMOSS. This makes it possible for a building engineer to deploy BEMOSS in one zone of a building, be comfortable with its operation, and later on expand the deployment to the entire building to make it more energy efficient. (6) Ability to provide local and remote monitoring – BEMOSS provides both local and remote monitoring ability with role-based access control. (7) Security – In addition to built-in security features provided by VOLTTRON, BEMOSS provides enhanced security features, including a BEMOSS discovery approval process, encrypted core-to-node communication, a thermostat anti-tampering feature and many more. (8) Support from the Advisory Committee – BEMOSS was developed in consultation with an advisory committee from the beginning of the project. The BEMOSS advisory committee comprises representatives from 22 organizations from government and industry.

  1. New insight into California’s drought through open data

    USGS Publications Warehouse

    Read, Emily K.; Bucknell, Mary; Hines, Megan K.; Kreft, James M.; Lucido, Jessica M.; Read, Jordan S.; Schroedl, Carl; Sibley, David M.; Stephan, Shirley; Suftin, Ivan; Thongsavanh, Phethala; Van Den Hoek, Jamon; Walker, Jordan I.; Wernimont, Martin R; Winslow, Luke A.; Yan, Andrew N.

    2015-01-01

    Historically unprecedented drought in California has brought water issues to the forefront of the nation’s attention. Crucial investigations that concern water policy, management, and research, in turn, require extensive information about the quality and quantity of California’s water. Unfortunately, key sources of pertinent data are unevenly distributed and frequently hard to find. Thankfully, the vital importance of integrating water data across federal, state, tribal, academic, and private entities has recently been recognized and addressed through federal initiatives such as the Climate Data Initiative of President Obama’s Climate Action Plan and the Advisory Committee on Water Information’s Open Water Data Initiative. Here, we demonstrate an application of integrated open water data, visualized and made available online using open source software, for the purpose of exploring the impact of the current California drought. Our collaborative approach and technical tools enabled a rapid, distributed development process. Many positive outcomes have resulted: the application received recognition within and outside of the Federal Government, inspired others to visualize open water data, spurred new collaborations for our group, and strengthened the collaborative relationships within the team of developers. In this article, we describe the technical tools and collaborative process that enabled the success of the application.

  2. Datacube Services in Action, Using Open Source and Open Standards

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2016-12-01

    Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source at www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on the server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use on databases individually holding dozens of terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman is the WCS Core Reference Implementation. In our talk we present the concepts, architecture, operational services, and standardization impact of open-source rasdaman, as well as experiences made.
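
    A hedged sketch of issuing a WCPS query from Python; the service URL and coverage name are placeholders for a real rasdaman/petascope deployment, and the exact request parameters should be checked against that deployment's documentation.

```python
# Send a WCPS ProcessCoverages request to a (hypothetical) rasdaman endpoint;
# the query aggregates one time slice of a named coverage server-side.
import requests

endpoint = "https://example.org/rasdaman/ows"    # placeholder endpoint
query = ('for c in (AvgLandTemp) '
         'return encode(max(c[ansi("2014-07")]), "json")')

resp = requests.get(endpoint, params={
    "service": "WCS", "version": "2.0.1",
    "request": "ProcessCoverages", "query": query,
})
print(resp.status_code, resp.text[:200])
```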

  3. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents the research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  4. Generating the Local Oscillator "Locally" in Continuous-Variable Quantum Key Distribution Based on Coherent Detection

    NASA Astrophysics Data System (ADS)

    Qi, Bing; Lougovski, Pavel; Pooser, Raphael; Grice, Warren; Bobrek, Miljko

    2015-10-01

    Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all the existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In this paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a "locally" generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad^2, which is small enough to enable secure key distribution. This technology also opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.

  5. QuTiP: An open-source Python framework for the dynamics of open quantum systems

    NASA Astrophysics Data System (ADS)

    Johansson, J. R.; Nation, P. D.; Nori, Franco

    2012-08-01

    We present an object-oriented open-source framework for solving the dynamics of open quantum systems written in Python. Arbitrary Hamiltonians, including time-dependent systems, may be built up from operators and states defined by a quantum object class, and then passed on to a choice of master equation or Monte Carlo solvers. We give an overview of the basic structure for the framework before detailing the numerical simulation of open system dynamics. Several examples are given to illustrate the build up to a complete calculation. Finally, we measure the performance of our library against that of current implementations. The framework described here is particularly well suited to the fields of quantum optics, superconducting circuit devices, nanomechanics, and trapped ions, while also being ideal for use in classroom instruction. Catalogue identifier: AEMB_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMB_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 16 482. No. of bytes in distributed program, including test data, etc.: 213 438. Distribution format: tar.gz. Programming language: Python. Computer: i386, x86-64. Operating system: Linux, Mac OSX, Windows. RAM: 2+ Gigabytes. Classification: 7. External routines: NumPy (http://numpy.scipy.org/), SciPy (http://www.scipy.org/), Matplotlib (http://matplotlib.sourceforge.net/). Nature of problem: Dynamics of open quantum systems. Solution method: Numerical solutions to Lindblad master equation or Monte Carlo wave function method. Restrictions: Problems must meet the criteria for using the master equation in Lindblad form. Running time: A few seconds up to several tens of minutes, depending on size of underlying Hilbert space.
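
    A minimal working example of the kind of calculation QuTiP targets, Lindblad master-equation evolution of a decaying qubit (standard QuTiP API; the parameter values are arbitrary):

```python
# Evolve a decaying two-level system under the Lindblad master equation
# and track the excited-state population.
import numpy as np
from qutip import basis, destroy, mesolve, sigmaz

N = 2                                   # two-level Hilbert space
psi0 = basis(N, 1)                      # start in the excited state
a = destroy(N)                          # lowering operator
H = 0.5 * 2 * np.pi * sigmaz()          # qubit Hamiltonian (hbar = 1)
c_ops = [np.sqrt(0.1) * a]              # collapse operator: decay rate 0.1

times = np.linspace(0.0, 10.0, 100)
result = mesolve(H, psi0, times, c_ops, e_ops=[a.dag() * a])
print(result.expect[0][:5])             # population at the first timesteps
```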

  6. Simulation studies on multi-mode heat transfer from an open cavity with a flush-mounted discrete heat source

    NASA Astrophysics Data System (ADS)

    Gururaja Rao, C.; Nagabhushana Rao, V.; Krishna Das, C.

    2008-04-01

    Prominent results of a simulation study on conjugate convection with surface radiation from an open cavity with a traversable flush-mounted discrete heat source in the left wall are presented in this paper. The open cavity is considered to be of fixed height but with varying spacing between the legs. The position of the heat source is varied along the left leg of the cavity. The governing equations for the temperature distribution along the cavity are obtained by making an energy balance between the heat generated, conducted, convected and radiated. Radiation terms are tackled using the radiosity-irradiation formulation, while the view factors therein are evaluated using the crossed-string method of Hottel. The resulting non-linear partial differential equations are converted into algebraic form using a finite difference formulation and are subsequently solved by the Gauss-Seidel iterative technique. An optimum grid system comprising 111 grids along the legs of the cavity, with 30 grids in the heat source and 31 grids across the cavity, has been used. The effects of various parameters, such as surface emissivity, convection heat transfer coefficient, aspect ratio and thermal conductivity, on the important results, including the local temperature distribution along the cavity, the peak temperature in the left and right legs of the cavity, and the relative contributions of convection and radiation to heat dissipation in the cavity, are studied in great detail.
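
    Since the solution strategy is Gauss-Seidel iteration over the finite-difference equations, a toy Python version of the method on a small, diagonally dominant system (illustrative only; not the cavity model itself) is:

```python
# Gauss-Seidel iteration: sweep through the unknowns, always using the
# most recently updated values, until successive iterates converge.
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=10_000):
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(len(b)):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x

A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
b = np.array([15.0, 10.0, 10.0])
print(gauss_seidel(A, b))   # agrees with np.linalg.solve(A, b)
```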

  7. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    PubMed

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates reuse in other software packages.

  8. Open Source Cloud-Based Technologies for Bim

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface serving as a BIM data centre, which can handle large BIM datasets on a server. The server can be accessed by many users through various electronic devices anytime and anywhere, so they can view 3D models online using browsers. Nowadays, Cloud computing is progressively engaged in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups of complicated Architectural, Engineering and Construction (AEC) projects. In addition, the development of Open Source Software (OSS) has been growing rapidly and its use is becoming increasingly widespread. Although BIM and Cloud technologies are widely known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools are used: from creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages and will be distributed freely to a large community of professionals. The research work is completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  9. Free and Open Source Software for Geospatial in the field of planetary science

    NASA Astrophysics Data System (ADS)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analysis has spread quickly in the last ten years. The availability of open data and data from collaborative mapping projects has increased interest in tools, procedures and methods for handling spatially related information. Free and open source software projects devoted to geospatial data handling are proving successful, as interoperable formats and protocols allow users to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software landscape to their specific problem. In particular, the free and open source model of development mimics the scientific method very well, and researchers should be encouraged to take part in the development of these software projects, as this represents a very agile way for institutions to interact. In planetary science, geospatial free and open source software is gaining a key role in projects that commonly involve different disciplines in an international setting. Widely used software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are distributed along with their source code, and the interaction between users and developers is often close, creating a continuum between the two roles. GDAL, a widely used library for handling geospatial data, has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other data formats popular in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and free and open source GIS, open GIS formats and network protocols allow existing tools and methods developed for Earth-based problems to be extended to the study of solar system bodies. A day in the working life of a researcher using free and open source geospatial software will be presented, along with the benefits, and solutions to possible drawbacks, of the effort required to use, support and contribute to these projects.
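
    As a pointer to the GDAL support mentioned above, the sketch below shows the standard GDAL Python calls for opening a raster and reading its georeferencing; the file name is a placeholder, not a real PDS product.

        # Reading a (hypothetical) planetary raster with GDAL's Python bindings.
        from osgeo import gdal

        ds = gdal.Open("example_pds_product.img")   # placeholder PDS image
        band = ds.GetRasterBand(1)
        data = band.ReadAsArray()                   # NumPy array of pixel values
        print(ds.GetProjection())                   # map projection of the body
        print(ds.GetGeoTransform())                 # pixel-to-map transform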

  10. Noise distribution of an incubator with nebulizer at a neonatal intensive care unit in southern Taiwan.

    PubMed

    Chen, H F; Chang, Y J

    2001-06-01

    The purpose of this study was to investigate the noise distribution and sources of peak noise inside an incubator with a nebulizer at a neonatal intensive care unit of a medical center in southern Taiwan. Sound levels were monitored continuously with an electronic sound meter for 24 hours daily over a one-week period. Three working shifts (day, evening, and night) on a weekday and at the weekend (48 hours in total) were selected randomly from the one-week noise survey to observe peak noise at levels ≥ 65 dBA. Results revealed that 24.8% of the total monitoring period had sound levels at ≤ 59 dBA, 58.9% at 60-64 dBA, 10.7% at 65-69 dBA, and 5.6% at ≥ 70 dBA. Furthermore, a total of 947 peak noises ≥ 65 dBA were found within the 48 hours, of which 61.5% were in the range of 65-69 dBA, 24% at 70-74 dBA, 9.8% at 75-79 dBA, and 4.8% at ≥ 80 dBA. Human-related sources, accounting for 79% of peak noises, were dominant. These noises included opening and closing doors, banging the incubator hood, conversation among staff, nursing activity inside the incubator, tearing and opening paper or bags, opening and closing trash can lids, and bumping metal carts or other apparatus. Nonhuman-related sources accounted for 21%, including monitor alarms and the running of the incubator motor. Results of this study showed that the noise distribution in the incubator with nebulizer was far above the protective limit of 58 dBA suggested by the American Academy of Pediatrics in 1974. However, most peak noises could be reduced by modification of staff behavior. Therefore, the determination of noise distribution and sources of peak noise in this study is useful for further noise reduction programs.

  11. Scalable cloud without dedicated storage

    NASA Astrophysics Data System (ADS)

    Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.

    2015-05-01

    We present a prototype of a scalable computing cloud. It is intended to be deployed on a cluster without separate dedicated storage. The dedicated storage is replaced by distributed software storage. In addition, all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources as well as improving fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with a relatively low initial and maintenance cost. The solution is built from open source components such as OpenStack, Ceph, etc.

  12. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  13. Software Comparison for Renewable Energy Deployment in a Distribution Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open source packages, have the capability to simulate networks with fluctuating data values. These packages allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial package, allows for time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increases the time necessary to become familiar with the software packages.

  14. OpenStereo: Open Source, Cross-Platform Software for Structural Geology Analysis

    NASA Astrophysics Data System (ADS)

    Grohmann, C. H.; Campanha, G. A.

    2010-12-01

    Free and open source software (FOSS) is increasingly seen as a synonym of innovation and progress. Freedom to run, copy, distribute, study, change and improve the software (through access to the source code) assures a high level of positive feedback between users and developers, which results in stable, secure and constantly updated systems. Several software packages for structural geology analysis are available to the user, with commercial licenses or downloadable at no cost from the Internet. Some provide basic tools of stereographic projection, such as plotting poles, great circles, density contouring, eigenvector analysis, data rotation etc., while others perform more specific tasks, such as paleostress or geotechnical/rock stability analysis. This variety also means a wide range of input data formats, Graphical User Interface (GUI) designs and graphic export formats. The majority of packages are built for MS-Windows, and even though there are packages for the UNIX-based MacOS, there are no native packages for *nix (UNIX, Linux, BSD etc.) operating systems (OS), forcing users to run these programs with emulators or virtual machines. Those limitations led us to develop OpenStereo, an open source, cross-platform software package for stereographic projections and structural geology. The software is written in Python, a high-level, cross-platform programming language, and the GUI is designed with wxPython, which provides a consistent look regardless of the OS. Numeric operations (like matrix and linear algebra) are performed with the NumPy module, and all graphic capabilities are provided by the Matplotlib library, including on-screen plotting and graphic export to common desktop formats (emf, eps, ps, pdf, png, svg). Data input is done with simple ASCII text files, with values of dip direction and dip/plunge separated by spaces, tabs or commas. The user can open multiple files at the same time (or the same file more than once) and overlay different elements of each dataset (poles, great circles etc.). The GUI shows the opened files in a tree structure, similar to the “layers” of many illustration programs, where the vertical order of the files in the tree reflects the drawing order of the selected elements. At this stage, the software performs plotting of poles to planes, lineations, great circles, density contours and rose diagrams. A set of statistics is calculated for each file, and its eigenvalues and eigenvectors are used to suggest whether the data are clustered about a mean value or distributed along a girdle. Modified Flinn and triangular plots and histograms are also available. The next step of development will focus on tools such as merging and rotation of datasets, the possibility to save 'projects', and paleostress analysis. In its current state, OpenStereo requires Python, wxPython, NumPy and Matplotlib installed on the system. We recommend installing PythonXY or the Enthought Python Distribution on MS-Windows and MacOS machines, since all dependencies are provided. Most Linux distributions provide an easy way to install all dependencies through software repositories. OpenStereo is released under the GNU General Public License. Programmers willing to contribute are encouraged to contact the authors directly. FAPESP Grant #09/17675-5
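
    The eigenvector test mentioned above (cluster versus girdle) can be sketched in a few lines of NumPy. This is an illustration of the standard orientation-matrix calculation, not OpenStereo's source code; the measurements are placeholders.

        # Orientation-matrix eigenvalues from dip direction / dip measurements.
        import numpy as np

        planes = np.array([[120., 40.], [125., 42.], [118., 38.], [300., 70.]])
        trend = np.radians(planes[:, 0] + 180.0)    # trend of the pole
        plunge = np.radians(90.0 - planes[:, 1])    # plunge of the pole

        # Direction cosines (north, east, down) of each pole.
        v = np.column_stack([np.cos(plunge) * np.cos(trend),
                             np.cos(plunge) * np.sin(trend),
                             np.sin(plunge)])

        M = v.T @ v / len(v)                        # normalized orientation matrix
        eigvals = np.sort(np.linalg.eigvalsh(M))[::-1]
        print(eigvals)  # l1 >> l2 ~ l3 suggests a cluster; l1 ~ l2 >> l3 a girdle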

  15. Quantum key distribution with an unknown and untrusted source

    NASA Astrophysics Data System (ADS)

    Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong

    2008-05-01

    The security of a standard bidirectional “plug-and-play” quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is equivalently controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion on this subject has been made previously. In this paper, we address this question directly by presenting a quantitative security analysis of a general class of QKD protocols whose sources are unknown and untrusted. The security of the standard Bennett-Brassard 1984 protocol, the weak+vacuum decoy-state protocol, and the one-decoy-state protocol, with unknown and untrusted sources, is rigorously proved. We derive rigorous lower bounds on the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source.

  16. On the implicit density based OpenFOAM solver for turbulent compressible flows

    NASA Astrophysics Data System (ADS)

    Fürst, Jiří

    The contribution deals with the development of a coupled implicit density-based solver for compressible flows in the framework of the open source package OpenFOAM. Although the standard distribution of OpenFOAM contains several ready-made segregated solvers for compressible flows, the performance of those solvers is rather weak in the case of transonic flows. Therefore we extend the work of Shen [15] and develop an implicit semi-coupled solver. The main flow field variables are updated using the lower-upper symmetric Gauss-Seidel method (LU-SGS), whereas the turbulence model variables are updated using the implicit Euler method.

  17. Naima: a Python package for inference of particle distribution properties from nonthermal spectra

    NASA Astrophysics Data System (ADS)

    Zabalza, V.

    2015-07-01

    The ultimate goal of observing nonthermal emission from astrophysical sources is to understand the underlying particle acceleration and evolution processes, yet few tools are publicly available to infer particle distribution properties from the observed photon spectra from X-rays to VHE gamma rays. Here I present naima, an open source Python package that provides models for nonthermal radiative emission from homogeneous distributions of relativistic electrons and protons. Contributions from synchrotron, inverse Compton, nonthermal bremsstrahlung, and neutral-pion decay can be computed for a series of functional shapes of the particle energy distributions, with the possibility of using user-defined particle distribution functions. In addition, naima provides a set of functions that allow these models to be used to fit observed nonthermal spectra through an MCMC procedure, obtaining probability distribution functions for the particle distribution parameters. Here I present the models and methods available in naima and an example of their application to the understanding of a galactic nonthermal source. naima's documentation, including how to install the package, is available at http://naima.readthedocs.org.
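
    A minimal sketch of the usage pattern described, following the examples in naima's documentation (particle-distribution and radiative model classes plus the sed method); the parameter values are placeholders, and the exact call signatures should be checked against http://naima.readthedocs.org.

        # Sketch: particle distribution -> radiative model -> photon spectrum.
        import numpy as np
        import astropy.units as u
        from naima.models import ExponentialCutoffPowerLaw, InverseCompton

        electrons = ExponentialCutoffPowerLaw(
            amplitude=1e36 / u.eV, e_0=1 * u.TeV, alpha=2.5, e_cutoff=50 * u.TeV)
        ic = InverseCompton(electrons, seed_photon_fields=["CMB"])

        energies = np.logspace(-1, 2, 30) * u.TeV
        sed = ic.sed(energies, distance=2.0 * u.kpc)  # spectral energy distribution
        print(sed[:3])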

  18. FieldTrip: Open Source Software for Advanced Analysis of MEG, EEG, and Invasive Electrophysiological Data

    PubMed Central

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates reuse in other software packages. PMID:21253357

  19. THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL

    EPA Science Inventory

    A toolkit for distributed hydrologic modeling at multiple scales using a geographic information system is presented. This open-source, freely available software was developed through a collaborative endeavor involving two Universities and two government agencies. Called the Auto...

  20. Software-based measurement of thin filament lengths: an open-source GUI for Distributed Deconvolution analysis of fluorescence images

    PubMed Central

    Gokhin, David S.; Fowler, Velia M.

    2016-01-01

    The periodically arranged thin filaments within the striated myofibrils of skeletal and cardiac muscle have precisely regulated lengths, which can change in response to developmental adaptations, pathophysiological states, and genetic perturbations. We have developed a user-friendly, open-source ImageJ plugin that provides a graphical user interface (GUI) for super-resolution measurement of thin filament lengths by applying Distributed Deconvolution (DDecon) analysis to periodic line scans collected from fluorescence images. In the workflow presented here, we demonstrate thin filament length measurement using a phalloidin-stained cryosection of mouse skeletal muscle. The DDecon plugin is also capable of measuring distances of any periodically localized fluorescent signal from the Z- or M-line, as well as distances between successive Z- or M-lines, providing a broadly applicable tool for quantitative analysis of muscle cytoarchitecture. These functionalities can also be used to analyze periodic fluorescence signals in nonmuscle cells. PMID:27644080
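
    The kind of periodic line-scan analysis described can be illustrated with a generic peak-spacing estimate. This sketch is not the DDecon plugin; the synthetic trace and pixel size are placeholders.

        # Estimate the repeat distance of a periodic fluorescence line scan.
        import numpy as np
        from scipy.signal import find_peaks

        px = 0.05  # hypothetical pixel size in micrometers
        scan = np.sin(2 * np.pi * np.arange(500) * px / 2.0)  # ~2 um period
        peaks, _ = find_peaks(scan, height=0.5)
        spacing_um = np.diff(peaks).mean() * px
        print(f"mean repeat distance: {spacing_um:.2f} um")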

  1. Development of AN Open-Source Automatic Deformation Monitoring System for Geodetical and Geotechnical Measurements

    NASA Astrophysics Data System (ADS)

    Engel, P.; Schweimler, B.

    2016-04-01

    The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of the monitored objects over a long period. Several commercial hardware and software solutions for the realization of such monitoring measurements are available on the market. In addition to them, a research team at the Neubrandenburg University of Applied Sciences (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, which is distributed under an open source licence and free of charge. The task of managing an open source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It discusses how the development effort for networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.

  2. FRED (a Framework for Reconstructing Epidemic Dynamics): an open-source software system for modeling infectious diseases and control strategies using census-based populations.

    PubMed

    Grefenstette, John J; Brown, Shawn T; Rosenfeld, Roni; DePasse, Jay; Stone, Nathan T B; Cooley, Phillip C; Wheaton, William D; Fyshe, Alona; Galloway, David D; Sriram, Anuroop; Guclu, Hasan; Abraham, Thomas; Burke, Donald S

    2013-10-08

    Mathematical and computational models provide valuable tools that help public health planners to evaluate competing health interventions, especially for novel circumstances that cannot be examined through observational or controlled studies, such as pandemic influenza. The spread of diseases like influenza depends on the mixing patterns within the population, and these mixing patterns depend in part on local factors including the spatial distribution and age structure of the population, the distribution of size and composition of households, employment status and commuting patterns of adults, and the size and age structure of schools. Finally, public health planners must take into account the health behavior patterns of the population, patterns that often vary according to socioeconomic factors such as race, household income, and education levels. FRED (a Framework for Reconstructing Epidemic Dynamics) is a freely available open-source agent-based modeling system based closely on models used in previously published studies of pandemic influenza. This version of FRED uses open-access census-based synthetic populations that capture the demographic and geographic heterogeneities of the population, including realistic household, school, and workplace social networks. FRED epidemic models are currently available for every state and county in the United States, and for selected international locations. State and county public health planners can use FRED to explore the effects of possible influenza epidemics in specific geographic regions of interest and to help evaluate the effect of interventions such as vaccination programs and school closure policies. FRED is available under a free open source license in order to contribute to the development of better modeling tools and to encourage open discussion of modeling tools being used to evaluate public health policies. We also welcome participation by other researchers in the further development of FRED.
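
    The agent-based style of modeling that FRED implements at national scale can be illustrated with a deliberately minimal random-mixing SIR sketch; this is not FRED's code or its synthetic-population machinery, and all parameters are placeholders.

        # Toy agent-based SIR model with random mixing.
        import random

        N, beta, gamma = 1000, 0.3, 0.1
        state = ["S"] * N
        for i in random.sample(range(N), 5):   # seed five infections
            state[i] = "I"

        for day in range(100):
            infectious = [i for i in range(N) if state[i] == "I"]
            if not infectious:
                break
            for i in infectious:
                contact = random.randrange(N)               # random contact
                if state[contact] == "S" and random.random() < beta:
                    state[contact] = "I"                    # transmission
                if random.random() < gamma:                 # recovery
                    state[i] = "R"
        print(state.count("R"), "agents were eventually infected")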

  3. Affordable Open-Source Data Loggers for Distributed Measurements of Sap-Flux, Stem Growth, Relative Humidity, Temperature, and Soil Water Content

    NASA Astrophysics Data System (ADS)

    Anderson, T.; Jencso, K. G.; Hoylman, Z. H.; Hu, J.

    2015-12-01

    Characterizing the mechanisms that lead to differences in forest ecosystem productivity across complex terrain remains a challenge. This difficulty can be partially attributed to the cost of installing networks of proprietary data loggers that monitor differences in the biophysical factors contributing to tree growth. Here, we describe the development and initial application of a network of open source data loggers. These data loggers are based on the Arduino platform, but were refined into a custom printed circuit board (PCB). This reduced the cost and complexity of the data loggers, which made them cheap to reproduce and reliable enough to withstand the harsh environmental conditions experienced in Ecohydrology studies. We demonstrate the utility of these loggers for high frequency, spatially-distributed measurements of sap-flux, stem growth, relative humidity, temperature, and soil water content across 36 landscape positions in the Lubrecht Experimental Forest, MT, USA. This new data logging technology made it possible to develop a spatially distributed monitoring network within the constraints of our research budget and may provide new insights into factors affecting forest productivity across complex terrain.

  4. Generating the local oscillator "locally" in continuous-variable quantum key distribution based on coherent detection

    DOE PAGES

    Qi, Bing; Lougovski, Pavel; Pooser, Raphael C.; ...

    2015-10-21

    Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all the existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In our paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a “locally” generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad², which is small enough to enable secure key distribution. This technology opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.
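
    The pilot-aided feedforward idea can be illustrated with a toy numerical sketch: estimate the inter-laser phase drift from a pilot tone, then derotate the signal. All values are synthetic placeholders, not the experiment's parameters.

        # Toy pilot-aided feedforward phase recovery between free-running lasers.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 10000
        drift = np.cumsum(rng.normal(0.0, 0.01, n))        # random-walk phase drift

        noise = rng.normal(0, 0.05, n) + 1j * rng.normal(0, 0.05, n)
        pilot = np.exp(1j * drift) + noise                 # received pilot tone
        phase_est = np.angle(pilot)                        # feedforward estimate

        signal = np.exp(1j * drift)                        # unit-amplitude signal
        corrected = signal * np.exp(-1j * phase_est)       # derotate the signal
        residual = np.angle(corrected)
        print("residual phase variance:", residual.var())  # small, cf. 0.04 rad^2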

  5. Contaminant transport from point source on water surface in open channel flow with bed absorption

    NASA Astrophysics Data System (ADS)

    Guo, Jinlan; Wu, Xudong; Jiang, Weiquan; Chen, Guoqian

    2018-06-01

    Studying solute dispersion in channel flows is of significance for environmental and industrial applications. The two-dimensional concentration distribution for a typical case of a point-source release on the free water surface in a channel flow with bed absorption is presented by means of Chatwin's long-time asymptotic technique. Five basic characteristics of Taylor dispersion and the vertical mean concentration distribution with skewness and kurtosis modifications are also analyzed. The results reveal that bed absorption affects both the longitudinal and vertical concentration distributions and causes the contaminant cloud to concentrate in the upper layer. Additionally, the cross-sectional concentration distribution shows an asymptotic Gaussian distribution at large time which is unaffected by the bed absorption. The vertical concentration distribution is found to be nonuniform even at large time. The obtained results are essential for practical applications with strict environmental standards.
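
    For reference, a generic form of the large-time behaviour described (a Gaussian longitudinal distribution whose mass decays through bed absorption) can be written as below; the symbols (released mass M, effective dispersivity D_eff, effective advection speed U_eff, absorption-induced decay rate lambda) are standard Taylor-dispersion notation, not the paper's exact expressions.

        \bar{C}(x,t) \approx \frac{M\, e^{-\lambda t}}{\sqrt{4\pi D_{\mathrm{eff}}\, t}}
        \exp\!\left[-\frac{(x - U_{\mathrm{eff}}\, t)^2}{4 D_{\mathrm{eff}}\, t}\right]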

  6. The phonological-distributional coherence hypothesis: cross-linguistic evidence in language acquisition.

    PubMed

    Monaghan, Padraic; Christiansen, Morten H; Chater, Nick

    2007-12-01

    Several phonological and prosodic properties of words have been shown to relate to differences between grammatical categories. Distributional information about grammatical categories is also a rich source in the child's language environment. In this paper we hypothesise that such cues operate in tandem for developing the child's knowledge about grammatical categories. We term this the Phonological-Distributional Coherence Hypothesis (PDCH). We tested the PDCH by analysing phonological and distributional information in distinguishing open from closed class words and nouns from verbs in four languages: English, Dutch, French, and Japanese. We found an interaction between phonological and distributional cues for all four languages indicating that when distributional cues were less reliable, phonological cues were stronger. This provides converging evidence that language is structured such that language learning benefits from the integration of information about category from contextual and sound-based sources, and that the child's language environment is less impoverished than we might suspect.

  7. The X-ray properties of the young open cluster around alpha Persei

    NASA Technical Reports Server (NTRS)

    Randich, S.; Schmitt, J. H. M. M.; Prosser, C. F.; Stauffer, J. R.

    1995-01-01

    The observations of the 50 Myr old alpha Persei open cluster, performed with ROSAT's Position Sensitive Proportional Counter (PSPC), are discussed. The X-ray observations cover an area of about 10 sq deg. A total of 160 X-ray sources were detected. The comparison between the X-ray luminosity distribution functions of the alpha Persei sample and the Pleiades indicates that F and G-type stars in alpha Persei are more X-ray luminous than their older counterparts in the Pleiades. No significant difference was found between the distributions of the K and M-type dwarfs in the two clusters.

  8. Fall 2014 SEI Research Review Edge-Enabled Tactical Systems (EETS)

    DTIC Science & Technology

    2014-10-29

    • Effective communication and reasoning despite connectivity issues • More generally, how to make programming distributed algorithms with extensible... • Distributed collaboration in VREP simulations for 5-12 quadcopters and ground robots • Open-source middleware and algorithms released to community... • Integration into CMU Drone-RK quadcopter and Platypus autonomous boat platforms • Presentations at DARPA (CODE), AFRL C4I Workshop, and AFRL Eglin

  9. An open source high-performance solution to extract surface water drainage networks from diverse terrain conditions

    USGS Publications Warehouse

    Stanislawski, Larry V.; Survila, Kornelijus; Wendel, Jeffrey; Liu, Yan; Buttenfield, Barbara P.

    2018-01-01

    This paper describes a workflow for automating the extraction of elevation-derived stream lines using open source tools with parallel computing support and testing the effectiveness of procedures in various terrain conditions within the conterminous United States. Drainage networks are extracted from the US Geological Survey 1/3 arc-second 3D Elevation Program elevation data having a nominal cell size of 10 m. This research demonstrates the utility of open source tools with parallel computing support for extracting connected drainage network patterns and handling depressions in 30 subbasins distributed across humid, dry, and transitional climate regions and in terrain conditions exhibiting a range of slopes. Special attention is given to low-slope terrain, where network connectivity is preserved by generating synthetic stream channels through lake and waterbody polygons. Conflation analysis compares the extracted streams with a 1:24,000-scale National Hydrography Dataset flowline network and shows that similarities are greatest for second- and higher-order tributaries.

  10. Towards Full-Waveform Ambient Noise Inversion

    NASA Astrophysics Data System (ADS)

    Sager, Korbinian; Ermert, Laura; Afanasiev, Michael; Boehm, Christian; Fichtner, Andreas

    2017-04-01

    Noise tomography usually works under the assumption that the inter-station ambient noise correlation is equal to a scaled version of the Green function between the two receivers. This assumption, however, is only met under specific conditions, e.g. wavefield diffusivity and equipartitioning, or the isotropic distribution of both mono- and dipolar uncorrelated noise sources. These assumptions are typically not satisfied in the Earth. This inconsistency inhibits the exploitation of the full waveform information contained in noise correlations in order to constrain Earth structure and noise generation. To overcome this limitation, we attempt to develop a method that consistently accounts for the distribution of noise sources, 3D heterogeneous Earth structure and the full seismic wave propagation physics. This is intended to improve the resolution of tomographic images, to refine noise source distribution, and thereby to contribute to a better understanding of both Earth structure and noise generation. First, we develop an inversion strategy based on a 2D finite-difference code using adjoint techniques. To enable a joint inversion for noise sources and Earth structure, we investigate the following aspects: i) the capability of different misfit functionals to image wave speed anomalies and source distribution and ii) possible source-structure trade-offs, especially to what extent unresolvable structure can be mapped into the inverted noise source distribution and vice versa. In anticipation of real-data applications, we present an extension of the open-source waveform modelling and inversion package Salvus (http://salvus.io). It allows us to compute correlation functions in 3D media with heterogeneous noise sources at the surface and the corresponding sensitivity kernels for the distribution of noise sources and Earth structure. By studying the effect of noise sources on correlation functions in 3D, we validate the aforementioned inversion strategy and prepare the workflow necessary for the first application of full waveform ambient noise inversion to a global dataset, for which a model for the distribution of noise sources is already available.
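
    The basic noise-correlation operation underlying this work can be sketched numerically: cross-correlating two simultaneous noise records recovers the inter-station travel time, which under idealized source assumptions approximates the Green function. The traces below are synthetic placeholders.

        # Toy inter-station noise cross-correlation.
        import numpy as np

        rng = np.random.default_rng(1)
        fs, nsec = 10.0, 600                 # 10 Hz sampling, ten minutes
        noise = rng.normal(size=int(fs * nsec))
        lag = 25                             # hypothetical travel time: 2.5 s
        sta_a = noise
        sta_b = np.roll(noise, lag) + 0.1 * rng.normal(size=noise.size)

        xcorr = np.correlate(sta_b, sta_a, mode="full")
        lags = np.arange(-noise.size + 1, noise.size) / fs
        print("peak at lag (s):", lags[np.argmax(xcorr)])   # ~ +2.5 s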

  11. Mapping lacustrine syn-rift reservoir distribution using spectral attributes: A case study of the Pematang Brownshale Central Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Haris, A.; Yustiawan, R.; Riyanto, A.; Ramadian, R.

    2017-07-01

    Pematang Brownshale is a lake sediment that is proven to be the main source rock in the Malacca Strait area. So far the Brownshale has been considered only as a source rock, but well data show intercalated sand layers within the Pematang Brownshale, and several downhole tests proved this series to be a potential hydrocarbon reservoir. The Pematang formation is a syn-rift sequence deposited in the Malacca Strait following the opening of the Central Sumatra basin during the Late Cretaceous to early Oligocene, proven as both potential source rock and reservoir. The aim of the study is to identify the distribution of the sandstone reservoir in the Pematang Brownshale using spectral attributes. The work was carried out by integrating log data analysis and frequency maps extracted from the Continuous Wavelet Transform (CWT) spectral attribute. All these data are used to delineate the reservoir distribution in the Pematang Brownshale. Based on the CWT analysis, the anomalies are visible only on the 15 and 10 Hz frequency maps, which are categorized as low frequencies. A low-frequency shadow anomaly is commonly used as an indication of the presence of hydrocarbons. The distribution of these anomalies covers an area of approximately 3840.66 acres (equal to 1554.25 ha), where the low-frequency pattern is interpreted as a deltaic lacustrine feature. By considering the Pematang Brownshale of the Malacca Strait area as a potential reservoir, it would open a new play for other basins with similar characteristics.
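
    Single-frequency CWT maps of the kind described can be sketched with PyWavelets; the 15 and 10 Hz targets follow the abstract, while the trace, sample interval, and wavelet choice are placeholders.

        # Single-frequency amplitude extraction with the continuous wavelet transform.
        import numpy as np
        import pywt

        dt = 0.002                                   # 2 ms sample interval
        trace = np.random.default_rng(2).normal(size=1000)  # stand-in seismic trace

        freqs_hz = np.array([15.0, 10.0])            # target frequencies (Hz)
        center = pywt.central_frequency("morl")      # Morlet center frequency
        scales = center / (freqs_hz * dt)            # scales for those frequencies

        coeffs, freqs = pywt.cwt(trace, scales, "morl", sampling_period=dt)
        amplitude_map = np.abs(coeffs)               # one row per target frequency
        print(freqs)                                 # ~[15., 10.] Hz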

  12. Cloud-Based Distributed Control of Unmanned Systems

    DTIC Science & Technology

    2015-04-01

    during mission execution. At best, the data is saved onto hard drives and is accessible only by the local team. Data history in a form available and... The following open source technologies: GeoServer, OpenLayers, PostgreSQL, and PostGIS are chosen to implement the back-end database and server. A brief... geospatial map data. 3. PostgreSQL: An SQL-compliant object-relational database that easily scales to accommodate large amounts of data - upwards to...

  13. A review on the sources and spatial-temporal distributions of Pb in Jiaozhou Bay

    NASA Astrophysics Data System (ADS)

    Yang, Dongfang; Zhang, Jie; Wang, Ming; Zhu, Sixi; Wu, Yunjie

    2017-12-01

    This paper provides a review of the sources, spatial distribution, and temporal variations of Pb in Jiaozhou Bay, based on investigations of Pb in surface and bottom waters in different seasons during 1979-1983. The strengths of the Pb sources in Jiaozhou Bay showed increasing trends, and the pollution level of Pb in this bay was slight or moderate in the early stage of reform and opening-up. Pb contents in the marine bay were mainly determined by the strength and frequency of Pb inputs from human activities, and Pb could move from high-content areas to low-content areas in the ocean interior. Surface waters in the ocean were polluted by human activities, and bottom waters were polluted through the vertical movement of water. The process of spatial distribution of Pb in the waters included three steps: 1) Pb was transferred to surface waters in the bay, 2) Pb was transported through the surface waters, and 3) Pb was transferred to and accumulated in bottom waters.

  14. Cheminformatics and the Semantic Web: adding value with linked data and enhanced provenance

    PubMed Central

    Frey, Jeremy G; Bird, Colin L

    2013-01-01

    Cheminformatics is evolving from being a field of study associated primarily with drug discovery into a discipline that embraces the distribution, management, access, and sharing of chemical data. The relationship with the related subject of bioinformatics is becoming stronger and better defined, owing to the influence of Semantic Web technologies, which enable researchers to integrate heterogeneous sources of chemical, biochemical, biological, and medical information. These developments depend on a range of factors: the principles of chemical identifiers and their role in relationships between chemical and biological entities; the importance of preserving provenance and properly curated metadata; and an understanding of the contribution that the Semantic Web can make at all stages of the research lifecycle. The movements toward open access, open source, and open collaboration all contribute to progress toward the goals of integration. PMID:24432050

  15. An open-source model and solution method to predict co-contraction in the finger.

    PubMed

    MacIntosh, Alexander R; Keir, Peter J

    2017-10-01

    A novel open-source biomechanical model of the index finger and an electromyography (EMG)-constrained static optimization solution method are developed with the goal of improving co-contraction estimates and providing a means to assess tendon tension distribution through the finger. The Intrinsic model has four degrees of freedom and seven muscles (with a 14-component extensor mechanism). A novel plugin developed for the OpenSim modelling software applied the EMG-constrained static optimization solution method. Ten participants performed static pressing in three finger postures and five dynamic free-motion tasks. Index finger 3D kinematics, force (5, 15, 30 N), and EMG (4 extrinsic muscles and the first dorsal interosseous) were used in the analysis. The Intrinsic model predicted 29% greater co-contraction during static pressing than the existing model. Further, tendon tension distribution patterns and forces, known to be essential to produce finger action, were determined by the model across all postures. The Intrinsic model and custom solution method improved co-contraction estimates to facilitate force propagation through the finger. These tools improve our interpretation of loads in the finger to develop better rehabilitation and workplace injury risk reduction strategies.
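
    The EMG-constrained static optimization idea can be sketched generically: minimize summed squared activations subject to a joint-torque balance while pinning one activation to its measured EMG level. This is an illustration with placeholder moment arms, strengths, and EMG value, not the authors' OpenSim plugin.

        # Generic EMG-constrained static optimization for one joint.
        import numpy as np
        from scipy.optimize import minimize

        r = np.array([0.01, 0.008, -0.006])      # moment arms (m), placeholders
        f_max = np.array([300.0, 250.0, 400.0])  # max muscle forces (N)
        tau_target = 1.5                         # required joint torque (N*m)
        emg = 0.35                               # measured activation of muscle 0

        cons = [
            {"type": "eq", "fun": lambda a: r @ (a * f_max) - tau_target},
            {"type": "eq", "fun": lambda a: a[0] - emg},   # EMG constraint
        ]
        res = minimize(lambda a: np.sum(a**2), x0=np.full(3, 0.2),
                       bounds=[(0, 1)] * 3, constraints=cons)
        print(res.x)   # activations satisfying the torque and EMG constraints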

  16. Citation analytics: Data exploration and comparative analyses of CiteScores of Open Access and Subscription-Based publications indexed in Scopus (2014-2016).

    PubMed

    Atayero, Aderemi A; Popoola, Segun I; Egeonu, Jesse; Oludayo, Olumuyiwa

    2018-08-01

    Citation is one of the important metrics used in measuring the relevance and impact of research publications. The potential of citation analytics may be exploited to understand the gains of publishing scholarly peer-reviewed research outputs in either Open Access (OA) sources or Subscription-Based (SB) sources in the bid to increase citation impact. However, the data required for such comparative analysis must be freely accessible for evidence-based findings and conclusions. In this data article, citation scores (CiteScores) of 2542 OA sources and 15,040 SB sources indexed in Scopus from 2014 to 2016 are presented and analyzed based on a set of five inclusion criteria. A robust dataset, which contains the CiteScores of the OA and SB publication sources included, is attached as supplementary material to this data article to facilitate further reuse. Descriptive statistics and frequency distributions of OA CiteScores and SB CiteScores are presented in tables. Boxplot representations and scatter plots are provided to show the statistical distributions of OA CiteScores and SB CiteScores across the three sub-categories (Book Series, Journal, and Trade Journal). Correlation coefficient and p-value matrices are made available within the data article. In addition, Probability Density Functions (PDFs) and Cumulative Distribution Functions (CDFs) of OA CiteScores and SB CiteScores are computed and the results are presented using tables and graphs. Furthermore, Analysis of Variance (ANOVA) and multiple comparison post-hoc tests are conducted to understand the statistical difference (and its significance, if any) in the citation impact of OA publication sources and SB publication sources based on CiteScore. In the long run, the data provided in this article will help policy makers and researchers in Higher Education Institutions (HEIs) to identify the appropriate publication source type and category for dissemination of scholarly research findings with maximum citation impact.
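
    The comparisons described (empirical distribution functions plus a one-way ANOVA) can be sketched as follows; the samples are synthetic stand-ins, not the article's dataset.

        # Empirical CDFs of two CiteScore samples and a one-way ANOVA.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        oa = rng.lognormal(mean=0.2, sigma=0.6, size=2542)    # stand-in OA scores
        sb = rng.lognormal(mean=0.4, sigma=0.6, size=15040)   # stand-in SB scores

        def ecdf(x):
            xs = np.sort(x)
            return xs, np.arange(1, xs.size + 1) / xs.size

        xs_oa, F_oa = ecdf(oa)
        print("OA median:", xs_oa[np.searchsorted(F_oa, 0.5)])

        f_stat, p_val = stats.f_oneway(oa, sb)                # one-way ANOVA
        print(f"F = {f_stat:.2f}, p = {p_val:.3g}")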

  17. A hybrid analytical model for open-circuit field calculation of multilayer interior permanent magnet machines

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Xia, Changliang; Yan, Yan; Geng, Qiang; Shi, Tingna

    2017-08-01

    Due to the complicated rotor structure and nonlinear saturation of rotor bridges, it is difficult to build a fast and accurate analytical field calculation model for multilayer interior permanent magnet (IPM) machines. In this paper, a hybrid analytical model suitable for the open-circuit field calculation of multilayer IPM machines is proposed by coupling the magnetic equivalent circuit (MEC) method and the subdomain technique. In the proposed analytical model, the rotor magnetic field is calculated by the MEC method based on Kirchhoff's laws, while the field in the stator slot, slot opening and air gap is calculated by the subdomain technique based on Maxwell's equations. To solve the whole field distribution of multilayer IPM machines, coupled boundary conditions on the rotor surface are deduced to couple the rotor MEC with the analytical field distribution of the stator slot, slot opening and air gap. The hybrid analytical model can be used to calculate the open-circuit air-gap field distribution, back electromotive force (EMF) and cogging torque of multilayer IPM machines. Compared with finite element analysis (FEA), it has the advantages of faster modeling, lower computational cost and shorter run time, while achieving comparable accuracy. The analytical model is helpful and applicable for the open-circuit field calculation of multilayer IPM machines of any size and pole/slot number combination.

  18. Quantum key distribution with an unknown and untrusted source

    NASA Astrophysics Data System (ADS)

    Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong

    2009-03-01

    The security of a standard bidirectional “plug & play” quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is equivalently controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion on this subject has been made previously. In this paper, we present the first quantitative security analysis of a general class of QKD protocols whose sources are unknown and untrusted. The security of the standard BB84 protocol, the weak+vacuum decoy-state protocol, and the one-decoy-state protocol, with unknown and untrusted sources, is rigorously proved. We derive rigorous lower bounds on the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source. Our work is published in [1]. [1] Y. Zhao, B. Qi, and H.-K. Lo, Phys. Rev. A 77, 052327 (2008).

  19. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract away from the underlying technology choices and focus on the interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was based in part on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on top of XML. Apache Solr, an open source search engine, was used to drive our search interface and as a way to store references to metadata and data exposed via REST endpoints. As was the case with Apache OODT, team experience with this component helped drive the choice. Lastly, OpenSSO, an open source single sign-on service, was used to secure and provide access constraints to our REST-based services. For this product there was little past experience, but it seemed a natural fit given our service-based approach. Given our exposure to open source, we will discuss the tradeoffs and benefits of the choices made. Moreover, we will dive into the context in which the software packages were used and the impact their design and extensibility had on the construction of the infrastructure. Finally, we will compare our experiences across the open source solutions and the attributes that can vary the impression one gets of them. This comprehensive account of our endeavor should aid others in their assessment and use of open source.
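
    As an illustration of the REST-style service layer described, the sketch below issues a standard Solr select query over HTTP; the host, core name, query field, and document fields are placeholders, not LMMP's actual endpoints.

        # Querying a (hypothetical) Solr core through its standard select API.
        import requests

        resp = requests.get(
            "http://localhost:8983/solr/lunar/select",
            params={"q": "product_type:DEM", "wt": "json", "rows": 10},
        )
        docs = resp.json()["response"]["docs"]
        for doc in docs:
            print(doc.get("id"))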

  20. Depositional environment and distribution of Late Cretaceous "source rocks" from Costa Rica to West Africa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erlich, R.N.; Sofer, Z.; Pratt, L.M.

    1993-02-01

    Late Cretaceous "source rocks" from Costa Rica, western and eastern Venezuela, and Trinidad were studied using organic and inorganic geochemistry, biostratigraphy, and sedimentology in order to determine their depositional environments. Bulk mineralogy and major element geochemistry for 304 samples were combined with Rock-Eval data and extract biomarker analysis to infer the types and distributions of the various Late Cretaceous productivity systems represented in the dataset. When data from this study are combined with published and proprietary data from offshore West Africa, Guyana/Suriname, and the central Caribbean, they show that these Late Cretaceous units can be correlated by their biogeochemical characteristics to establish their temporal and spatial relationships. Paleogeographic maps constructed for the early to late Cenomanian, Turonian, Coniacian to middle Santonian, and late Santonian to latest Campanian show that upwelling and excessive fluvial runoff were probably the dominant sources of nutrient supply to the coastal productivity systems. The late Santonian to Maastrichtian rocks examined in this study indicate that organic material was poorly preserved after deposition, even though biologic productivity remained constant or changed only slightly. A rapid influx of oxygenated bottom water may have occurred following the opening of a deep water connection between the North and South Atlantic oceans, and/or separation of India from Africa and the establishment of an Antarctic oceanic connection. This study suggests that the most important factors that controlled source rock quality in northern South America were productivity, preservation, degree of clastic dilution, and subsurface diagenesis.

  1. Open-source sea ice drift algorithm for Sentinel-1 SAR imagery using a combination of feature-tracking and pattern-matching

    NASA Astrophysics Data System (ADS)

    Muckenhuber, Stefan; Sandven, Stein

    2017-04-01

    An open-source sea ice drift algorithm for Sentinel-1 SAR imagery is introduced, based on the combination of feature tracking and pattern matching. A computationally efficient feature-tracking algorithm produces an initial drift estimate and limits the search area for the pattern matching, which provides small- to medium-scale drift adjustments and normalised cross-correlation values as a quality measure. The algorithm is designed to utilise the respective advantages of the two approaches and allows drift calculation at user-defined locations. The pre-processing of the Sentinel-1 data has been optimised to retrieve a feature distribution that depends less on SAR backscatter peak values. A recommended parameter set for the algorithm has been found using a representative image pair over Fram Strait and 350 manually derived drift vectors for validation. Applying the algorithm with this parameter setting, sea ice drift retrieval with a vector spacing of 8 km on Sentinel-1 images covering 400 km x 400 km takes less than 3.5 minutes on a standard 2.7 GHz processor with 8 GB memory. For validation, buoy GPS data, collected in 2015 between 15 January and 22 April and covering an area from 81° N to 83.5° N and 12° E to 27° E, were compared to calculated drift results from 261 corresponding Sentinel-1 image pairs. We found a logarithmic distribution of the error with a peak at 300 m. All software requirements necessary for applying the presented sea ice drift algorithm are open-source to ensure free implementation and easy distribution.
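
    The two-stage approach described (feature tracking for a first guess, pattern matching for refinement) can be sketched with OpenCV; this is an illustration, not the authors' implementation, and the image files, window sizes, and test location are placeholders.

        # Feature tracking plus normalized cross-correlation refinement.
        import cv2
        import numpy as np

        img1 = cv2.imread("sar_t0.png", cv2.IMREAD_GRAYSCALE)   # placeholder pair
        img2 = cv2.imread("sar_t1.png", cv2.IMREAD_GRAYSCALE)

        orb = cv2.ORB_create(nfeatures=2000)
        k1, d1 = orb.detectAndCompute(img1, None)
        k2, d2 = orb.detectAndCompute(img2, None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

        # Median displacement of matched keypoints = initial drift estimate.
        disp = np.array([np.array(k2[m.trainIdx].pt) - np.array(k1[m.queryIdx].pt)
                         for m in matches])
        dx, dy = np.median(disp, axis=0)

        # Refine around one location with normalized cross-correlation.
        y, x, w = 200, 200, 64
        tmpl = img1[y:y + w, x:x + w]
        yc, xc = int(y + dy), int(x + dx)
        search = img2[yc - 32:yc + w + 32, xc - 32:xc + w + 32]
        score = cv2.matchTemplate(search, tmpl, cv2.TM_CCOEFF_NORMED)
        print(cv2.minMaxLoc(score)[1])   # peak correlation as a quality measure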

  2. Removing a barrier to computer-based outbreak and disease surveillance--the RODS Open Source Project.

    PubMed

    Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J

    2004-09-24

    Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September-November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.

  3. A Disk-Based System for Producing and Distributing Science Products from MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Wolfe, Robert; Sinno, Scott; Ye Gang; Teague, Michael

    2007-01-01

    Since beginning operations in 1999, the MODIS Adaptive Processing System (MODAPS) has evolved to take advantage of trends in information technology, such as the falling cost of computing cycles and disk storage and the availability of high quality open-source software (Linux, Apache and Perl), to achieve substantial gains in processing and distribution capacity and throughput while driving down the cost of system operations.

  4. RAPID and DDS

    NASA Technical Reports Server (NTRS)

    Utz, Hans Heinrich

    2011-01-01

    This talk gives an overview of the Robot Applications Programmers Interface Delegate (RAPID) as well as the distributed systems middleware Data Distribution Service (DDS). DDS is an open software standard; RAPID is cleared for open-source release under NOSA. RAPID specifies data structures and semantics for high-level telemetry published by NASA robotic software. These data structures are supported by multiple robotic platforms at Johnson Space Center (JSC), the Jet Propulsion Laboratory (JPL) and Ames Research Center (ARC), providing high-level interoperability between those platforms. DDS is used as the middleware for data transfer. The feature set of the middleware heavily influences the design decisions made in the RAPID specification, so it is appropriate to discuss both in this introductory talk.

  5. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions.

    PubMed

    Shenoy, Shailesh M

    2016-07-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.

  6. OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.

    PubMed

    Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe

    2013-02-15

    Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton.
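
    As a rough illustration of the kind of term lookup OntoMaton delegates to the NCBO Web services, the Python sketch below queries the BioPortal search endpoint. This is a hedged sketch, not code from OntoMaton itself; the API key is a hypothetical placeholder, and the endpoint and response fields follow the public BioPortal REST API as currently documented.

        # Sketch of an NCBO BioPortal term lookup, the kind of call
        # OntoMaton makes from within a spreadsheet. Endpoint and field
        # names are assumptions based on the public REST API.
        import requests

        API_KEY = "your-bioportal-api-key"  # hypothetical placeholder

        def search_terms(query, max_hits=5):
            """Return (term label, term IRI) pairs matching a free-text query."""
            resp = requests.get(
                "https://data.bioontology.org/search",
                params={"q": query, "apikey": API_KEY, "pagesize": max_hits},
                timeout=30,
            )
            resp.raise_for_status()
            return [(hit.get("prefLabel"), hit.get("@id"))
                    for hit in resp.json().get("collection", [])]

        if __name__ == "__main__":
            for label, iri in search_terms("homo sapiens"):
                print(label, iri)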

7. pyLIMA: The first open source microlensing modeling software

    NASA Astrophysics Data System (ADS)

    Bachelet, Etienne; Street, Rachel; Bozza, Valerio

    2018-01-01

Microlensing is highly sensitive to planets beyond the snowline and distributed along the line of sight towards the Galactic Bulge. The WFIRST-AFTA mission should detect about 3000 of these planets and significantly improve our knowledge of planet formation and statistics, complementing results found by transit and radial velocity methods. However, modeling microlensing events is challenging in several respects, leading to highly time-consuming analyses. After a brief summary of these challenges, I will present pyLIMA, the first open source microlensing modeling software. The goals of this software are to be flexible, powerful and user friendly. This presentation will focus on various use cases and early results.
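
    For readers unfamiliar with the modeling task, the sketch below computes the standard point-source point-lens (Paczynski) light curve that any microlensing modeler, pyLIMA included, must fit to data. It is a minimal illustration of the underlying model, not pyLIMA's actual API.

        # Point-source point-lens (PSPL) microlensing magnification: the
        # basic Paczynski model underlying microlensing modeling codes.
        import numpy as np

        def pspl_magnification(t, t0, u0, tE):
            """Magnification A(t) for impact parameter u0, peak time t0,
            and Einstein crossing time tE (consistent units, e.g. days)."""
            u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)    # lens-source separation
            return (u**2 + 2) / (u * np.sqrt(u**2 + 4))  # Paczynski magnification

        t = np.linspace(-50, 50, 1001)                   # days around the peak
        A = pspl_magnification(t, t0=0.0, u0=0.1, tE=20.0)
        print(f"peak magnification: {A.max():.1f}")      # ~1/u0 for small u0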

  8. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick Start Guide (Revision 1)

    DTIC Science & Technology

    2017-06-01

for GIFT Cloud, the web-based application version of the Generalized Intelligent Framework for Tutoring (GIFT). GIFT is a modular, open-source ... external applications. GIFT is available to users with a GIFT Account at no cost. GIFT Cloud is an implementation of GIFT. This web-based application ... Approved for public release; distribution is unlimited. Requirements for GIFT Cloud: GIFT Cloud is accessed via a web browser.

  9. Negative ion production in large volume source with small deposition of cesium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacquot, C.; Pamela, J.; Riz, D.

    1996-03-01

Experimental data on the enhancement of D⁻ (H⁻) negative ion production due to cesium injection into a large volume multiampere negative ion source (MANTIS) are described. The directed deposition of small cesium amounts (5–100 mg) from a compact, movable oven placed into the central part of the MANTIS gas-discharge box was used. A calorimetrically measured D⁻ beam with an intensity up to 1.6 A and an extracted current density up to 4.2 mA/cm² (beam energy 25 kV) was obtained. Exactly 30 mg of cesium provides at least one month of source operation (1000 pulses with a discharge pulse duration of 4 s). The effect of cesium on NI enhancement was immediately displayed after the distributed Cs deposition, but it needed some "conditioning" of cesium by tens of discharge pulses (or by several hours' "pause") in the case of a localized Cs deposition. No degradation of extraction-acceleration voltage holding within the tested range of cesium injection was observed. © 1996 American Institute of Physics.

  10. FloorspaceJS - A New, Open Source, Web-Based Geometry Editor for Building Energy Modeling (BEM): Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie

Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross-platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web-based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web-based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.

  11. Stewardship and management challenges within a cloud-based open data ecosystem (Invited Paper 211863)

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2017-12-01

    NOAA's Big Data Project is conducting an experiment in the collaborative distribution of open government data to non-governmental cloud-based systems. Through Cooperative Research and Development Agreements signed in 2015 between NOAA and Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium, NOAA is distributing open government data to a wide community of potential users. There are a number of significant advantages related to the use of open data on commercial cloud platforms, but through this experiment NOAA is also discovering significant challenges for those stewarding and maintaining NOAA's data resources in support of users in the wider open data ecosystem. Among the challenges that will be discussed are: the need to provide effective interpretation of the data content to enable their use by data scientists from other expert communities; effective maintenance of Collaborators' open data stores through coordinated publication of new data and new versions of older data; the provenance and verification of open data as authentic NOAA-sourced data across multiple management boundaries and analytical tools; and keeping pace with the accelerating expectations of users with regard to improved quality control, data latency, availability, and discoverability. Suggested strategies to address these challenges will also be described.

  12. Mercury- Distributed Metadata Management, Data Discovery and Access System

    NASA Astrophysics Data System (ADS)

    Palanisamy, Giri; Wilson, Bruce E.; Devarakonda, Ranjeet; Green, James M.

    2007-12-01

Mercury is a federated metadata harvesting, search and retrieval tool based on both open source and ORNL-developed software. It was originally developed for NASA, and the Mercury development consortium now includes funding from NASA, USGS, and DOE. Mercury supports various metadata standards including XML, Z39.50, FGDC, Dublin-Core, Darwin-Core, EML, and ISO-19115 (under development). Mercury provides a single portal to information contained in disparate data management systems. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces then allow the users to perform simple, fielded, spatial and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data. Mercury supports various projects including: ORNL DAAC, NBII, DADDI, LBA, NARSTO, CDIAC, OCEAN, I3N, IAI, ESIP and ARM. The new Mercury system is based on a Service Oriented Architecture and supports various services such as Thesaurus Service, Gazetteer Web Service and UDDI Directory Services. This system also provides various search services including: RSS, Geo-RSS, OpenSearch, Web Services and Portlets. Other features include: Filtering and dynamic sorting of search results, book-markable search results, save, retrieve, and modify search criteria.

  13. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
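
    A flavor of the first technique mentioned, locating where variables are used, can be had with the official libclang Python bindings (clang.cindex). This is a hedged sketch, not the example code shipped with the record, and it assumes libclang is installed and discoverable on the loader path.

        # Find every place a variable is referenced in a C source file,
        # using libclang's AST via the clang.cindex bindings.
        import clang.cindex as ci

        def variable_references(source_path):
            """Yield (variable name, line, column) for each variable reference."""
            tu = ci.Index.create().parse(source_path)
            for node in tu.cursor.walk_preorder():
                if node.kind == ci.CursorKind.DECL_REF_EXPR and node.referenced:
                    if node.referenced.kind in (ci.CursorKind.VAR_DECL,
                                                ci.CursorKind.PARM_DECL):
                        loc = node.location
                        yield node.spelling, loc.line, loc.column

        if __name__ == "__main__":
            for name, line, col in variable_references("example.c"):
                print(f"{name} used at {line}:{col}")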

  14. Prediction Of pKa From Chemical Structure Using Free And Open-Source Tools

    EPA Science Inventory

    The ionization state of a chemical, reflected in pKa values, affects lipophilicity, solubility, protein binding and the ability of a chemical to cross the plasma membrane. These properties govern the pharmacokinetic parameters such as absorption, distribution, metabolism, excreti...

  15. Trends and New Directions in Software Architecture

    DTIC Science & Technology

    2014-10-10

    frameworks  Open source  Cloud strategies  NoSQL  Machine Learning  MDD  Incremental approaches  Dashboards  Distributed development...complexity grows  NoSQL Models are not created equal 2014 Our Current Research  Lightweight Evaluation and Architecture Prototyping for Big Data

  16. Low Cost, Scalable Proteomics Data Analysis Using Amazon's Cloud Computing Services and Open Source Search Algorithms

    PubMed Central

    Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.

    2009-01-01

One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578
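
    The core of the recipe is launching preconfigured machine images on EC2. Below is a present-day sketch using boto3, which postdates the paper and is not the toolchain it describes; the AMI ID, key pair name, and instance type are hypothetical placeholders.

        # Launch a small virtual analysis cluster from a preconfigured AMI.
        import boto3

        ec2 = boto3.resource("ec2", region_name="us-east-1")

        instances = ec2.create_instances(
            ImageId="ami-0123456789abcdef0",  # hypothetical proteomics AMI
            InstanceType="c5.xlarge",         # size per dataset and budget
            MinCount=1,
            MaxCount=4,                       # scale the cluster horizontally
            KeyName="my-keypair",             # assumed to exist in the account
        )
        for inst in instances:
            inst.wait_until_running()
            inst.reload()                     # refresh public DNS after boot
            print(inst.id, inst.public_dns_name)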

17. Generating the local oscillator “locally” in continuous-variable quantum key distribution based on coherent detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, Bing; Lougovski, Pavel; Pooser, Raphael C.

Continuous-variable quantum key distribution (CV-QKD) protocols based on coherent detection have been studied extensively in both theory and experiment. In all the existing implementations of CV-QKD, both the quantum signal and the local oscillator (LO) are generated from the same laser and propagate through the insecure quantum channel. This arrangement may open security loopholes and limit the potential applications of CV-QKD. In our paper, we propose and demonstrate a pilot-aided feedforward data recovery scheme that enables reliable coherent detection using a “locally” generated LO. Using two independent commercial laser sources and a spool of 25-km optical fiber, we construct a coherent communication system. The variance of the phase noise introduced by the proposed scheme is measured to be 0.04 rad², which is small enough to enable secure key distribution. This technology opens the door for other quantum communication protocols, such as the recently proposed measurement-device-independent CV-QKD, where independent light sources are employed by different users.

  18. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    PubMed

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).

  19. Open-source Framework for Storing and Manipulation of Plasma Chemical Reaction Data

    NASA Astrophysics Data System (ADS)

    Jenkins, T. G.; Averkin, S. N.; Cary, J. R.; Kruger, S. E.

    2017-10-01

We present a new open-source framework for storage and manipulation of plasma chemical reaction data that has emerged from our in-house project MUNCHKIN. This framework consists of Python scripts and C++ programs. It stores data in an SQL database for fast retrieval and manipulation. For example, it is possible to fit cross-section data to the most widely used analytical expressions, calculate reaction rates for Maxwellian distribution functions of colliding particles, and fit these rates to different analytical expressions. Another important feature of this framework is the ability to calculate transport properties based on the cross-section data and supplied distribution functions. In addition, this framework allows the export of chemical reaction descriptions in LaTeX format for ease of inclusion in scientific papers. With the help of this framework it is possible to generate corresponding VSim (particle-in-cell simulation code) and USim (unstructured multi-fluid code) input blocks with the appropriate cross-sections.
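
    The central numerical step such a framework automates is the Maxwellian-averaged rate coefficient k(T) = <σv>. The sketch below is illustrative only, not MUNCHKIN code: it integrates a tabulated cross section over a Maxwellian energy distribution.

        # Rate coefficient k(T) = <sigma*v> from tabulated cross-section
        # data, averaged over a Maxwellian energy distribution.
        import numpy as np

        KB = 1.380649e-23  # Boltzmann constant, J/K
        EV = 1.602176634e-19  # electron volt, J

        def rate_coefficient(energy_eV, sigma_m2, T, mass_kg):
            """k(T) in m^3/s from sigma(E) tabulated on an eV grid."""
            E = energy_eV * EV                     # eV -> J
            v = np.sqrt(2.0 * E / mass_kg)         # collision speed
            # normalized Maxwell-Boltzmann energy distribution f(E)
            f = 2.0 * np.sqrt(E / np.pi) * (KB * T) ** -1.5 * np.exp(-E / (KB * T))
            return np.trapz(sigma_m2 * v * f, E)   # <sigma v>

        # toy cross section: constant 1e-20 m^2 above a 10 eV threshold
        E_grid = np.linspace(0.01, 100.0, 2000)
        sigma = np.where(E_grid > 10.0, 1e-20, 0.0)
        print(rate_coefficient(E_grid, sigma, T=30000.0, mass_kg=9.109e-31))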

  20. PACS for Bhutan: a cost effective open source architecture for emerging countries.

    PubMed

    Ratib, Osman; Roduit, Nicolas; Nidup, Dechen; De Geer, Gerard; Rosset, Antoine; Geissbuhler, Antoine

    2016-10-01

This paper reports the design and implementation of an innovative and cost-effective imaging management infrastructure suitable for radiology centres in emerging countries. It was implemented in the main referring hospital of Bhutan, equipped with a CT, an MRI, digital radiology, and a suite of several ultrasound units. The hospital lacked the necessary informatics infrastructure for image archiving and interpretation and needed a system for distribution of images to clinical wards. The solution developed for this project combines several open source software platforms in a robust and versatile archiving and communication system connected to analysis workstations equipped with an FDA-certified version of the highly popular open-source software. The whole system was implemented on standard off-the-shelf hardware. The system was installed in three days, and training of the radiologists as well as the technical and IT staff was provided onsite to ensure full ownership of the system by the local team. Radiologists were rapidly capable of reading and interpreting studies on the diagnostic workstations, which had a significant benefit on their workflow and ability to perform diagnostic tasks more efficiently. Furthermore, images were also made available to several clinical units on standard desktop computers through a web-based viewer. Key points:
    • Open source imaging informatics platforms can provide cost-effective alternatives for PACS.
    • Robust and cost-effective open architecture can provide adequate solutions for emerging countries.
    • Imaging informatics is often lacking in hospitals equipped with digital modalities.

  1. Easy research data handling with an OpenEarth DataLab for geo-monitoring research

    NASA Astrophysics Data System (ADS)

    Vanderfeesten, Maurice; van der Kuil, Annemiek; Prinčič, Alenka; den Heijer, Kees; Rombouts, Jeroen

    2015-04-01

    OpenEarth DataLab is an open source-based collaboration and processing platform to enable streamlined research data management from raw data ingest and transformation to interoperable distribution. It enables geo-scientists to easily synchronise, share, compute and visualise the dynamic and most up-to-date research data, scripts and models in multi-stakeholder geo-monitoring programs. This DataLab is developed by the Research Data Services team of TU Delft Library and 3TU.Datacentrum together with coastal engineers of Delft University of Technology and Deltares. Based on the OpenEarth software stack an environment has been developed to orchestrate numerous geo-related open source software components that can empower researchers and increase the overall research quality by managing research data; enabling automatic and interoperable data workflows between all the components with track & trace, hit & run data transformation processing in cloud infrastructure using MatLab and Python, synchronisation of data and scripts (SVN), and much more. Transformed interoperable data products (KML, NetCDF, PostGIS) can be used by ready-made OpenEarth tools for further analyses and visualisation, and can be distributed via interoperable channels such as THREDDS (OpenDAP) and GeoServer. An example of a successful application of OpenEarth DataLab is the Sand Motor, an innovative method for coastal protection in the Netherlands. The Sand Motor is a huge volume of sand that has been applied along the coast to be spread naturally by wind, waves and currents. Different research disciplines are involved concerned with: weather, waves and currents, sand distribution, water table and water quality, flora and fauna, recreation and management. Researchers share and transform their data in the OpenEarth DataLab, that makes it possible to combine their data and to see influence of different aspects of the coastal protection on their models. During the project the data are available only for the researchers involved. After the project a large part of the data and scripts will be published with DOI in the Data Archive of 3TU.Datacentrum for reuse in new research. For the 83 project members of the Sand Motor, the OpenEarth DataLab is available on www.zandmotordata.nl. The OpenEarth DataLab not only saves time and increases quality, but has the potential to open new frontiers for exploring cross-domain analysis and visualisations, revealing new scientific insights.

  2. Towards Full-Waveform Ambient Noise Inversion

    NASA Astrophysics Data System (ADS)

    Sager, K.; Ermert, L. A.; Boehm, C.; Fichtner, A.

    2016-12-01

    Noise tomography usually works under the assumption that the inter-station ambient noise correlation is equal to a scaled version of the Green function between the two receivers. This assumption, however, is only met under specific conditions, e.g. wavefield diffusivity and equipartitioning, or the isotropic distribution of both mono- and dipolar uncorrelated noise sources. These assumptions are typically not satisfied in the Earth. This inconsistency inhibits the exploitation of the full waveform information contained in noise correlations in order to constrain Earth structure and noise generation. To overcome this limitation, we attempt to develop a method that consistently accounts for the distribution of noise sources, 3D heterogeneous Earth structure and the full seismic wave propagation physics. This is intended to improve the resolution of tomographic images, to refine noise source location, and thereby to contribute to a better understanding of noise generation. We introduce an operator-based formulation for the computation of correlation functions and apply the continuous adjoint method that allows us to compute first and second derivatives of misfit functionals with respect to source distribution and Earth structure efficiently. Based on these developments we design an inversion scheme using a 2D finite-difference code. To enable a joint inversion for noise sources and Earth structure, we investigate the following aspects: The capability of different misfit functionals to image wave speed anomalies and source distribution. Possible source-structure trade-offs, especially to what extent unresolvable structure can be mapped into the inverted noise source distribution and vice versa. In anticipation of real-data applications, we present an extension of the open-source waveform modelling and inversion package Salvus, which allows us to compute correlation functions in 3D media with heterogeneous noise sources at the surface.
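
    In the notation common to the noise-interferometry literature (our illustrative notation, not necessarily the authors' exact formulation), the forward map at the heart of the operator-based formulation is the correlation wavefield

        C(\mathbf{x}_1, \mathbf{x}_2, \omega) =
            \int_{\partial\oplus} G(\mathbf{x}_1, \boldsymbol{\xi}, \omega)\,
            G^{*}(\mathbf{x}_2, \boldsymbol{\xi}, \omega)\,
            S(\boldsymbol{\xi}, \omega)\, \mathrm{d}\boldsymbol{\xi}

    where G is the Green function, S is the power-spectral-density distribution of the noise sources over the surface, and * denotes complex conjugation. First and second derivatives of a misfit with respect to S and to Earth structure then follow from applying the adjoint method to this map.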

  3. Web Monitoring of EOS Front-End Ground Operations, Science Downlinks and Level 0 Processing

    NASA Technical Reports Server (NTRS)

    Cordier, Guy R.; Wilkinson, Chris; McLemore, Bruce

    2008-01-01

    This paper addresses the efforts undertaken and the technology deployed to aggregate and distribute the metadata characterizing the real-time operations associated with NASA Earth Observing Systems (EOS) high-rate front-end systems and the science data collected at multiple ground stations and forwarded to the Goddard Space Flight Center for level 0 processing. Station operators, mission project management personnel, spacecraft flight operations personnel and data end-users for various EOS missions can retrieve the information at any time from any location having access to the internet. The users are distributed and the EOS systems are distributed but the centralized metadata accessed via an external web server provide an effective global and detailed view of the enterprise-wide events as they are happening. The data-driven architecture and the implementation of applied middleware technology, open source database, open source monitoring tools, and external web server converge nicely to fulfill the various needs of the enterprise. The timeliness and content of the information provided are key to making timely and correct decisions which reduce project risk and enhance overall customer satisfaction. The authors discuss security measures employed to limit access of data to authorized users only.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oloff, L.-P., E-mail: oloff@physik.uni-kiel.de; Hanff, K.; Stange, A.

With the advent of ultrashort-pulsed extreme ultraviolet sources, such as free-electron lasers or high-harmonic-generation (HHG) sources, a new research field for photoelectron spectroscopy has opened up in terms of femtosecond time-resolved pump-probe experiments. The impact of the high peak brilliance of these novel sources on photoemission spectra, so-called vacuum space-charge effects caused by the Coulomb interaction among the photoemitted probe electrons, has been studied extensively. However, possible distortions of the energy and momentum distributions of the probe photoelectrons caused by the low photon energy pump pulse due to the nonlinear emission of electrons have not been studied in detail yet. Here, we systematically investigate these pump laser-induced space-charge effects in a HHG-based experiment for the test case of highly oriented pyrolytic graphite. Specifically, we determine how the key parameters of the pump pulse—the excitation density, wavelength, spot size, and emitted electron energy distribution—affect the measured time-dependent energy and momentum distributions of the probe photoelectrons. The results are well reproduced by a simple mean-field model, which could open a path for the correction of pump laser-induced space-charge effects and thus toward probing ultrafast electron dynamics in strongly excited materials.

  5. Erratum: Erratum to: Realistic interpretation of quantum mechanics and encounter-delayed-choice experiment

    NASA Astrophysics Data System (ADS)

    Long, GuiLu; Qin, Wei; Yang, Zhe; Li, Jun-Lin

    2018-06-01

    The article Realistic interpretation of quantum mechanics and encounter-delayed-choice experiment, written by GuiLu Long, Wei Qin, Zhe Yang, and Jun-Lin Li, was originally published online without open access. After publication in volume 61, issue 3: 030311 the author decided to opt for Open Choice and to make the article an open access publication. Therefore, the copyright of the article has been changed to The Author(s) 2017 and the article is forthwith distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, duplication, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The original article has been corrected.

  6. THE BERKELEY DATA ANALYSIS SYSTEM (BDAS): AN OPEN SOURCE PLATFORM FOR BIG DATA ANALYTICS

    DTIC Science & Technology

    2017-09-01

From the report's cited references: Evan Sparks, Oliver Zahn, Michael J. Franklin, David A. Patterson, Saul Perlmutter, "Scientific Computing Meets Big Data Technology: An Astronomy ..."; "Processing Astronomy Imagery Using Big Data Technology," IEEE Transactions on Big Data, 2016. Approved for Public Release; Distribution Unlimited.

  7. 20180318 - Prediction Of pKa From Chemical Structure Using Free And Open-Source Tools (ACS Spring)

    EPA Science Inventory

    The ionization state of a chemical, reflected in pKa values, affects lipophilicity, solubility, protein binding and the ability of a chemical to cross the plasma membrane. These properties govern the pharmacokinetic parameters such as absorption, distribution, metabolism, excreti...

  8. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

The increasing availability of Cloud resources is arising as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to access resources from both systems efficiently. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing the creation of virtual clusters on demand, including public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, allowing the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, permitting other Virtual Machine sources and operating systems to be used. In both cases, CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach with the existing Grid solution. License Notice: Published under licence in Journal of Physics: Conference Series by IOP Publishing Ltd.

  9. HELIOS-R: An Ultrafast, Open-Source Retrieval Code For Exoplanetary Atmosphere Characterization

    NASA Astrophysics Data System (ADS)

    LAVIE, Baptiste

    2015-12-01

Atmospheric retrieval is a growing, new approach in the theory of exoplanet atmosphere characterization. Unlike self-consistent modeling, it allows us to fully explore the parameter space, as well as the degeneracies between the parameters, using a Bayesian framework. We present HELIOS-R, a very fast retrieval code written in Python and optimized for GPU computation. Once it is ready, HELIOS-R will be the first open-source atmospheric retrieval code accessible to the exoplanet community. As the new generation of direct imaging instruments (SPHERE, GPI) has started to gather data, the first version of HELIOS-R focuses on emission spectra. We use a 1D two-stream forward model for computing fluxes and couple it to an analytical temperature-pressure profile that is constructed to be in radiative equilibrium. We use our ultra-fast opacity calculator HELIOS-K (also open-source) to compute the opacities of CO2, H2O, CO and CH4 from the HITEMP database. We test both opacity sampling (which is typically used by other workers) and the method of k-distributions. Using this setup, we compute a grid of synthetic spectra and temperature-pressure profiles, which is then explored using a nested sampling algorithm. By focusing on model selection (Occam’s razor) through the explicit computation of the Bayesian evidence, nested sampling allows us to deal with current sparse data as well as upcoming high-resolution observations. Once the best model is selected, HELIOS-R provides posterior distributions of the parameters. As a test for our code we studied the HR 8799 system and compared our results with the previous analysis of Lee, Heng & Irwin (2013), which used the proprietary NEMESIS retrieval code. HELIOS-R and HELIOS-K are part of the set of open-source community codes we named the Exoclimes Simulation Platform (www.exoclime.org).
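
    For concreteness (notation ours, not from the abstract): nested sampling targets the Bayesian evidence of a model M,

        Z = p(\mathbf{d} \mid M)
          = \int \mathcal{L}(\mathbf{d} \mid \boldsymbol{\theta}, M)\,
                 \pi(\boldsymbol{\theta} \mid M)\, \mathrm{d}\boldsymbol{\theta}

    and two atmospheric models are then compared through their posterior odds, p(M_1|d)/p(M_2|d) = (Z_1/Z_2) p(M_1)/p(M_2), which automatically penalizes needlessly complex models (Occam's razor), while the same run yields the posterior distributions of the parameters.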

  10. Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets.

    PubMed

    Clark, Alex M; Dole, Krishna; Coulon-Spektor, Anna; McNutt, Andrew; Grass, George; Freundlich, Joel S; Reynolds, Robert C; Ekins, Sean

    2015-06-22

On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade, which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operating characteristic (ROC) curve values comparable to those generated previously in prior publications using alternative tools. We have now described how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user's own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting these models in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery.
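
    A minimal sketch of the modeling recipe, FCFP6-style fingerprints plus a Bernoulli naive Bayes classifier, is given below using RDKit and scikit-learn as stand-ins for the CDK implementation the paper describes; the molecules and activity labels are toy data.

        # FCFP6-like fingerprints (feature-typed Morgan, radius 3) feeding
        # a Bernoulli naive Bayes classifier, analogous in spirit to the
        # CDK-based Bayesian models described above.
        from rdkit import Chem
        from rdkit.Chem import AllChem
        from sklearn.naive_bayes import BernoulliNB
        import numpy as np

        def fcfp6_bits(smiles, n_bits=2048):
            """Feature Morgan fingerprint of radius 3 (~FCFP6) as a bit array."""
            mol = Chem.MolFromSmiles(smiles)
            fp = AllChem.GetMorganFingerprintAsBitVect(
                mol, 3, nBits=n_bits, useFeatures=True)
            return np.array(fp)

        smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCCCCCCCCC"]
        labels = [1, 1, 0, 0]            # toy "active"/"inactive" labels

        X = np.vstack([fcfp6_bits(s) for s in smiles])
        model = BernoulliNB().fit(X, labels)
        print(model.predict_proba(fcfp6_bits("CCN").reshape(1, -1)))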

  11. Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets

    PubMed Central

    2015-01-01

On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade, which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operating characteristic (ROC) curve values comparable to those generated previously in prior publications using alternative tools. We have now described how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user’s own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting these models in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery. PMID:25994950

  12. A Tour of Big Data, Open Source Data Management Technologies from the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.

    2012-12-01

The Apache Software Foundation, a non-profit foundation charged with the dissemination of open source software for the public good, provides a suite of data management technologies for distributed archiving, data ingestion, data dissemination, processing, triage and a host of other functions that are becoming critical in the Big Data regime. Apache is the world's largest open source software organization, boasting over 3000 developers from around the world contributing to some of the most pervasive technologies in use today, from the HTTPD web server that powers a majority of Internet web sites to the Hadoop technology that is now projected to be a $1B industry. Apache data management technologies are emerging as de facto off-the-shelf components for searching, distributing, processing and archiving key science data sets, from the geophysical, space and planetary domains all the way to biomedicine. In this talk, I will give a virtual tour of the Apache Software Foundation, its meritocracy and governance structure, and its key big data technologies that organizations can take advantage of today to save cost, schedule, and resources in implementing their Big Data needs. I'll illustrate the Apache technologies in the context of several national priority projects, including the U.S. National Climate Assessment (NCA) and the International Square Kilometre Array (SKA) project, which are stretching the boundaries of volume, velocity, complexity, and other key Big Data dimensions.

  13. Adaptive and Collaborative Exploitation of 3 Dimensional Environmental Acoustics in Distributed Undersea Networks

    DTIC Science & Technology

    2015-09-30

The experiment was conducted in Broad Sound of Massachusetts Bay using the AUV Unicorn, a 147 dB omnidirectional Lubell source, and an open-ended steel pipe target. The steel pipe target (Figure C) was dropped at an approximate local coordinate position of (x,y) = (170,155). The location was estimated using the ship's position when the target was dropped, but was only accurate to within 10-15 m. The orientation of the target was unknown. Figure C: open-ended steel pipe target.

14. An Open-source Meteorological Operational System and its Installation in Portuguese-speaking Countries

    NASA Astrophysics Data System (ADS)

    Almeida, W. G.; Ferreira, A. L.; Mendes, M. V.; Ribeiro, A.; Yoksas, T.

    2007-05-01

CPTEC, a division of Brazil’s INPE, has been using several open-source software packages for a variety of tasks in its Data Division. Among these tools are ones traditionally used in research and educational communities, such as GrADS (Grid Analysis and Display System from the Center for Ocean-Land-Atmosphere Studies (COLA)), the Local Data Manager (LDM) and GEMPAK (from Unidata), and operational tools such as the Automatic File Distributor (AFD) that are popular among National Meteorological Services. In addition, some tools developed locally at CPTEC are also being made available as open-source packages. One package is being used to manage the data from Automatic Weather Stations that INPE operates. This system uses only open-source tools such as the MySQL database, Perl scripts and Java programs for web access, and Unidata’s Internet Data Distribution (IDD) system and AFD for data delivery. All of these packages are bundled into a low-cost, easy-to-install package called the Meteorological Data Operational System. Recently, in cooperation with the SICLIMAD project, this system has been modified for use by Portuguese-speaking countries in Africa to manage data from the many Automatic Weather Stations being installed in these countries under SICLIMAD sponsorship. In this presentation we describe the tools included in, and the architecture of, the Meteorological Data Operational System.

  15. QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials.

    PubMed

    Giannozzi, Paolo; Baroni, Stefano; Bonini, Nicola; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Chiarotti, Guido L; Cococcioni, Matteo; Dabo, Ismaila; Dal Corso, Andrea; de Gironcoli, Stefano; Fabris, Stefano; Fratesi, Guido; Gebauer, Ralph; Gerstmann, Uwe; Gougoussis, Christos; Kokalj, Anton; Lazzeri, Michele; Martin-Samos, Layla; Marzari, Nicola; Mauri, Francesco; Mazzarello, Riccardo; Paolini, Stefano; Pasquarello, Alfredo; Paulatto, Lorenzo; Sbraccia, Carlo; Scandolo, Sandro; Sclauzero, Gabriele; Seitsonen, Ari P; Smogunov, Alexander; Umari, Paolo; Wentzcovitch, Renata M

    2009-09-30

    QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.

  16. Design and implementation of an open source indexing solution for a large set of radiological reports and images.

    PubMed

    Voet, T; Devolder, P; Pynoo, B; Vercruysse, J; Duyck, P

    2007-11-01

This paper shares the insights we gained while designing, building, and running an indexing solution for a large set of radiological reports and images in a production environment for more than 3 years. Several technical challenges were encountered and solved in the course of this project. One hundred four million words in 1.8 million radiological reports from 1989 to the present were indexed and became instantaneously searchable in a user-friendly fashion; the median query duration is only 31 ms. Currently, our highly tuned index holds 332,088 unique words in four languages. The indexing system is feature-rich and language-independent and allows for making complex queries. For research and training purposes it certainly is a valuable and convenient addition to our radiology informatics toolbox. Extended use of open-source technology dramatically reduced both implementation time and cost. All software we developed related to the indexing project has been made available to the open-source community under an unrestricted Berkeley Software Distribution-style license.
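
    The data structure at the heart of any such system is the inverted index. The toy Python sketch below illustrates the idea; the production system described above used a tuned open-source index, not this code.

        # Minimal inverted index: each word maps to the set of report ids
        # containing it, so multi-word queries reduce to set intersection.
        from collections import defaultdict
        import re

        class InvertedIndex:
            def __init__(self):
                self.postings = defaultdict(set)   # word -> set of report ids

            def add(self, report_id, text):
                for word in re.findall(r"\w+", text.lower()):
                    self.postings[word].add(report_id)

            def query(self, *words):
                """Report ids containing all query words (AND semantics)."""
                sets = [self.postings.get(w.lower(), set()) for w in words]
                return set.intersection(*sets) if sets else set()

        idx = InvertedIndex()
        idx.add(1, "Nodular opacity in the right upper lobe.")
        idx.add(2, "No focal opacity; heart size normal.")
        print(idx.query("opacity", "lobe"))   # -> {1}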

  17. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    PubMed

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., chemistry development kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of Smiles Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predicted value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.
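
    For reference, the statistics quoted above are the standard confusion-matrix measures. With TP, TN, FP and FN the counts of true/false positives/negatives, p_o the observed agreement and p_e the agreement expected by chance:

        \mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad
        \mathrm{specificity} = \frac{TN}{TN + FP}, \qquad
        \mathrm{PPV} = \frac{TP}{TP + FP}, \qquad
        \kappa = \frac{p_o - p_e}{1 - p_e}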

  18. Web-client based distributed generalization and geoprocessing

    USGS Publications Warehouse

    Wolf, E.B.; Howe, K.

    2009-01-01

Generalization and geoprocessing operations on geospatial information were once the domain of complex software running on high-performance workstations. Currently, these computationally intensive processes are the domain of desktop applications. Recent efforts have been made to move geoprocessing operations server-side in a distributed, web accessible environment. This paper initiates research into portable client-side generalization and geoprocessing operations as part of a larger effort in user-centered design for the US Geological Survey's The National Map. An implementation of the Ramer-Douglas-Peucker (RDP) line simplification algorithm was created in the open source OpenLayers geoweb client. This algorithm implementation was benchmarked using differing data structures and browser platforms. The implementation and results of the benchmarks are discussed in the general context of client-side geoprocessing.
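
    For readers unfamiliar with the algorithm, a compact reference implementation of Ramer-Douglas-Peucker is sketched below in Python; the study's version was JavaScript inside OpenLayers.

        # Ramer-Douglas-Peucker line simplification: recursively keep the
        # vertex farthest from the current chord if it exceeds epsilon.
        import math

        def _point_line_distance(p, a, b):
            """Perpendicular distance from point p to the line through a and b."""
            (px, py), (ax, ay), (bx, by) = p, a, b
            if (ax, ay) == (bx, by):
                return math.hypot(px - ax, py - ay)
            return abs((bx - ax) * (ay - py) - (ax - px) * (by - ay)) / \
                math.hypot(bx - ax, by - ay)

        def rdp(points, epsilon):
            """Simplify a polyline, keeping vertices farther than epsilon."""
            dmax, index = 0.0, 0
            for i in range(1, len(points) - 1):
                d = _point_line_distance(points[i], points[0], points[-1])
                if d > dmax:
                    dmax, index = d, i
            if dmax > epsilon:   # recurse on both halves around that vertex
                return rdp(points[: index + 1], epsilon)[:-1] + \
                       rdp(points[index:], epsilon)
            return [points[0], points[-1]]

        print(rdp([(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7)], 1.0))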

  19. pyNS: an open-source framework for 0D haemodynamic modelling.

    PubMed

    Manini, Simone; Antiga, Luca; Botti, Lorenzo; Remuzzi, Andrea

    2015-06-01

A number of computational approaches have been proposed for the simulation of haemodynamics and vascular wall dynamics in complex vascular networks. Among them, 0D pulse wave propagation methods allow efficient modeling of flow and pressure distributions and wall displacements throughout vascular networks at low computational cost. Although several techniques are documented in the literature, the availability of open-source computational tools is still limited. We here present python Network Solver, a modular solver framework for 0D problems released under a BSD license as part of the archToolkit ( http://archtk.github.com ). As an application, we describe patient-specific models of the systemic circulation and of the detailed upper extremity for use in the prediction of maturation after surgical creation of vascular access for haemodialysis.
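
    The building block of such 0D methods is the lumped pressure-flow element. The sketch below integrates a two-element Windkessel, C dP/dt = Q(t) - P/R, with forward Euler; it illustrates the class of model that 0D networks assemble, not pyNS's actual API.

        # Two-element Windkessel: arterial compliance C fed by inflow Q(t)
        # and drained through peripheral resistance R.
        import numpy as np

        def windkessel(q_in, R=1.0, C=1.5, p0=80.0, dt=1e-3):
            """Integrate C dP/dt = Q(t) - P/R with forward Euler.
            Units are arbitrary but consistent (e.g. mmHg, mL/s, s)."""
            p = np.empty(len(q_in))
            p[0] = p0
            for i in range(1, len(q_in)):
                dpdt = (q_in[i - 1] - p[i - 1] / R) / C
                p[i] = p[i - 1] + dt * dpdt
            return p

        t = np.arange(0.0, 5.0, 1e-3)   # five 1-second beats
        q = np.where((t % 1.0) < 0.3,   # systolic ejection for 0.3 s per beat
                     400.0 * np.sin(np.pi * (t % 1.0) / 0.3), 0.0)
        p = windkessel(q)
        print(f"pressure range: {p.min():.0f}-{p.max():.0f}")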

  20. Using multi-date satellite imagery to monitor invasive grass species distribution in post-wildfire landscapes: An iterative, adaptable approach that employs open-source data and software

    USGS Publications Warehouse

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Kumar, Sunil; Swallow, Aaron; Luizza, Matthew; Chignell, Steve

    2017-01-01

    Among the most pressing concerns of land managers in post-wildfire landscapes are the establishment and spread of invasive species. Land managers need accurate maps of invasive species cover for targeted management post-disturbance that are easily transferable across space and time. In this study, we sought to develop an iterative, replicable methodology based on limited invasive species occurrence data, freely available remotely sensed data, and open source software to predict the distribution of Bromus tectorum (cheatgrass) in a post-wildfire landscape. We developed four species distribution models using eight spectral indices derived from five months of Landsat 8 Operational Land Imager (OLI) data in 2014. These months corresponded to both cheatgrass growing period and time of field data collection in the study area. The four models were improved using an iterative approach in which a threshold for cover was established, and all models had high sensitivity values when tested on an independent dataset. We also quantified the area at highest risk for invasion in future seasons given 2014 distribution, topographic covariates, and seed dispersal limitations. These models demonstrate the effectiveness of using derived multi-date spectral indices as proxies for species occurrence on the landscape, the importance of selecting thresholds for invasive species cover to evaluate ecological risk in species distribution models, and the applicability of Landsat 8 OLI and the Software for Assisted Habitat Modeling for targeted invasive species management.
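
    A compressed sketch of the modeling chain (spectral index, presence/absence model, threshold selection) is given below with synthetic arrays standing in for Landsat 8 OLI imagery and field plots; the threshold criterion used here (Youden's J) is our illustrative choice, not necessarily the authors'.

        # Derive an index from OLI bands, fit a presence model, pick a threshold.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        def ndvi(nir, red):
            """Normalized difference vegetation index from OLI bands 5 and 4."""
            return (nir - red) / (nir + red + 1e-9)

        # synthetic reflectance for 200 plots; presence loosely tracks NDVI
        nir, red = rng.uniform(0.1, 0.6, 200), rng.uniform(0.05, 0.4, 200)
        x = ndvi(nir, red).reshape(-1, 1)
        presence = (x.ravel() + rng.normal(0, 0.1, 200)) > 0.2

        model = LogisticRegression().fit(x, presence)
        prob = model.predict_proba(x)[:, 1]

        def youden(th):
            """Sensitivity + specificity - 1 at probability threshold th."""
            pred = prob > th
            sens = (pred & presence).sum() / presence.sum()
            spec = (~pred & ~presence).sum() / (~presence).sum()
            return sens + spec - 1

        print("best threshold:", max(np.linspace(0.1, 0.9, 17), key=youden))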

  1. Using multi-date satellite imagery to monitor invasive grass species distribution in post-wildfire landscapes: An iterative, adaptable approach that employs open-source data and software

    NASA Astrophysics Data System (ADS)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Kumar, Sunil; Swallow, Aaron; Luizza, Matthew W.; Chignell, Stephen M.

    2017-07-01

    Among the most pressing concerns of land managers in post-wildfire landscapes are the establishment and spread of invasive species. Land managers need accurate maps of invasive species cover for targeted management post-disturbance that are easily transferable across space and time. In this study, we sought to develop an iterative, replicable methodology based on limited invasive species occurrence data, freely available remotely sensed data, and open source software to predict the distribution of Bromus tectorum (cheatgrass) in a post-wildfire landscape. We developed four species distribution models using eight spectral indices derived from five months of Landsat 8 Operational Land Imager (OLI) data in 2014. These months corresponded to both cheatgrass growing period and time of field data collection in the study area. The four models were improved using an iterative approach in which a threshold for cover was established, and all models had high sensitivity values when tested on an independent dataset. We also quantified the area at highest risk for invasion in future seasons given 2014 distribution, topographic covariates, and seed dispersal limitations. These models demonstrate the effectiveness of using derived multi-date spectral indices as proxies for species occurrence on the landscape, the importance of selecting thresholds for invasive species cover to evaluate ecological risk in species distribution models, and the applicability of Landsat 8 OLI and the Software for Assisted Habitat Modeling for targeted invasive species management.

  2. Diminishing detonator effectiveness through electromagnetic effects

    DOEpatents

    Schill, Jr, Robert A.

    2016-09-20

An inductively coupled transmission line with a distributed electromotive force source and an alternative coupling model based on empirical data and theory were developed to initiate bridge wire melt for a detonator with an open-circuit and a short-circuit detonator load. In the latter technique, the model was developed to exploit incomplete knowledge of the open-circuited detonator using tendencies common to all of the open-circuit loads examined. Military, commercial, and improvised detonators were examined and modeled. Nichrome, copper, platinum, and tungsten are the detonator-specific bridge wire materials studied. The improvised detonators were typically made with tungsten wire and copper wire (~40 AWG wire strands).

  3. XaNSoNS: GPU-accelerated simulator of diffraction patterns of nanoparticles

    NASA Astrophysics Data System (ADS)

    Neverov, V. S.

XaNSoNS is an open source software package with GPU support, which simulates X-ray and neutron 1D (or 2D) diffraction patterns and pair-distribution functions (PDF) for amorphous or crystalline nanoparticles (up to ∼10⁷ atoms) of heterogeneous structural content. Among the multiple parameters of the structure the user may specify atomic displacements, site occupancies, molecular displacements and molecular rotations. The software uses general equations nonspecific to crystalline structures to calculate the scattering intensity. It supports four major standards of parallel computing: MPI, OpenMP, Nvidia CUDA and OpenCL, enabling it to run on various architectures, from CPU-based HPCs to consumer-level GPUs.
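
    The "general equations nonspecific to crystalline structures" are, in essence, the Debye scattering equation; for N atoms with form factors f_i(q) and pair distances r_ij, the orientation-averaged intensity is

        I(q) = \sum_{i=1}^{N} \sum_{j=1}^{N} f_i(q)\, f_j(q)\,
               \frac{\sin(q\, r_{ij})}{q\, r_{ij}}

    The O(N²) double sum over atom pairs is what makes GPU parallelization attractive at the quoted sizes of up to ∼10⁷ atoms.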

  4. Importance of vesicle release stochasticity in neuro-spike communication.

    PubMed

    Ramezani, Hamideh; Akan, Ozgur B

    2017-07-01

The aim of this paper is to propose a stochastic model for the vesicle release process, a part of neuro-spike communication. To this end, we study the biological events occurring in this process and use microphysiological simulations to observe the functionality of these events. Since the most important source of variability in vesicle release probability is the opening of voltage-dependent calcium channels (VDCCs), followed by the influx of calcium ions through these channels, we propose a stochastic model for this event, while using a deterministic model for the other variability sources. To capture the stochasticity of the calcium influx to the pre-synaptic neuron in our model, we study its statistics and find that it can be modeled by a distribution defined in terms of the Normal and Logistic distributions.

  5. Collaboration using open standards and open source software (examples of DIAS/CEOS Water Portal)

    NASA Astrophysics Data System (ADS)

    Miura, S.; Sekioka, S.; Kuroiwa, K.; Kudo, Y.

    2015-12-01

The DIAS/CEOS Water Portal is a part of the DIAS (Data Integration and Analysis System, http://www.editoria.u-tokyo.ac.jp/projects/dias/?locale=en_US) systems for data distribution to users including, but not limited to, scientists, decision makers and officers such as river administrators. One of the functions of this portal is to enable one-stop search of, and access to, various water-related data archived at multiple data centers located all over the world. This portal itself does not store data. Instead, according to requests made by users on the web page, it retrieves data from distributed data centers on the fly and lets users download and view rendered images/plots. Our system mainly relies on the open source software GI-cat (http://essi-lab.eu/do/view/GIcat) and open standards such as OGC CSW, OpenSearch and the OPeNDAP protocol to enable the above functions. Details on how it works will be presented during the talk. Although some data centers have unique metadata formats and/or data search protocols, our portal's brokering function enables users to search across various data centers at one time. This portal is also connected to other data brokering systems, including the GEOSS DAB (Discovery and Access Broker). As a result, users can search over thousands of datasets and millions of files at one time. Users can access the DIAS/CEOS Water Portal system at http://waterportal.ceos.org/.

  6. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

Presently, most existing software is desktop-based, designed to work on a single computer, which is a major limitation in many ways: limited processing power, storage, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaborative geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computer environment, creates a real-time multi-user collaboration platform, has interoperable code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), 3) user management. The web services run on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application over additional VMs and testing the scalability and availability of services.

  7. Building an Open Source Framework for Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Jagers, B.; Meijers, E.; Villars, M.

    2015-12-01

    In order to develop effective strategies and associated policies for environmental management, we need to understand the dynamics of the natural system as a whole and the human role therein. This understanding is gained by comparing our mental model of the world with observations from the field. To properly understand the system, however, we should look at the dynamics of water, sediments, water quality, and ecology throughout the whole system, from catchment to coast, both at the surface and in the subsurface. Numerical models are indispensable in helping us understand the interactions of the overall system, but we need to be able to update and adjust them to improve our understanding and test our hypotheses. To support researchers around the world with this challenging task, we began developing the open source modeling environment DeltaShell a few years ago. It integrates distributed hydrological models with 1D, 2D, and 3D hydraulic models, including generic components for tracking sediment, water quality, and ecological quantities throughout the hydrological cycle. The open source approach, combined with a modular design based on open standards that allows for easy adjustment and expansion as demands and knowledge grow, provides an ideal starting point for addressing challenging integrated environmental questions.

  8. Open-source algorithm for detecting sea ice surface features in high-resolution optical imagery

    NASA Astrophysics Data System (ADS)

    Wright, Nicholas C.; Polashenski, Chris M.

    2018-04-01

    Snow, ice, and melt ponds cover the surface of the Arctic Ocean in fractions that change throughout the seasons. These surfaces control albedo and exert tremendous influence over the energy balance in the Arctic. Increasingly available meter- to decimeter-scale resolution optical imagery captures the evolution of the ice and ocean surface state visually, but methods for quantifying coverage of key surface types from raw imagery are not yet well established. Here we present an open-source system designed to provide a standardized, automated, and reproducible technique for processing optical imagery of sea ice. The method classifies surface coverage into three main categories: snow and bare ice, melt ponds and submerged ice, and open water. The method is demonstrated on imagery from four sensor platforms and on imagery spanning from spring thaw to fall freeze-up. Tests show the classification accuracy of this method typically exceeds 96 %. To facilitate scientific use, we evaluate the minimum observation area required for reporting a representative sample of surface coverage. We provide an open-source distribution of this algorithm and associated training datasets and suggest the community consider this a step towards standardizing optical sea ice imagery processing. We hope to encourage future collaborative efforts to improve the code base and to analyze large datasets of optical sea ice imagery.
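
    As a toy illustration of the three-class surface partition described above: the published algorithm trains a classifier rather than applying fixed brightness thresholds, so the cutoff values below are invented purely for demonstration.

```python
import numpy as np

def classify_surface(gray):
    """Label each pixel of a grayscale image (values in [0, 1]) with one of
    the three surface classes; the thresholds here are illustrative only."""
    labels = np.empty(gray.shape, dtype="U12")
    labels[gray >= 0.7] = "snow/ice"
    labels[(gray >= 0.35) & (gray < 0.7)] = "melt pond"
    labels[gray < 0.35] = "open water"
    return labels

img = np.random.rand(256, 256)           # stand-in for a real image tile
labels = classify_surface(img)
for cls in ("snow/ice", "melt pond", "open water"):
    print(cls, f"{np.mean(labels == cls):.1%}")  # fractional coverage
```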

  9. Embracing Open Software Development in Solar Physics

    NASA Astrophysics Data System (ADS)

    Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.

    2012-12-01

    We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool with online and Java interfaces inspired by Google Maps (tm). It allows users to find solar features and events of interest and download the corresponding data. Having found data of interest, the user then has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time SSW was initially developed, many of the best software development practices of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and they have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library that seeks to create a unified solar data analysis environment, including a number of core datatypes such as Maps, Lightcurves, and Spectra with consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We discuss the development of both efforts and how they are beginning to influence the solar physics community.
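
    For a taste of the unified interface SunPy aims for, the snippet below loads a bundled sample image into the Map datatype. It uses the present-day SunPy API, which has evolved considerably since this 2012 abstract.

```python
import sunpy.map
import sunpy.data.sample  # downloads the sample data on first use

# Load an SDO/AIA 171 Angstrom sample image into SunPy's Map datatype.
aia = sunpy.map.Map(sunpy.data.sample.AIA_171_IMAGE)

print(aia.date, aia.wavelength)  # observation time and passband
aia.peek()                       # quick-look plot with solar coordinates
```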

  10. Closing gaps between open software and public data in a hackathon setting: User-centered software prototyping.

    PubMed

    Busby, Ben; Lesko, Matthew; Federer, Lisa

    2016-01-01

    In genomics, bioinformatics and other areas of data science, gaps exist between extant public datasets and the open-source software tools built by the community to analyze similar data types.  The purpose of biological data science hackathons is to assemble groups of genomics or bioinformatics professionals and software developers to rapidly prototype software to address these gaps.  The only two rules for the NCBI-assisted hackathons run so far are that 1) data either must be housed in public data repositories or be deposited to such repositories shortly after the hackathon's conclusion, and 2) all software comprising the final pipeline must be open-source or open-use.  Proposed topics, as well as suggested tools and approaches, are distributed to participants at the beginning of each hackathon and refined during the event.  Software, scripts, and pipelines are developed and published on GitHub, a web service providing publicly available, free-usage tiers for collaborative software development. The code resulting from each hackathon is published at https://github.com/NCBI-Hackathons/ with separate directories or repositories for each team.

  11. Numerical simulation of the SOFIA flowfield

    NASA Technical Reports Server (NTRS)

    Klotz, Stephen P.

    1994-01-01

    This report provides a concise summary of the contribution of computational fluid dynamics (CFD) to the SOFIA (Stratospheric Observatory for Infrared Astronomy) project at NASA Ames and presents results obtained from closed- and open-cavity SOFIA simulations. The aircraft platform is a Boeing 747SP and these are the first SOFIA simulations run with the aircraft empennage included in the geometry database. In the open-cavity run the telescope is mounted behind the wings. Results suggest that the cavity markedly influences the mean pressure distribution on empennage surfaces and that 110-140 decibel (dB) sound pressure levels are typical in the cavity and on the horizontal and vertical stabilizers. A strong source of sound was found to exist on the rim of the open telescope cavity. The presence of this source suggests that additional design work needs to be performed in order to minimize the sound emanating from that location. A fluid dynamic analysis of the engine plumes is also contained in this report. The analysis was part of an effort to quantify the degradation of telescope performance resulting from the proximity of the port engine exhaust plumes to the open telescope bay.

  12. Deterministic Design Optimization of Structures in OpenMDAO Framework

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process of analyzing and optimizing structural components using the framework's structural solvers and several gradient-based optimizers, along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a code developed at NASA GRC. The reliability and efficiency of the OpenMDAO framework are compared and reported here.
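
    For flavor, here is a minimal optimization in present-day OpenMDAO; the API shown postdates the version used in this 2012 study, and the paraboloid is the framework's standard toy problem rather than one of the structural components analyzed in the paper.

```python
import openmdao.api as om

# Minimize a paraboloid f(x, y) -- OpenMDAO's standard toy problem.
prob = om.Problem()
prob.model.add_subsystem(
    "parab", om.ExecComp("f = (x - 3.0)**2 + x*y + (y + 4.0)**2 - 3.0"),
    promotes=["*"],
)

prob.driver = om.ScipyOptimizeDriver()
prob.driver.options["optimizer"] = "SLSQP"

prob.model.add_design_var("x", lower=-50.0, upper=50.0)
prob.model.add_design_var("y", lower=-50.0, upper=50.0)
prob.model.add_objective("f")

prob.setup()
prob.run_driver()
print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))
```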

  13. OpenDA-WFLOW framework for improving hydrologic predictions using distributed hydrologic models

    NASA Astrophysics Data System (ADS)

    Weerts, Albrecht; Schellekens, Jaap; Kockx, Arno; Hummel, Stef

    2017-04-01

    Data assimilation (DA) holds considerable potential for improving hydrologic predictions (Liu et al., 2012), increasing the potential for early warning and/or smart water management. However, advances in hydrologic DA research have not yet been adequately or promptly implemented in operational forecast systems to improve the skill of forecasts for better informed real-world decision making. The objective of this work is to highlight the development of a generic linkage between the open source OpenDA package and the open source community hydrologic modeling framework OpenStreams/WFLOW, and its application to operational hydrological forecasting at various spatial scales. The coupling between OpenDA and the OpenStreams/WFLOW framework is based on the emerging Basic Model Interface (BMI) standard advocated by CSDMS, using cross-platform web services (i.e., Apache Thrift) developed by Hut et al. (2016). The potential of OpenDA-WFLOW for operational hydrologic forecasting, including its integration with Delft-FEWS (used by more than 40 operational forecast centers around the world; Werner et al., 2013), is demonstrated by the presented case studies. We will also highlight the possibility of giving real-time insight into the workings of the applied DA methods to support the forecaster, identified as one of the burning issues by Liu et al. (2012).
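
    The BMI coupling mentioned above reduces to a small set of standard methods that a model exposes to the assimilation framework. Below is a minimal sketch of that pattern; the toy reservoir model is invented, but the method names follow the BMI standard.

```python
class ToyBmiModel:
    """Toy linear-reservoir model exposing BMI-style methods; the hydrology
    is invented, but initialize/update/get_value/set_value follow BMI."""

    def initialize(self, config_file):
        self.t = 0.0
        self.storage = 10.0   # catchment storage [mm]

    def update(self):
        self.storage *= 0.9   # simple recession step
        self.t += 1.0

    def get_value(self, name):
        assert name == "storage"
        return self.storage

    def set_value(self, name, value):
        assert name == "storage"
        self.storage = value  # this is the hook a DA method uses

model = ToyBmiModel()
model.initialize("wflow_case.ini")  # hypothetical config file name
model.update()                      # advance one time step
# A data-assimilation analysis step (e.g. an ensemble Kalman update)
# nudges the model state through the same interface:
model.set_value("storage", model.get_value("storage") + 1.5)
```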

  14. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    NASA Astrophysics Data System (ADS)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper presents easily accessible, integrated, web-based analysis of satellite images with plug-in-based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox for accessing satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS and remote sensing software package, comprising image processing, spatial analysis, and digital mapping, that was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software package with web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes, and (web-)services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis, and spatial statistics analysis. Since 2007, the framework has been developed in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software under the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters, and ecosystems. GEONETCast makes satellite images available via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins that convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on his machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision-making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by building plug-ins, and outline our plans to implement other OGC standards, such as WCS and WPS, in the same context. The latter, especially, can be seen as a major step forward in moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in Spatial Data Infrastructures or even its execution in scalable, on-demand cloud computing environments.
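
    As an illustration of the OGC web mapping that the ILWIS plug-ins build on, the sketch below constructs a standard WMS GetMap request; the server URL and layer name are invented for demonstration, but the query parameters follow the WMS 1.1.1 standard.

```python
import requests

# Illustrative OGC WMS GetMap request; server URL and layer name are invented.
params = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "rainfall_estimate", "STYLES": "",
    "SRS": "EPSG:4326", "BBOX": "-3.5,4.5,1.5,11.5",  # roughly Ghana, lon/lat
    "WIDTH": "512", "HEIGHT": "512", "FORMAT": "image/png",
}
r = requests.get("https://example.org/wms", params=params, timeout=30)
r.raise_for_status()
with open("rainfall.png", "wb") as f:
    f.write(r.content)  # rendered map tile returned by the server
```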

  15. Free Software and Free Textbooks

    ERIC Educational Resources Information Center

    Takhteyev, Yuri

    2012-01-01

    Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…

  16. Podcast Pilots for Distance Planning, Programming, and Development

    ERIC Educational Resources Information Center

    Cordes, Sean

    2005-01-01

    This paper examines podcasting as a library support for distance learning and information systems and services. The manuscript provides perspective on the knowledge base in the growing area of podcasting in libraries and academia. A walkthrough of the podcast creation and distribution process using basic computing skills and open source tools is…

  17. A Checkup with Open Source Software Revitalizes an Early Electronic Resource Portal

    ERIC Educational Resources Information Center

    Spitzer, Stephan; Brown, Stephen

    2007-01-01

    The Uniformed Services University of the Health Sciences, located on the National Naval Medical Center's campus in Bethesda, Maryland, is a medical education and research facility for the nation's military and public health community. In order to support its approximately 7,500 globally distributed users, the university's James A. Zimble Learning…

  18. Case study of open-source enterprise resource planning implementation in a small business

    NASA Astrophysics Data System (ADS)

    Olson, David L.; Staley, Jesse

    2012-02-01

    Enterprise resource planning (ERP) systems have been recognised as offering great benefit to some organisations, although they are expensive and problematic to implement. The cost and risk make well-developed proprietary systems unaffordable for small businesses. Open-source software (OSS) has become a viable means of producing ERP system products. The question this paper addresses is the feasibility of OSS ERP systems for small businesses. A case is reported involving two efforts to implement freely distributed ERP software products in a small US make-to-order engineering firm. The case emphasises the potential of freely distributed ERP systems, as well as some of the hurdles involved in their implementation. The paper briefly reviews highlights of OSS ERP systems, with the primary focus on reporting the case experiences of efforts to implement ERPLite software and xTuple software. Both systems worked from a technical perspective, but both efforts failed due to economic factors. Although these economic conditions led to imperfect results, the case demonstrates the feasibility of OSS ERP for small businesses. Both experiences are evaluated in terms of risk dimensions.

  19. Acquire: an open-source comprehensive cancer biobanking system.

    PubMed

    Dowst, Heidi; Pew, Benjamin; Watkins, Chris; McOwiti, Apollo; Barney, Jonathan; Qu, Shijing; Becnel, Lauren B

    2015-05-15

    The probability of effective treatment of cancer with a targeted therapeutic can be improved for patients with defined genotypes containing actionable mutations. To this end, many human cancer biobanks are integrating more tightly with genomic sequencing facilities and with those creating and maintaining patient-derived xenografts (PDX) and cell lines, to provide renewable resources for translational research. To support the complex data management needs and workflows of several such biobanks, we developed Acquire. It is a robust, secure, web-based, database-backed open-source system that supports all major needs of a modern cancer biobank. Its modules allow for i) up-to-the-minute 'scoreboard' and graphical reporting of collections; ii) end user roles and permissions; iii) specimen inventory through caTissue Suite; iv) shipping forms for distribution of specimens to pathology, genomic analysis, and PDX/cell line creation facilities; v) robust ad hoc querying; vi) molecular and cellular quality control metrics to track specimens' progress and quality; vii) public researcher requests; viii) resource allocation committee review and oversight of distribution requests; and ix) linkage to available derivatives of specimens. © The Author 2015. Published by Oxford University Press.

  20. An inexpensive Arduino-based LED stimulator system for vision research.

    PubMed

    Teikari, Petteri; Najjar, Raymond P; Malkki, Hemi; Knoblauch, Kenneth; Dumortier, Dominique; Gronfier, Claude; Cooper, Howard M

    2012-11-15

    Light emitting diodes (LEDs) are being used increasingly as light sources in life sciences applications such as vision research, fluorescence microscopy, and brain-computer interfacing. Here we present an inexpensive but effective visual stimulator based on LEDs and the open-source Arduino microcontroller prototyping platform. The main design goal of our system was to use off-the-shelf and open-source components as much as possible, and to reduce design complexity so that end-users without advanced electronics skills can use the system. The core of the system is a USB-connected Arduino microcontroller platform, designed from the start with an emphasis on ease of use in creating interactive physical computing environments. The Arduino's pulse-width modulation (PWM) signal was used to drive the LEDs, allowing linear control of light intensity. The visual stimulator was demonstrated in applications such as murine pupillometry, rodent models for cognitive research, and heterochromatic flicker photometry in human psychophysics. These examples illustrate some of the possible applications that can be easily implemented and that are advantageous for students, educational purposes, and universities with limited resources. The LED stimulator system was developed as an open-source project. The software interface was developed in Python, with simplified examples provided for Matlab and LabVIEW. Source code and hardware information are distributed under the GNU General Public Licence (GPL, version 3). Copyright © 2012 Elsevier B.V. All rights reserved.
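
    A sketch of what host-side control of such a stimulator could look like with pyserial; the single-byte intensity protocol and the port name are our assumptions, not the paper's actual interface, which ships with its source code.

```python
import time
import serial  # pyserial

# Assumed protocol: each byte written is an 8-bit PWM duty cycle that the
# Arduino sketch applies with analogWrite(); the port name is platform-specific.
with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    time.sleep(2)                      # allow the Arduino to reset after open
    for duty in (0, 64, 128, 192, 255):
        port.write(bytes([duty]))      # set LED intensity
        time.sleep(0.5)                # hold each level for 500 ms
```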

  1. Measurement of greenhouse gas emissions from agricultural sites using open-path optical remote sensing method.

    PubMed

    Ro, Kyoung S; Johnson, Melvin H; Varma, Ravi M; Hashmonay, Ram A; Hunt, Patrick

    2009-08-01

    Improved characterization of distributed emission sources of greenhouse gases, such as methane from concentrated animal feeding operations, requires more accurate methods. One promising method has recently been used by the USEPA: a vertical radial plume mapping (VRPM) algorithm using optical remote sensing techniques. We evaluated this method for estimating emission rates from simulated distributed methane sources. A scanning open-path tunable diode laser was used to collect path-integrated concentrations (PICs) along different optical paths on a vertical plane downwind of controlled methane releases. Each cycle consists of 3 ground-level PICs and 2 above-ground PICs. Three- to 10-cycle moving averages were used to reconstruct mass-equivalent concentration plume maps on the vertical plane. The VRPM algorithm estimated methane emission rates from the PIC and meteorological data collected concomitantly under different atmospheric stability conditions. The derived emission rates compared well with the actual release rates irrespective of atmospheric stability conditions. The maximum error was 22% when 3-cycle moving-average PICs were used; it decreased to 11% when 10-cycle moving averages were used. Our validation results suggest that this new VRPM method may be used for improved estimation of greenhouse gas emissions from a variety of agricultural sources.
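
    The moving-average smoothing of PICs referred to above is straightforward to reproduce; here is a sketch with invented concentration values (the VRPM plane reconstruction itself is considerably more involved).

```python
import numpy as np

def moving_average(pic, n_cycles):
    """n_cycles-cycle moving average of a path-integrated concentration series."""
    return np.convolve(pic, np.ones(n_cycles) / n_cycles, mode="valid")

# Invented PIC series (ppm*m) over ten measurement cycles.
pic = np.array([2.1, 2.4, 1.9, 2.6, 2.2, 2.8, 2.0, 2.5, 2.3, 2.7])
print(moving_average(pic, 3))   # 3-cycle smoothing
print(moving_average(pic, 10))  # 10-cycle smoothing collapses to one value
```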

  2. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    DOE PAGES

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; ...

    2015-11-09

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e., optimization of parameter values for consistency with data) when simulations are computationally expensive.

  3. Duke Surgery Patient Safety: an open-source application for anonymous reporting of adverse and near-miss surgical events

    PubMed Central

    Pietrobon, Ricardo; Lima, Raquel; Shah, Anand; Jacobs, Danny O; Harker, Matthew; McCready, Mariana; Martins, Henrique; Richardson, William

    2007-01-01

    Background Studies have shown that 4% of hospitalized patients suffer from an adverse event caused by the medical treatment administered. Some institutions have created systems to encourage medical workers to report these adverse events. However, these systems often prove to be inadequate and/or ineffective for reviewing the data collected and improving the outcomes in patient safety. Objective To describe the Web application Duke Surgery Patient Safety (DSPS), designed for the anonymous reporting of adverse and near-miss events as well as scheduled reporting to surgeons and hospital administration. Software architecture DSPS was developed primarily in Java, running on a Tomcat server with a MySQL database as its backend. Results Formal and field usability tests were used to aid the development of DSPS. Extensive experience with DSPS at our institution indicates that it is easy to learn and use, performs well, provides the needed functionality, and is well received by both adverse-event reporters and administrators. Discussion This is the first description of an open-source application for reporting patient safety events; the application can be distributed to other institutions and adapted to the needs of different departments. DSPS provides a mechanism for anonymous reporting of adverse events and helps to administer patient safety initiatives. Conclusion The modifiable framework of DSPS allows adherence to evolving national data standards. The open-source design of DSPS permits surgical departments with existing reporting mechanisms to integrate them with DSPS. The DSPS application is distributed under the GNU General Public License. PMID:17472749

  4. Duke Surgery Patient Safety: an open-source application for anonymous reporting of adverse and near-miss surgical events.

    PubMed

    Pietrobon, Ricardo; Lima, Raquel; Shah, Anand; Jacobs, Danny O; Harker, Matthew; McCready, Mariana; Martins, Henrique; Richardson, William

    2007-05-01

    Studies have shown that 4% of hospitalized patients suffer from an adverse event caused by the medical treatment administered. Some institutions have created systems to encourage medical workers to report these adverse events. However, these systems often prove to be inadequate and/or ineffective for reviewing the data collected and improving the outcomes in patient safety. To describe the Web application Duke Surgery Patient Safety (DSPS), designed for the anonymous reporting of adverse and near-miss events as well as scheduled reporting to surgeons and hospital administration. SOFTWARE ARCHITECTURE: DSPS was developed primarily in Java, running on a Tomcat server with a MySQL database as its backend. Formal and field usability tests were used to aid the development of DSPS. Extensive experience with DSPS at our institution indicates that it is easy to learn and use, performs well, provides the needed functionality, and is well received by both adverse-event reporters and administrators. This is the first description of an open-source application for reporting patient safety events; the application can be distributed to other institutions and adapted to the needs of different departments. DSPS provides a mechanism for anonymous reporting of adverse events and helps to administer patient safety initiatives. The modifiable framework of DSPS allows adherence to evolving national data standards. The open-source design of DSPS permits surgical departments with existing reporting mechanisms to integrate them with DSPS. The DSPS application is distributed under the GNU General Public License.

  5. Modular Aquatic Simulation System 1D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-04-19

    MASS1 simulates open channel hydrodynamics and transport in branched channel networks, using cross-section averaged forms of the continuity, momentum, and convection-diffusion equations. Thermal energy transport (temperature), including meteorological influences, is supported. The thermodynamics of total dissolved gas (TDG) can be directly simulated. MASS1 has been developed over the last 20 years. It is currently being used on DOE projects that require MASS1 to be open source. Hence, the authors would like to distribute MASS1 in source form.
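
    For reference, the standard cross-section-averaged (Saint-Venant) continuity and momentum equations, together with a one-dimensional advection-diffusion equation of the kind solved for transported quantities; MASS1's exact formulation and source terms may differ.

```latex
% Cross-section-averaged open-channel equations (illustrative forms):
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = q_l
\qquad
\frac{\partial Q}{\partial t}
  + \frac{\partial}{\partial x}\!\left(\frac{Q^2}{A}\right)
  + gA\,\frac{\partial h}{\partial x} + gAS_f = 0
\qquad
\frac{\partial (AC)}{\partial t} + \frac{\partial (QC)}{\partial x}
  = \frac{\partial}{\partial x}\!\left(A D \frac{\partial C}{\partial x}\right) + S
```

    Here A is the cross-sectional area, Q the discharge, q_l the lateral inflow, h the water-surface elevation, S_f the friction slope, C the transported concentration (or temperature), D a longitudinal dispersion coefficient, and S a source/sink term.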

  6. Plasma x-ray radiation source.

    PubMed

    Popkov, N F; Kargin, V I; Ryaslov, E A; Pikar', A S

    1995-01-01

    This paper gives the results of studies on a plasma x-ray source, which enables one to obtain a 2.5-krad radiation dose per pulse over an area of 100 cm² in the quantum energy range from 20 to 500 keV. Pulse duration is 100 ns. Spectral radiation distributions from a diode under various operating conditions of the plasma are obtained. A Marx generator served as the initial energy source, storing 120 kJ with a discharge time of T/4 = 10⁻⁶ s. A short electromagnetic pulse (10⁻⁷ s) was shaped using plasma erosion opening switches.

  7. Dynamic VM Provisioning for TORQUE in a Cloud Environment

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.

    2014-06-01

    Cloud computing, delivered as Infrastructure-as-a-Service (IaaS), is attracting growing interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high-throughput computing clusters.

  8. Language Geography from Microblogging Platforms

    NASA Astrophysics Data System (ADS)

    Mocanu, Delia; Baronchelli, Andrea; Perra, Nicola; Gonçalves, Bruno; Vespignani, Alessandro

    2013-03-01

    Microblogging platforms have become major open source indicators of complex social interactions. With the advent of smartphones, the ever-increasing mobile Internet traffic gives us an unprecedented opportunity to complement studies of complex social phenomena with real-time location information. In this work, we show that the data now accessible allows for detailed studies at different scales, ranging from country-level aggregate analysis to the analysis of linguistic communities within specific neighborhoods. The high resolution and coverage of this data permit us to investigate such issues as the linguistic homogeneity of different countries, seasonal touristic patterns within countries, and the geographical distribution of different languages in bilingual regions. This work highlights the potential of geolocalized studies of open data sources, which can provide an extremely detailed picture of the language geography.

  9. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    PubMed

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there are still few studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.
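
    To make the shape of such a contribution pattern concrete, here is a toy simulation with synthetic Zipf-distributed commit counts (not the study's data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic per-contributor commit counts drawn from a Zipf (power-law) law.
commits = np.sort(rng.zipf(2.0, size=500))[::-1]

top = max(1, len(commits) // 10)
share = commits[:top].sum() / commits.sum()
print(f"top 10% of contributors account for {share:.0%} of all commits")
```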

  10. Advanced capabilities for materials modelling with Quantum ESPRESSO

    NASA Astrophysics Data System (ADS)

    Giannozzi, P.; Andreussi, O.; Brumme, T.; Bunau, O.; Buongiorno Nardelli, M.; Calandra, M.; Car, R.; Cavazzoni, C.; Ceresoli, D.; Cococcioni, M.; Colonna, N.; Carnimeo, I.; Dal Corso, A.; de Gironcoli, S.; Delugas, P.; DiStasio, R. A., Jr.; Ferretti, A.; Floris, A.; Fratesi, G.; Fugallo, G.; Gebauer, R.; Gerstmann, U.; Giustino, F.; Gorni, T.; Jia, J.; Kawamura, M.; Ko, H.-Y.; Kokalj, A.; Küçükbenli, E.; Lazzeri, M.; Marsili, M.; Marzari, N.; Mauri, F.; Nguyen, N. L.; Nguyen, H.-V.; Otero-de-la-Roza, A.; Paulatto, L.; Poncé, S.; Rocca, D.; Sabatini, R.; Santra, B.; Schlipf, M.; Seitsonen, A. P.; Smogunov, A.; Timrov, I.; Thonhauser, T.; Umari, P.; Vast, N.; Wu, X.; Baroni, S.

    2017-11-01

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows one to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.

  11. Advanced capabilities for materials modelling with Quantum ESPRESSO.

    PubMed

    Giannozzi, P; Andreussi, O; Brumme, T; Bunau, O; Buongiorno Nardelli, M; Calandra, M; Car, R; Cavazzoni, C; Ceresoli, D; Cococcioni, M; Colonna, N; Carnimeo, I; Dal Corso, A; de Gironcoli, S; Delugas, P; DiStasio, R A; Ferretti, A; Floris, A; Fratesi, G; Fugallo, G; Gebauer, R; Gerstmann, U; Giustino, F; Gorni, T; Jia, J; Kawamura, M; Ko, H-Y; Kokalj, A; Küçükbenli, E; Lazzeri, M; Marsili, M; Marzari, N; Mauri, F; Nguyen, N L; Nguyen, H-V; Otero-de-la-Roza, A; Paulatto, L; Poncé, S; Rocca, D; Sabatini, R; Santra, B; Schlipf, M; Seitsonen, A P; Smogunov, A; Timrov, I; Thonhauser, T; Umari, P; Vast, N; Wu, X; Baroni, S

    2017-10-24

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows one to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.

  12. Advanced capabilities for materials modelling with Quantum ESPRESSO.

    PubMed

    Andreussi, Oliviero; Brumme, Thomas; Bunau, Oana; Buongiorno Nardelli, Marco; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Cococcioni, Matteo; Colonna, Nicola; Carnimeo, Ivan; Dal Corso, Andrea; de Gironcoli, Stefano; Delugas, Pietro; DiStasio, Robert; Ferretti, Andrea; Floris, Andrea; Fratesi, Guido; Fugallo, Giorgia; Gebauer, Ralph; Gerstmann, Uwe; Giustino, Feliciano; Gorni, Tommaso; Jia, Junteng; Kawamura, Mitsuaki; Ko, Hsin-Yu; Kokalj, Anton; Küçükbenli, Emine; Lazzeri, Michele; Marsili, Margherita; Marzari, Nicola; Mauri, Francesco; Nguyen, Ngoc Linh; Nguyen, Huy-Viet; Otero-de-la-Roza, Alberto; Paulatto, Lorenzo; Poncé, Samuel; Giannozzi, Paolo; Rocca, Dario; Sabatini, Riccardo; Santra, Biswajit; Schlipf, Martin; Seitsonen, Ari Paavo; Smogunov, Alexander; Timrov, Iurii; Thonhauser, Timo; Umari, Paolo; Vast, Nathalie; Wu, Xifan; Baroni, Stefano

    2017-09-27

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows one to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software. © 2017 IOP Publishing Ltd.

  13. Inequalities in Open Source Software Development: Analysis of Contributor’s Commits in Apache Software Foundation Projects

    PubMed Central

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there are still few studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution. PMID:27096157

  14. Design and Implementation of a Modern Automatic Deformation Monitoring System

    NASA Astrophysics Data System (ADS)

    Engel, Philipp; Schweimler, Björn

    2016-03-01

    The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of supervised objects over a long period. Several commercial hardware and software solutions for realizing such monitoring measurements are available on the market. In addition to these, a research team at the University of Applied Sciences in Neubrandenburg (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, distributed under an open source licence and free of charge. The task of managing an open source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. We discuss how the development effort for networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.

  15. C3-PRO: Connecting ResearchKit to the Health System Using i2b2 and FHIR.

    PubMed

    Pfiffner, Pascal B; Pinyol, Isaac; Natter, Marc D; Mandl, Kenneth D

    2016-01-01

    A renewed interest by consumer information technology giants in the healthcare domain is focused on transforming smartphones into personal health data storage devices. With the introduction of the open source ResearchKit, Apple provides a framework for researchers to inform and consent research subjects, and to readily collect personal health data and patient reported outcomes (PRO) from distributed populations. However, being research backend agnostic, ResearchKit does not provide data transmission facilities, leaving research apps disconnected from the health system. Personal health data and PROs are of the most value when presented in context along with health system data. Our aim was to build a toolchain that allows easy and secure integration of personal health and PRO data into an open source platform widely adopted across 140 academic medical centers. We present C3-PRO: the Consent, Contact, and Community framework for Patient Reported Outcomes. This open source toolchain connects, in a standards-compliant fashion, any ResearchKit app to the widely-used clinical research infrastructure Informatics for Integrating Biology and the Bedside (i2b2). C3-PRO leverages the emerging health data standard Fast Healthcare Interoperability Resources (FHIR).
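
    For flavor, the FHIR leg of such a toolchain might post a patient-reported outcome as a QuestionnaireResponse resource, as sketched below; the endpoint URL and item IDs are invented, and C3-PRO's actual transport is more elaborate (consent handling, de-identification, queuing).

```python
import json
import requests

# Minimal patient-reported outcome as a FHIR QuestionnaireResponse.
pro = {
    "resourceType": "QuestionnaireResponse",
    "status": "completed",
    "item": [
        {"linkId": "pain-score", "answer": [{"valueInteger": 3}]},
    ],
}

# Invented endpoint standing in for a research backend such as an i2b2 bridge.
resp = requests.post(
    "https://example.org/fhir/QuestionnaireResponse",
    data=json.dumps(pro),
    headers={"Content-Type": "application/fhir+json"},
    timeout=30,
)
print(resp.status_code)
```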

  16. C3-PRO: Connecting ResearchKit to the Health System Using i2b2 and FHIR

    PubMed Central

    Pfiffner, Pascal B.; Pinyol, Isaac; Natter, Marc D.; Mandl, Kenneth D.

    2016-01-01

    A renewed interest by consumer information technology giants in the healthcare domain is focused on transforming smartphones into personal health data storage devices. With the introduction of the open source ResearchKit, Apple provides a framework for researchers to inform and consent research subjects, and to readily collect personal health data and patient reported outcomes (PRO) from distributed populations. However, being research backend agnostic, ResearchKit does not provide data transmission facilities, leaving research apps disconnected from the health system. Personal health data and PROs are of the most value when presented in context along with health system data. Our aim was to build a toolchain that allows easy and secure integration of personal health and PRO data into an open source platform widely adopted across 140 academic medical centers. We present C3-PRO: the Consent, Contact, and Community framework for Patient Reported Outcomes. This open source toolchain connects, in a standards-compliant fashion, any ResearchKit app to the widely-used clinical research infrastructure Informatics for Integrating Biology and the Bedside (i2b2). C3-PRO leverages the emerging health data standard Fast Healthcare Interoperability Resources (FHIR). PMID:27031856

  17. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has created the need to process and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security, and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is done down the chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of limited resources. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the parallel processing of large amounts of open-source data.
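
    As a reminder of the decomposition style involved, here is a toy map-reduce word count; a real deployment would of course run the map and reduce phases on a distributed framework rather than in a single process.

```python
from collections import defaultdict

def mapper(doc):
    # Map phase: emit (word, 1) pairs for each document.
    for word in doc.split():
        yield word.lower(), 1

def reducer(pairs):
    # Reduce phase: sum the counts for each key.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

docs = ["open source data", "open data in parallel"]
print(reducer(pair for doc in docs for pair in mapper(doc)))
```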

  18. A new Bayesian Earthquake Analysis Tool (BEAT)

    NASA Astrophysics Data System (ADS)

    Vasyura-Bathke, Hannes; Dutta, Rishabh; Jónsson, Sigurjón; Mai, Martin

    2017-04-01

    Modern earthquake source estimation studies increasingly use non-linear optimization strategies to estimate kinematic rupture parameters, often considering geodetic and seismic data jointly. However, the optimization process is complex and consists of several steps that need to be followed in the earthquake parameter estimation procedure. These include prescribing or modeling the fault geometry, calculating the Green's functions (often assuming a layered elastic half-space), and estimating the distributed final slip and possibly other kinematic source parameters. Recently, Bayesian inference has become popular for estimating posterior distributions of earthquake source model parameters given measured/estimated/assumed data and model uncertainties. For instance, some research groups consider uncertainties of the layered medium and propagate these to the source parameter uncertainties. Other groups make use of informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed that efficiently explore the often high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational demands of these methods are high, and estimation codes are rarely distributed along with the published results. Even when codes are made available, it is often difficult to assemble them into a single optimization framework, as they are typically written in different programming languages. Therefore, further progress and future applications of these methods and codes are hampered, while reproducibility and validation of results have become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in earthquake source estimation, we undertook the effort of producing BEAT, a Python package that comprises all the above-mentioned features in a single programming environment. The package is built on top of the pyrocko seismological toolbox (www.pyrocko.org) and makes use of the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat) and we encourage and solicit contributions to the project. In this contribution, we present our strategy for developing BEAT, show application examples, and discuss future developments.
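
    For a minimal taste of the pymc3 layer that BEAT builds on, here is a toy regression, not an earthquake source model; BEAT's own problem setup, priors, and samplers are far richer.

```python
import numpy as np
import pymc3 as pm

# Synthetic observations: a linear trend plus noise.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.1 * np.random.randn(50)

with pm.Model():
    slope = pm.Normal("slope", mu=0.0, sigma=10.0)   # prior on the parameter
    noise = pm.HalfNormal("noise", sigma=1.0)        # prior on data noise
    pm.Normal("obs", mu=slope * x, sigma=noise, observed=y)
    trace = pm.sample(1000, tune=1000, cores=1,
                      return_inferencedata=False)    # posterior samples

print(trace["slope"].mean(), trace["slope"].std())   # posterior summary
```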

  19. 40 CFR Table 3 to Subpart Wwww of... - Organic HAP Emissions Limits for Existing Open Molding Sources, New Open Molding Sources Emitting...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and... CATEGORIES National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and...

  20. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling, and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G, and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines in order to manually distribute the user-defined libraries that a remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts, and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet. PMID:16539707

  1. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling, and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G, and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines in order to manually distribute the user-defined libraries that a remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts, and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet.

  2. Evaluation of neutron skyshine from a cyclotron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huyashi, K.; Nakamura, T.

    1984-06-01

    The dose distribution and the spectrum variation of neutrons due to the skyshine effect have been measured with various detectors in the environment surrounding the cyclotron of the Institute for Nuclear Study, University of Tokyo. The source neutrons were produced by stopping a 52-MeV proton beam in a carbon beam stopper; they were extracted upward through the opening in the concrete shield surrounding the cyclotron and leaked into the atmosphere through the cyclotron building. The dose distribution and the spectrum of neutrons near the beam stopper were also measured in order to characterize the skyshine source. The measured skyshine neutron spectra and dose distribution were analyzed with two codes, MMCR2 and SKYSHINE-II, and the calculated results are in good agreement with the experiment. Valuable characteristics of this experiment are the determination of the energy spectrum and dose distribution of the source neutrons and the measurement of skyshine neutrons from an actual large-scale accelerator building to the exclusion of direct neutrons transported through the air. This experiment should be useful as a benchmark experiment on the skyshine phenomenon.

  3. Extending Marine Species Distribution Maps Using Non-Traditional Sources

    PubMed Central

    Moretzsohn, Fabio; Gibeaut, James

    2015-01-01

    Background: Traditional sources of species occurrence data, such as peer-reviewed journal articles and museum-curated collections, are included in species databases after rigorous review by species experts and evaluators. The distribution maps created in this process are an important component of species survival evaluations, and are used to adapt, extend, and sometimes contract polygons used in the distribution mapping process. New information: During an IUCN Red List Gulf of Mexico Fishes Assessment Workshop held at The Harte Research Institute for Gulf of Mexico Studies, a session included an open discussion on the topic of including other sources of species occurrence data. During the last decade, advances in portable electronic devices and applications have enabled 'citizen scientists' to record images, location, and data about species sightings, and to submit that data to larger species databases. These applications typically generate point data. Attendees of the workshop expressed an interest in how that data could be incorporated into existing datasets, how best to ascertain the quality and value of that data, and what other alternate data sources are available. This paper addresses those issues and provides recommendations to ensure quality data use. PMID:25941453

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    This paper describes a reference process that can be deployed to provide continuous, automated, condition-based maintenance management for buildings that have a building information model (BIM), a building automation system (BAS), and a computerized maintenance management system (CMMS). The process can be deployed using VOLTTRON™, an open source transactional network platform designed for distributed sensing and controls that supports both energy efficiency and grid services.

  5. A Computer Simulation Using Spreadsheets for Learning Concept of Steady-State Equilibrium

    ERIC Educational Resources Information Center

    Sharda, Vandana; Sastri, O. S. K. S.; Bhardwaj, Jyoti; Jha, Arbind K.

    2016-01-01

    In this paper, we present a simple spreadsheet based simulation activity that can be performed by students at the undergraduate level. This simulation is implemented in free open source software (FOSS) LibreOffice Calc, which is available for both Windows and Linux platform. This activity aims at building the probability distribution for the…

  6. Incorporating location, routing, and inventory decisions in a bi-objective supply chain design problem with risk-pooling

    NASA Astrophysics Data System (ADS)

    Tavakkoli-Moghaddam, Reza; Forouzanfar, Fateme; Ebrahimnejad, Sadoullah

    2013-07-01

    This paper considers a single-sourcing network design problem for a three-level supply chain. For the first time, a novel mathematical model is presented that simultaneously considers risk-pooling, inventory held at distribution centers (DCs) under demand uncertainty, several alternative modes of transporting products between facilities, and the routing of vehicles from distribution centers to customers in a stochastic supply chain system. The problem is formulated as a bi-objective stochastic mixed-integer nonlinear programming model. The aim of this model is to determine the number of distribution centers to open, their locations and capacity levels, and the allocation of customers to distribution centers and of distribution centers to suppliers. It also determines the inventory control decisions on the amount of products ordered and the amount of safety stock held at each opened DC, and selects a type of vehicle for transportation. Moreover, it determines routing decisions, such as the routes vehicles take from an opened distribution center to serve its allocated customers before returning to that distribution center. All of this is done so that the total system cost and the total transportation time are minimized. The Lingo software is used to solve the presented model, and the computational results are illustrated in this paper.

  7. Computing distance distributions from dipolar evolution data with overtones: RIDME spectroscopy with Gd(iii)-based spin labels.

    PubMed

    Keller, Katharina; Mertens, Valerie; Qi, Mian; Nalepa, Anna I; Godt, Adelheid; Savitsky, Anton; Jeschke, Gunnar; Yulikov, Maxim

    2017-07-21

    Extraction of distance distributions between high-spin paramagnetic centers from relaxation induced dipolar modulation enhancement (RIDME) data is affected by the presence of overtones of dipolar frequencies. As previously proposed, we account for these overtones by using a modified kernel function in Tikhonov regularization analysis. This paper analyzes the performance of such an approach on a series of model compounds with the Gd(iii)-PyMTA complex serving as a paramagnetic high-spin label. We describe the calibration of the overtone coefficients for the RIDME kernel, demonstrate the accuracy of distance distributions obtained with this approach, and show that for our series of Gd-rulers the RIDME technique provides more accurate distance distributions than Gd(iii)-Gd(iii) double electron-electron resonance (DEER). The analysis of RIDME data including harmonic overtones can be performed using the MATLAB-based program OvertoneAnalysis, which is available as open-source software from the ETH Zurich web page. This approach opens a perspective for the routine use of the RIDME technique with high-spin labels in structural biology and in structural studies of other soft matter.
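
    The modified-kernel idea can be sketched numerically. Below is a hedged numpy illustration of Tikhonov regularization with a dipolar kernel that mixes harmonic overtones; the overtone coefficients, the g = 2 dipolar constant, and the unconstrained least-squares solve are simplifications, whereas the real analysis uses calibrated coefficients and typically non-negativity constraints:

        import numpy as np

        rng = np.random.default_rng(0)

        t = np.linspace(0, 4e-6, 200)          # RIDME time axis [s]
        r = np.linspace(1.5, 6.0, 120)         # distance axis [nm]
        overtones = [0.6, 0.25, 0.15]          # placeholder overtone coefficients

        def ridme_kernel(t, r, coeffs):
            """Powder-averaged dipolar kernel with harmonic overtones.

            Uses the g = 2 point-dipole constant 52.04 MHz nm^3; the
            effective scaling for high-spin Gd(iii) pairs differs in detail.
            """
            omega = 2 * np.pi * 52.04e6 / r**3         # rad/s per distance
            x = np.linspace(0.0, 1.0, 101)             # cos(theta) grid
            K = np.zeros((t.size, r.size))
            for k, p in enumerate(coeffs, start=1):
                for j, w in enumerate(omega):
                    K[:, j] += p * np.trapz(
                        np.cos(k * w * np.outer(t, 3 * x**2 - 1)), x, axis=1)
            return K

        K = ridme_kernel(t, r, overtones)

        # Synthetic "data": Gaussian distance distribution plus noise
        P_true = np.exp(-0.5 * ((r - 3.5) / 0.2) ** 2)
        V = K @ P_true + 0.01 * rng.standard_normal(t.size)

        # Tikhonov: minimize ||K P - V||^2 + alpha^2 ||L P||^2, L = 2nd derivative
        alpha = 1.0
        L = np.diff(np.eye(r.size), 2, axis=0)
        A = np.vstack([K, alpha * L])
        b = np.concatenate([V, np.zeros(L.shape[0])])
        P_fit, *_ = np.linalg.lstsq(A, b, rcond=None)  # non-negativity omitted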

  8. 75 FR 4741 - Express Mail Open and Distribute and Priority Mail Open and Distribute Changes and Updates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-29

    ... POSTAL SERVICE 39 CFR Part 111 Express Mail Open and Distribute and Priority Mail Open and... proposes to revise its standards to reflect changes and updates for Express Mail® Open and Distribute and Priority Mail® Open and Distribute to improve efficiencies in processing and to control...

  9. 75 FR 14076 - Express Mail Open and Distribute and Priority Mail Open and Distribute Changes and Updates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... POSTAL SERVICE 39 CFR Part 111 Express Mail Open and Distribute and Priority Mail Open and... to reflect changes and updates for Express Mail® Open and Distribute and Priority Mail® Open and Distribute to improve efficiencies in processing and to control costs. DATES: Effective Date...

  10. 75 FR 56920 - Express Mail Open and Distribute and Priority Mail Open and Distribute

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... POSTAL SERVICE 39 CFR Part 111 Express Mail Open and Distribute and Priority Mail Open and...] Open and Distribute containers. The Postal Service also proposes to revise the service commitment for Express Mail Open and Distribute as a guaranteed end of day product; and to add a five-pound minimum...

  11. An open-source library for the numerical modeling of mass-transfer in solid oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Novaresio, Valerio; García-Camprubí, María; Izquierdo, Salvador; Asinari, Pietro; Fueyo, Norberto

    2012-01-01

    The generation of direct current electricity using solid oxide fuel cells (SOFCs) involves several interplaying transport phenomena. Their simulation is crucial for the design and optimization of reliable and competitive equipment, and for the eventual market deployment of this technology. An open-source library for the computational modeling of mass-transport phenomena in SOFCs is presented in this article. It includes several multicomponent mass-transport models (i.e. Fickian, Stefan-Maxwell and Dusty Gas Model), which can be applied both within porous media and in porosity-free domains, as well as several diffusivity models for gases. The library has been developed for use with OpenFOAM®, a widespread open-source code for fluid and continuum mechanics. The library can be used to model any fluid flow configuration involving multicomponent transport phenomena; it is validated in this paper against the analytical solution of one-dimensional test cases and, in addition, applied to the simulation of a real SOFC and further validated against experimental data.
    Program summary
    Program title: multiSpeciesTransportModels
    Catalogue identifier: AEKB_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKB_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License
    No. of lines in distributed program, including test data, etc.: 18 140
    No. of bytes in distributed program, including test data, etc.: 64 285
    Distribution format: tar.gz
    Programming language: C++
    Computer: Any x86 (the instructions reported in the paper consider only the 64-bit case for the sake of simplicity)
    Operating system: Generic Linux (the instructions reported in the paper consider only the open-source Ubuntu distribution for the sake of simplicity)
    Classification: 12
    External routines: OpenFOAM® (version 1.6-ext) (http://www.extend-project.de)
    Nature of problem: This software provides a library of models for the simulation of steady-state mass and momentum transport in a multi-species gas mixture, possibly in a porous medium. The software is particularly designed to be used as the mass-transport library for the modeling of solid oxide fuel cells (SOFCs). When supplemented with other sub-models, such as thermal and charge-transport ones, it allows the prediction of the cell polarization curve and hence the cell performance.
    Solution method: The standard finite volume method (FVM) is used for solving all the conservation equations. The pressure-velocity coupling is solved using the SIMPLE algorithm (adding a porous drag term if required). The mass transport can be calculated using different alternative models, namely the Fick, Maxwell-Stefan or dusty gas model. The code adopts a segregated method to solve the resulting linear system of equations. The different regions of the SOFC, namely gas channels, electrodes and electrolyte, are solved independently and coupled through boundary conditions.
    Restrictions: When extremely large species fluxes are considered, the current implementation of the Neumann and Robin boundary conditions does not avoid negative values of molar and/or mass fractions, which eventually lead to numerical instability. This never happened, however, in the documented runs. These boundary conditions could be reformulated to become more robust.
    Running time: From seconds to hours depending on the mesh size and the number of species. For example, on a 64-bit machine with an Intel Core Duo T8300 and 3 GB of RAM, the provided test run requires less than 1 second.
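
    To illustrate the kind of physics the library's porous-media models capture, here is a hedged Python sketch combining molecular and Knudsen diffusion via the Bosanquet relation, one standard ingredient of dusty-gas-type models (the numbers and the simple combination rule are illustrative, not the library's implementation):

        import numpy as np

        R = 8.314  # gas constant [J/(mol K)]

        def knudsen_diffusivity(r_pore, T, M):
            """Knudsen diffusivity for pore radius r_pore [m], molar mass M [kg/mol]."""
            return (2.0 / 3.0) * r_pore * np.sqrt(8.0 * R * T / (np.pi * M))

        def effective_diffusivity(D_molecular, D_knudsen, porosity, tortuosity):
            """Bosanquet combination, scaled by the porous-medium geometry factor."""
            D_comb = 1.0 / (1.0 / D_molecular + 1.0 / D_knudsen)
            return (porosity / tortuosity) * D_comb

        # Example: H2 in an SOFC anode at 1073 K (all values assumed)
        D_h2_h2o = 8.0e-4  # m^2/s, binary molecular diffusivity, assumed
        D_kn = knudsen_diffusivity(r_pore=0.5e-6, T=1073.0, M=2.016e-3)
        D_eff = effective_diffusivity(D_h2_h2o, D_kn, porosity=0.35, tortuosity=3.5)
        print(f"Knudsen: {D_kn:.2e} m^2/s, effective: {D_eff:.2e} m^2/s")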

  12. Open source clustering software.

    PubMed

    de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S

    2004-06-12

    We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
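
    A brief usage sketch of the Python interface; the call follows the kcluster entry point documented for the C Clustering Library, though argument defaults may vary between releases, and the data are invented:

        import numpy as np
        import Pycluster  # the same interface ships as Bio.Cluster in Biopython

        # Toy data: 6 observations with 2 features each, two obvious groups
        data = np.array([[1.0, 1.1], [0.9, 1.0], [1.1, 0.9],
                         [8.0, 8.2], [7.9, 8.1], [8.1, 7.9]])

        # k-means with 2 clusters, 10 random restarts, Euclidean distance,
        # arithmetic-mean centroids
        clusterid, error, nfound = Pycluster.kcluster(
            data, nclusters=2, npass=10, method='a', dist='e')
        print("assignments:", clusterid, "within-cluster error:", error)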

  13. PRELIMINARY COUPLING OF THE MONTE CARLO CODE OPENMC AND THE MULTIPHYSICS OBJECT-ORIENTED SIMULATION ENVIRONMENT (MOOSE) FOR ANALYZING DOPPLER FEEDBACK IN MONTE CARLO SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Ellis; Derek Gaston; Benoit Forget

    In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.
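
    A hedged numpy sketch of the functional-expansion-tally transfer step: tallied Legendre moments (invented numbers here) are turned back into a continuous power shape that can be evaluated at arbitrary finite element node positions:

        import numpy as np
        from numpy.polynomial import legendre

        # Hypothetical Legendre moments a_n of an axial power shape on [-1, 1],
        # standing in for what an FET would estimate from Monte Carlo scores.
        coeffs = np.array([1.0, 0.0, -0.35, 0.0, 0.06])

        # Orthogonality normalization: f(z) = sum_n (2n+1)/2 * a_n * P_n(z)
        norm = (2 * np.arange(coeffs.size) + 1) / 2.0

        # Evaluate the reconstructed shape at arbitrary (e.g., FEM node) positions
        z_nodes = np.linspace(-1.0, 1.0, 9)
        power = legendre.legval(z_nodes, norm * coeffs)
        print(np.round(power, 4))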

  14. A Shared Infrastructure for Federated Search Across Distributed Scientific Metadata Catalogs

    NASA Astrophysics Data System (ADS)

    Reed, S. A.; Truslove, I.; Billingsley, B. W.; Grauch, A.; Harper, D.; Kovarik, J.; Lopez, L.; Liu, M.; Brandt, M.

    2013-12-01

    The vast amount of science metadata can be overwhelming and highly complex. Comprehensive analysis and sharing of metadata is difficult since institutions often publish to their own repositories. There are many disjoint standards used for publishing scientific data, making it difficult to discover and share information from different sources. Services that publish metadata catalogs often have different protocols, formats, and semantics. The research community is limited by the exclusivity of separate metadata catalogs, so federated search interfaces capable of unified search queries across multiple sources are desirable. Aggregation of metadata catalogs also enables users to critique metadata more rigorously. With these motivations in mind, the National Snow and Ice Data Center (NSIDC) and the Advanced Cooperative Arctic Data and Information Service (ACADIS) implemented two search interfaces for the community. Both the NSIDC Search and the ACADIS Arctic Data Explorer (ADE) use a common infrastructure, which keeps maintenance costs low. The search clients are designed to make OpenSearch requests against Solr, an open source search platform. Solr applies indexes to specific fields of the metadata, which in this instance optimizes queries containing keywords, spatial bounds and temporal ranges. NSIDC metadata is reused by both search interfaces, but the ADE also brokers additional sources. Users can quickly find relevant metadata with minimal effort, which ultimately lowers costs for research. This presentation will highlight the reuse of data and code between NSIDC and ACADIS, discuss challenges and milestones for each project, and identify the creation and use of open source libraries.
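
    A hedged sketch of the kind of query such a search client issues; the endpoint, core and field names are hypothetical, while the q/fq/rows/wt parameters are standard Solr request syntax:

        import requests

        SOLR = "https://search.example.org/solr/metadata/select"  # hypothetical

        params = {
            "q": "sea ice extent",                        # free-text keywords
            "fq": [
                "spatial:[60,-180 TO 90,180]",            # hypothetical spatial field
                "temporal_start:[2010-01-01T00:00:00Z TO *]",
            ],
            "rows": 25,
            "wt": "json",
        }
        resp = requests.get(SOLR, params=params, timeout=30)
        for doc in resp.json()["response"]["docs"]:
            print(doc.get("title"), doc.get("dataset_url"))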

  15. The Planck Catalogue of Galactic Cold Clumps : PGCC

    NASA Astrophysics Data System (ADS)

    Montier, L.

    The Planck satellite has provided an unprecedented view of the submillimetre sky, allowing us to search for the dust emission of Galactic cold sources. Combining the Planck-HFI all-sky maps in the high-frequency channels with the IRAS map at 100 μm, we built the Planck Catalogue of Galactic Cold Clumps (PGCC; Planck 2015 results. XXVIII), counting 13188 sources distributed over the whole sky and following mainly the Galactic structures at low and intermediate latitudes. This is the first all-sky catalogue of Galactic cold sources obtained with a single instrument at this resolution and sensitivity, and it opens a new window on star-formation processes in our Galaxy.

  16. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felberg, Lisa E.; Brookes, David H.; Yap, Eng-Hui

    2016-11-02

    We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized Poisson-Boltzmann equation. The PB-AM software package includes the generation of output files appropriate for visualization using VMD, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to the larger group of scientists, educators and students that are familiar with the APBS framework.
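
    For context, the equation the package solves analytically is the linearized Poisson-Boltzmann (Debye-Hückel) equation; the form below is the standard textbook statement, not notation taken from the paper:

        \nabla^2 \phi(\mathbf{r}) = \kappa^2 \, \phi(\mathbf{r}),
        \qquad
        \kappa^2 = \frac{2 N_A e^2 I}{\varepsilon_0 \varepsilon_r k_B T}

    so that an isolated point charge q produces the screened Coulomb potential \phi(r) = q\, e^{-\kappa r} / (4\pi \varepsilon_0 \varepsilon_r r); an analytical method of this kind generalizes that single-charge picture to many mutually polarizing molecules.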

  17. The Emergence of Open-Source Software in China

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    The open-source software movement is gaining increasing momentum in China. Of the limited number of open-source software packages in China, "Red Flag Linux" stands out most strikingly, commanding a 30 percent share of the Chinese software market. Unlike the spontaneity of the open-source movement in North America, open-source software development in…

  18. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as similarly viable and beneficial in the clinical application domain. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized or formally studied, so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  19. A service-oriented distributed semantic mediator: integrating multiscale biomedical information.

    PubMed

    Mora, Oscar; Engelbrecht, Gerhard; Bisbal, Jesus

    2012-11-01

    Biomedical research continuously generates large amounts of heterogeneous and multimodal data spread over multiple data sources. These data, if appropriately shared and exploited, could dramatically improve the research practice itself, and ultimately the quality of health care delivered. This paper presents DISMED (DIstributed Semantic MEDiator), an open source semantic mediator that provides a unified view of a federated environment of multiscale biomedical data sources. DISMED is a Web-based software application to query and retrieve information distributed over a set of registered data sources, using semantic technologies. It also offers a user-friendly interface specifically designed to simplify the usage of these technologies by non-expert users. Although the architecture of the software mediator is generic and domain independent, in the context of this paper DISMED has been evaluated for managing biomedical environments and facilitating research with respect to the handling of scientific data distributed across multiple heterogeneous data sources. As part of this contribution, a quantitative evaluation framework has been developed. It consists of a benchmarking scenario and the definition of five realistic use-cases. This framework, created entirely with public datasets, has been used to compare the performance of DISMED against other available mediators. It is also available to the scientific community in order to evaluate progress in the domain of semantic mediation in a systematic and comparable manner. The results show an average improvement in execution time by DISMED of 55% compared to the second-best alternative in four out of the five use-cases of the experimental evaluation.

  20. OpenDanubia - An integrated, modular simulation system to support regional water resource management

    NASA Astrophysics Data System (ADS)

    Muerth, M.; Waldmann, D.; Heinzeller, C.; Hennicker, R.; Mauser, W.

    2012-04-01

    The multi-disciplinary research project GLOWA-Danube, now completed, developed a regional-scale, integrated modeling system that was successfully applied to the 77,000 km2 Upper Danube basin to investigate the impact of Global Change on both the natural and anthropogenic water cycle. At the end of the last project phase, the integrated modeling system was transferred into the open source project OpenDanubia, which now provides both the core system and all major model components to the general public. First, this will enable decision makers from government, business and management to use OpenDanubia as a tool for proactive management of water resources in the context of global change. Secondly, the model framework supporting integrated simulations, and all simulation models developed for OpenDanubia in the scope of GLOWA-Danube, remain available for future developments and research questions. OpenDanubia allows for the investigation of water-related scenarios considering different ecological and economic aspects, supporting both scientists and policy makers in designing policies for sustainable environmental management. OpenDanubia is designed as a framework-based, distributed system. The model system couples spatially distributed physical and socio-economic processes at run time, taking into account their mutual influence. To simulate the potential future impacts of Global Change on agriculture, industrial production, water supply, households and tourism businesses, so-called deep actor models are implemented in OpenDanubia. All important water-related fluxes and storages in the natural environment are implemented in OpenDanubia as spatially explicit, process-based modules. This includes the land-surface water and energy balance, dynamic plant water uptake, groundwater recharge and flow, as well as river routing and reservoirs. Although the complete system is relatively demanding in its data and hardware requirements, the modular structure and the generic core system (Core Framework, Actor Framework) allow application in new regions and the selection of a reduced number of modules for simulation. As part of the Open Source Initiative in GLOWA-Danube (opendanubia.glowa-danube.de), comprehensive documentation for the system installation was created, and the program code of both the framework and all major components is licensed under the GNU General Public License. In addition, some helpful programs and scripts necessary for the operation and processing of input and result data sets are provided.

  1. Closing gaps between open software and public data in a hackathon setting: User-centered software prototyping

    PubMed Central

    Busby, Ben; Lesko, Matthew; Federer, Lisa

    2016-01-01

    In genomics, bioinformatics and other areas of data science, gaps exist between extant public datasets and the open-source software tools built by the community to analyze similar data types.  The purpose of biological data science hackathons is to assemble groups of genomics or bioinformatics professionals and software developers to rapidly prototype software to address these gaps.  The only two rules for the NCBI-assisted hackathons run so far are that 1) data either must be housed in public data repositories or be deposited to such repositories shortly after the hackathon’s conclusion, and 2) all software comprising the final pipeline must be open-source or open-use.  Proposed topics, as well as suggested tools and approaches, are distributed to participants at the beginning of each hackathon and refined during the event.  Software, scripts, and pipelines are developed and published on GitHub, a web service providing publicly available, free-usage tiers for collaborative software development. The code resulting from each hackathon is published at https://github.com/NCBI-Hackathons/ with separate directories or repositories for each team. PMID:27134733

  2. Numerical simulation of the SOFIA flow field

    NASA Technical Reports Server (NTRS)

    Klotz, Stephen P.

    1995-01-01

    This report provides a concise summary of the contribution of computational fluid dynamics (CFD) to the SOFIA (Stratospheric Observatory for Infrared Astronomy) project at NASA Ames and presents results obtained from closed- and open-cavity SOFIA simulations. The aircraft platform is a Boeing 747SP, and these are the first SOFIA simulations run with the aircraft empennage included in the geometry database. In the open-cavity runs the telescope is mounted behind the wings. Results suggest that the cavity markedly influences the mean pressure distribution on empennage surfaces and that 110-140 decibel (dB) sound pressure levels are typical in the cavity and on the horizontal and vertical stabilizers. A strong source of sound was found to exist on the rim of the open telescope cavity. The presence of this source suggests that additional design work needs to be performed to minimize the sound emanating from that location. A fluid dynamic analysis of the engine plumes is also contained in this report. The analysis was part of an effort to quantify the degradation of telescope performance resulting from the proximity of the port engine exhaust plumes to the open telescope bay.

  3. The eGo grid model: An open source approach towards a model of German high and extra-high voltage power grids

    NASA Astrophysics Data System (ADS)

    Mueller, Ulf Philipp; Wienholt, Lukas; Kleinhans, David; Cussmann, Ilka; Bunke, Wolf-Dieter; Pleßmann, Guido; Wendiggensen, Jochen

    2018-02-01

    There are several power grid modelling approaches suitable for simulations in the field of power grid planning. The restrictive policies of grid operators, regulators and research institutes concerning their original data and models lead to an increased interest in open source approaches to grid models based on open data. By including all voltage levels between 60 kV (high voltage) and 380 kV (extra-high voltage), we dissolve the common distinction between transmission and distribution grids in energy system models and utilize a single, integrated model instead. An open data set, primarily for Germany, which can be used for non-linear, linear and linear-optimal power flow methods, was developed. This data set consists of an electrically parameterised grid topology as well as allocated generation and demand characteristics for present and future scenarios at high spatial and temporal resolution. The usability of the grid model was demonstrated by performing exemplary power flow optimizations. Based on a marginal-cost-driven power plant dispatch subject to grid restrictions, congested power lines were identified. Continuous validation of the model is necessary in order to reliably model storage and grid expansion in ongoing research.
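
    The abstract does not name a solver stack, but open grid data sets of this kind are commonly exercised with the open-source PyPSA library; a minimal, hedged linear-optimal power flow sketch on an invented two-bus 380 kV system (requires PyPSA plus an LP solver such as GLPK):

        import pypsa

        network = pypsa.Network()  # default single snapshot

        # Two 380 kV buses joined by one line (all numbers invented)
        network.add("Bus", "north", v_nom=380.0)
        network.add("Bus", "south", v_nom=380.0)
        network.add("Line", "n-s", bus0="north", bus1="south",
                    x=10.0, r=1.0, s_nom=600.0)

        # Cheap wind in the north, expensive gas in the south, load in the south
        network.add("Generator", "wind", bus="north", p_nom=800.0, marginal_cost=5.0)
        network.add("Generator", "gas", bus="south", p_nom=800.0, marginal_cost=50.0)
        network.add("Load", "demand", bus="south", p_set=700.0)

        # Linear optimal power flow: dispatch by marginal cost subject to line
        # limits (newer PyPSA releases expose this as network.optimize()).
        network.lopf()
        print(network.generators_t.p)  # gas covers what the congested line blocks
        print(network.lines_t.p0)      # line flow, capped at s_nom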

  4. Making geospatial data in ASF archive readily accessible

    NASA Astrophysics Data System (ADS)

    Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.

    2015-12-01

    The way geospatial data are searched, managed, processed and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces that were specifically developed for its particular archive and data sets. ASF then moved to an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data, yet it was up to the user to develop the tools for more tailored access to the data they needed. We present two new approaches for serving data to users. In response to the recent Nepal earthquake we developed a data feed for distributing ESA's Sentinel data. Users can subscribe to the data feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach is an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface to include the geospatial information in the archive directly in the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015).
    References: Marquez, A., 2015. PostGIS Essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide. Packt Publishing, 350 p.
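
    A sketch of the standard OGC WFS request such a service answers; the endpoint, layer and property names are hypothetical, and the parameters follow WFS 2.0 conventions:

        import requests

        WFS = "https://geoserver.example.org/geoserver/wfs"  # hypothetical endpoint

        params = {
            "service": "WFS",
            "version": "2.0.0",
            "request": "GetFeature",
            "typeNames": "asf:sar_granules",               # hypothetical layer
            "outputFormat": "application/json",
            "bbox": "-152.0,58.0,-140.0,64.0,EPSG:4326",   # lon/lat window
            "count": 10,
        }
        resp = requests.get(WFS, params=params, timeout=60)
        for feature in resp.json()["features"]:
            props = feature["properties"]
            print(props.get("granule_name"), props.get("download_url"))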

  5. Enhancing participatory approach in water resources management: development of a survey to evaluate stakeholders needs and priorities related to software capabilities

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Josef, S.; Boukalova, Z.; Triana, F.; Ghetta, M.; Sabbatini, T.; Bonari, E.; Cannata, M.; De Filippis, G.

    2016-12-01

    The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management) aims at simplifying the application of EU water-related Directives by developing an open source, public domain, GIS-integrated platform for planning and management of ground- and surface-water resources. The FREEWAT platform is conceived as a canvas where several distributed and physically-based simulation codes are virtually integrated. The choice of such codes was supported by the results of a survey performed by means of questionnaires distributed to 14 case-study FREEWAT project partners and several stakeholders. This was performed in the first phase of the project within WP 6 (Enhanced science and participatory approach evidence-based decision making), Task 6.1 (Definition of a "needs/tools" evaluation grid). About 30% of the invited entities and institutions from several EU and non-EU countries expressed interest in contributing to the survey. Most of them were research institutions, government and geoenvironmental companies, and river basin authorities. The results of the questionnaire provided a spectrum of needs and priorities of partners/stakeholders, which were addressed during the development phase of the FREEWAT platform. The main needs identified were related to ground- and surface-water quality, sustainable water management, interaction between groundwater and surface-water bodies, and design and management of Managed Aquifer Recharge schemes. Needs and priorities were then connected to the specific EU Directives and Regulations to be addressed. One of the main goals of the questionnaires was to collect information and suggestions regarding the use of existing commercial/open-source software tools to address needs and priorities, and regarding the need to address specific water-related processes and problems.

  6. MPPhys—A many-particle simulation package for computational physics education

    NASA Astrophysics Data System (ADS)

    Müller, Thomas

    2014-03-01

    In a first course on classical mechanics, elementary physical processes like elastic two-body collisions, the mass-spring model, or the gravitational two-body problem are discussed in detail. The continuation to many-body systems, however, is deferred to graduate courses, although the underlying equations of motion are essentially the same and although there is strong motivation, for high-school students in particular, because of the use of particle systems in computer games. The missing link between the simple and the more complex problem is a basic introduction to solving the equations of motion numerically, which could be illustrated by means of the Euler method. The many-particle physics simulation package MPPhys offers a platform to experiment with simple particle simulations. The aim is to give a basic idea of how to implement many-particle simulations and how simulation and visualization can be combined for interactive visual explorations.
    Catalogue identifier: AERR_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERR_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 111327
    No. of bytes in distributed program, including test data, etc.: 608411
    Distribution format: tar.gz
    Programming language: C++, OpenGL, GLSL, OpenCL
    Computer: Linux and Windows platforms with OpenGL support
    Operating system: Linux and Windows
    RAM: source code 4.5 MB; complete package 242 MB
    Classification: 14, 16.9
    External routines: OpenGL, OpenCL
    Nature of problem: Integrate N-body simulations, mass-spring models
    Solution method: Numerical integration of N-body simulations, 3D rendering via OpenGL
    Running time: Problem dependent
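
    The "missing link" named in the abstract, numerical integration via the Euler method, fits in a few lines; a minimal Python sketch (MPPhys itself is C++/OpenCL) for a single mass-spring oscillator:

        # Explicit Euler integration of m*x'' = -k*x, split into two first-order ODEs.
        m, k = 1.0, 4.0          # mass [kg], spring constant [N/m]
        x, v = 1.0, 0.0          # initial displacement and velocity
        dt, steps = 0.01, 1000

        for n in range(steps):
            a = -k / m * x       # acceleration from Hooke's law
            x += dt * v          # Euler update: position uses the old velocity
            v += dt * a          # velocity uses the acceleration at the old position
            if n % 100 == 0:
                print(f"t={n*dt:5.2f}  x={x:+.4f}  v={v:+.4f}")

        # Note: explicit Euler slowly gains energy on oscillators; a symplectic
        # variant (update v first, then x with the new v) stays stable.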

  7. Open Access, Open Source and Digital Libraries: A Current Trend in University Libraries around the World

    ERIC Educational Resources Information Center

    Krishnamurthy, M.

    2008-01-01

    Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…

  8. Zephyr: Open-source Parallel Seismic Waveform Inversion in an Integrated Python-based Framework

    NASA Astrophysics Data System (ADS)

    Smithyman, B. R.; Pratt, R. G.; Hadden, S. M.

    2015-12-01

    Seismic Full-Waveform Inversion (FWI) is an advanced method to reconstruct wave properties of materials in the Earth from a series of seismic measurements. These methods have been developed by researchers since the late 1980s and now see significant interest from the seismic exploration industry. As researchers move towards implementing advanced numerical modelling (e.g., 3D, multi-component, anisotropic and visco-elastic physics), it is desirable to use a modular approach, minimizing the effort of developing a new set of tools for each new numerical problem. SimPEG (http://simpeg.xyz) is an open source project aimed at constructing a general framework to enable geophysical inversion in various domains. In this abstract we describe Zephyr (https://github.com/bsmithyman/zephyr), a coupled research project focused on parallel FWI in the seismic context. The software is built on top of Python, Numpy and IPython, which enables very flexible testing and implementation of new features. Zephyr is an open source project and is released freely to enable reproducible research. We currently implement a parallel, distributed seismic forward modelling approach that solves the 2.5D (two-and-one-half-dimensional) viscoacoustic Helmholtz equation at a range of modelling frequencies, generating forward solutions for a given source behaviour and gradient solutions for a given set of observed data. Solutions are computed in a distributed manner on a set of heterogeneous workers. The researcher's frontend computer may be separated from the worker cluster by a network link, enabling full support for computation on remote clusters from individual workstations or laptops. The present codebase introduces a numerical discretization equivalent to that used by FULLWV, a well-known seismic FWI research codebase. This makes it straightforward to compare results from Zephyr directly with FULLWV. The flexibility introduced by the use of a Python programming environment makes extension of the codebase with new methods much more straightforward, enabling comparison and integration of new efforts with existing results.
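
    A toy forward model in the spirit of what such codes solve; this hedged scipy sketch discretizes a 1D viscoacoustic Helmholtz equation with second-order finite differences (Zephyr itself solves the 2.5D problem with a FULLWV-equivalent discretization, which this does not reproduce):

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n, h = 400, 5.0                  # grid points, spacing [m]
        c = np.full(n, 2000.0)           # velocity model [m/s]
        c[200:] = 2500.0                 # a single interface
        freq, Q = 10.0, 100.0            # frequency [Hz], attenuation factor

        # Complex velocity implements viscoacoustic damping: c -> c (1 + i/(2Q))
        omega = 2 * np.pi * freq
        k2 = (omega / (c * (1 + 1j / (2 * Q)))) ** 2

        # Helmholtz operator (d^2/dx^2 + k^2) with Dirichlet edges
        main = -2.0 / h**2 + k2
        off = np.full(n - 1, 1.0 / h**2)
        A = sp.diags([off, main, off], [-1, 0, 1], format="csc")

        # Point source at grid index 50
        b = np.zeros(n, dtype=complex)
        b[50] = 1.0 / h

        u = spla.spsolve(A, b)           # monochromatic wavefield
        print(np.abs(u[:10]))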

  9. Towards Standardized Patient Data Exchange: Integrating a FHIR Based API for the Open Medical Record System.

    PubMed

    Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul

    2015-01-01

    Interoperability is essential to address limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next-generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an open source medical record system widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application-specific API to a standards-based API. We describe efforts to design and implement a FHIR based API for the OpenMRS platform. Lessons learned from this effort were used to define long-term plans to transition from the legacy OpenMRS API to a FHIR based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.
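
    A sketch of what a standards-based read looks like once such an API exists; the server base URL and patient id are hypothetical, while the resource path and JSON shape follow the FHIR specification:

        import requests

        BASE = "https://demo.example.org/openmrs/ws/fhir2/R4"  # hypothetical server

        # FHIR defines a uniform read interaction: GET [base]/Patient/[id]
        resp = requests.get(f"{BASE}/Patient/1234",
                            headers={"Accept": "application/fhir+json"},
                            timeout=30)
        patient = resp.json()

        # Standardized resource fields replace a bespoke application API
        name = patient["name"][0]
        print(patient["resourceType"], name.get("family"), name.get("given"))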

  10. SU-G-IeP4-04: DD-Neutron Source Collimation for Neutron Stimulated Emission Computed Tomography: A Monte Carlo Simulation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fong, G; Kapadia, A

    Purpose: To optimize collimation and shielding for a deuterium-deuterium (DD) neutron generator for an inexpensive and compact clinical neutron imaging system. The envisioned application is cancer diagnosis through Neutron Stimulated Emission Computed Tomography (NSECT). Methods: Collimator designs were tested with an isotropic 2.5 MeV neutron source through GEANT4 simulations. The collimator is a 52×52×52 cm³ polyethylene block coupled with a 1 cm lead sheet in sequence. A composite opening was modeled into the collimator to permit passage of neutrons. The opening varied in shape (cylindrical vs. tapered), size (1-5 cm source-side and target-side openings) and aperture placement (13-39 cm from the source side). The spatial and energy distributions of neutrons and gammas were tracked for each collimator design. Parameters analyzed were primary beam width (FWHM), divergence, and efficiency (percent transmission) for different configurations of the collimator. Selected outputs were then used for simulated NSECT imaging of a virtual breast phantom containing a 2.5 cm diameter tumor to assess the effect of the collimator on spatial resolution, noise, and scan time. Finally, a composite shielding enclosure made of polyethylene and lead was designed and evaluated to block 99.99% of the neutron and gamma radiation generated in the system. Results: Analysis of the primary beam indicated that the beam width is linear in the aperture size. Increasing the source-side opening allowed at least 20% more neutron throughput for all designs relative to the cylindrical openings; the maximum throughput was 364% relative to the cylindrical openings. Conclusion: The work indicates potential for collimating and shielding a DD neutron generator for use in a clinical NSECT system. The proposed collimator designs produced a well-defined collimated neutron beam that can be used to image samples of interest with millimeter resolution. The balance between output efficiency, noise reduction, and scan time should be considered to determine the optimal design for specific NSECT applications.

  11. New Open-Source Version of FLORIS Released | News | NREL

    Science.gov Websites

    New Open-Source Version of FLORIS Released (January 26, 2018). National Renewable Energy Laboratory (NREL) researchers recently released an updated open-source version of FLORIS, simplified and documented. Because of the living, open-source nature of the newly updated utility, NREL…

  12. Innovative Ways of Visualising Meta Data in 4D Using Open Source Libraries

    NASA Astrophysics Data System (ADS)

    Balhar, Jakub; Valach, Pavel; Veselka, Jonas; Voumard, Yann

    2016-08-01

    More and more data are being measured by different Earth observation satellites around the world. The ever-increasing amount of these data presents new challenges and opportunities for their visualization. In this paper we propose how to visualize the amount, distribution and structure of the data in a transparent way that takes the time dimension into account as well. Our approach provides a global overview as well as detailed regional information about the distribution of products from EO missions. We focus on introducing our mobile-friendly and easy-to-use web mapping application for 4D visualization of the data. Apart from that, we also present the Java application that reads and processes the data from various data sources.

  13. Review of Supervisory Control and Data Acquisition (SCADA) Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reva Nickelson; Briam Johnson; Ken Barnes

    2004-01-01

    A review using open source information was performed to obtain data related to Supervisory Control and Data Acquisition (SCADA) systems used to supervise and control domestic electric power generation, transmission, and distribution. This report provides the technical details for the types of systems used, system disposal, cyber and physical security measures, network connections, and a gap analysis of SCADA security holes.

  14. Flexible configuration-interaction shell-model many-body solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Calvin W.; Ormand, W. Erich; McElvain, Kenneth S.

    BIGSTICK is a flexible configuration-interaction open-source shell-model code for the many-fermion problem in a shell-model (occupation representation) framework. BIGSTICK can generate energy spectra, static and transition one-body densities, and expectation values of scalar operators. Using the built-in Lanczos algorithm, one can compute transition probability distributions and decompose wave functions into components defined by group theory.
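
    A compact numpy sketch of the Lanczos algorithm the abstract mentions, applied to a small random symmetric stand-in for a shell-model Hamiltonian; production codes like BIGSTICK work matrix-free and handle loss of orthogonality, both of which are omitted here:

        import numpy as np

        rng = np.random.default_rng(0)

        # Small symmetric stand-in for a (normally huge) Hamiltonian matrix
        n = 200
        H = rng.standard_normal((n, n))
        H = (H + H.T) / 2

        def lanczos(H, m=30):
            """Project symmetric H onto an m-dimensional Krylov space,
            returning the m x m tridiagonal matrix T."""
            v = rng.standard_normal(H.shape[0])
            v /= np.linalg.norm(v)
            alpha, beta = [], []
            w = H @ v
            for j in range(m):
                alpha.append(v @ w)
                w = w - alpha[-1] * v
                if j + 1 == m:
                    break
                b = np.linalg.norm(w)
                beta.append(b)
                v_new = w / b
                w = H @ v_new - b * v   # three-term recurrence
                v = v_new
            return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

        T = lanczos(H, m=30)
        # Extremal eigenvalues of T (Ritz values) approximate those of H
        print(sorted(np.linalg.eigvalsh(T))[:3])
        print(sorted(np.linalg.eigvalsh(H))[:3])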

  15. How old is streamwater? Open questions in catchment transit time conceptualization, modeling and analysis

    Treesearch

    J.J. McDonnell; K. McGuire; P. Aggarwal; K.J. Beven; D. Biondi; G. Destouni; S. Dunn; A. James; J. Kirchner; P. Kraft; S. Lyon; P. Maloszewski; B. Newman; L. Pfister; A. Rinaldo; A. Rodhe; T. Sayama; J. Seibert; K. Solomon; C. Soulsby; M. Stewart; D. Tetzlaff; C. Tobin; P. Troch; M. Weiler; A. Western; A. Wörman; S. Wrede

    2010-01-01

    The time water spends travelling subsurface through a catchment to the stream network (i.e. the catchment water transit time) fundamentally describes the storage, flow pathway heterogeneity and sources of water in a catchment. The distribution of transit times reflects how catchments retain and release water and solutes that in turn set biogeochemical conditions and...

  16. Opencast Matterhorn: A Community-Driven Open Source Software Project for Producing, Managing, and Distributing Academic Video

    ERIC Educational Resources Information Center

    Ketterl, Markus; Schulte, Olaf A.; Hochman, Adam

    2010-01-01

    Purpose: The purpose of this paper is to introduce the Opencast Community, a global community of individuals, institutions, and commercial stakeholders exchanging knowledge about all matters relevant in the context of academic video and promoting projects in this context. It also gives an overview of the most prominent of these projects, Opencast…

  17. Development and evaluation of an open-source, low-cost distributed sensor network for environmental monitoring applications

    NASA Astrophysics Data System (ADS)

    Gunawardena, N.; Pardyjak, E. R.; Stoll, R.; Khadka, A.

    2018-02-01

    Over the last decade there has been a proliferation of low-cost sensor networks that enable highly distributed sensor deployments in environmental applications. The technology is easily accessible and rapidly advancing due to the use of open-source microcontrollers. While this trend is extremely exciting, and the technology provides unprecedented spatial coverage, these sensors and associated microcontroller systems have not been well evaluated in the literature. Given the large number of new deployments and proposed research efforts using these technologies, it is necessary to quantify the overall instrument and microcontroller performance for specific applications. In this paper, an Arduino-based weather station system is presented in detail. These low-cost energy-budget measurement stations, or LEMS, have now been deployed for continuous measurements as part of several different field campaigns, which are described herein. The LEMS are low-cost, flexible, and simple to maintain. In addition to presenting the technical details of the LEMS, its errors are quantified in laboratory and field settings. A simple artificial neural network-based radiation-error correction scheme is also presented. Finally, challenges and possible improvements to microcontroller-based atmospheric sensing systems are discussed.
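
    A hedged sketch of a radiation-error correction of the general kind described; the features, network size, synthetic data and the use of scikit-learn are all assumptions made for illustration, not the authors' scheme:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)

        # Synthetic training set: solar radiation [W/m^2], wind speed [m/s],
        # and the temperature error of an unaspirated shield vs. a reference.
        n = 2000
        solar = rng.uniform(0, 1000, n)
        wind = rng.uniform(0, 10, n)
        # Invented ground truth: error grows with sun, shrinks with ventilation
        error = 0.003 * solar / (1.0 + wind) + rng.normal(0, 0.1, n)

        X = np.column_stack([solar, wind])
        X_tr, X_te, y_tr, y_te = train_test_split(X, error, random_state=0)

        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                             random_state=0)
        model.fit(X_tr, y_tr)
        print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))

        # Corrected reading = raw temperature - predicted radiation-induced error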

  18. Panorama: A Targeted Proteomics Knowledge Base

    PubMed Central

    2015-01-01

    Panorama is a web application for storing, sharing, analyzing, and reusing targeted assays created and refined with Skyline, an increasingly popular Windows client software tool for targeted proteomics experiments. Panorama allows laboratories to store and organize curated results contained in Skyline documents with fine-grained permissions, which facilitates distributed collaboration and secure sharing of published and unpublished data via a web-browser interface. It is fully integrated with the Skyline workflow and supports publishing a document directly to a Panorama server from the Skyline user interface. Panorama captures the complete Skyline document information content in a relational database schema. Curated results published to Panorama can be aggregated and exported as chromatogram libraries. These libraries can be used in Skyline to pick optimal targets in new experiments and to validate peak identification of target peptides. Panorama is open-source and freely available. It is distributed as part of LabKey Server, an open source biomedical research data management system. Laboratories and organizations can set up Panorama locally by downloading and installing the software on their own servers. They can also request freely hosted projects on https://panoramaweb.org, a Panorama server maintained by the Department of Genome Sciences at the University of Washington. PMID:25102069

  19. An Open Source Software and Web-GIS Based Platform for Airborne SAR Remote Sensing Data Management, Distribution and Sharing

    NASA Astrophysics Data System (ADS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; Ming, Liu

    2014-03-01

    With more and more Earth observation data available to the community, how to manage and share these valuable remote sensing datasets is becoming an urgent issue. Web-based geographical information system (GIS) technology provides a convenient way for users in different locations to share and make use of the same dataset. In order to efficiently use the airborne Synthetic Aperture Radar (SAR) remote sensing data acquired by the Airborne Remote Sensing Center of the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), a Web-GIS based platform for airborne SAR data management, distribution and sharing was designed and developed. The major features of the system include a map-based navigation search interface, full-resolution imagery shown overlaid on the map, and exclusive use of Open Source Software (OSS). The functions of the platform include browsing imagery on the map-based navigation interface, ordering and downloading data online, and image dataset and user management. At present, the system is under testing at RADI and will enter regular operation soon.

  20. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address current and future resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer-platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex-terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and non-governmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source, geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.

  1. Uncertainty Quantification Techniques for Population Density Estimates Derived from Sparse Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Robert N; White, Devin A; Urban, Marie L

    2013-01-01

    The Population Density Tables (PDT) project at the Oak Ridge National Laboratory (www.ornl.gov) is developing population density estimates for specific human activities under normal patterns of life, based largely on information available in open source. Currently, activity-based density estimates are based on simple summary statistics such as range and mean. Researchers are interested in improving activity estimation and uncertainty quantification by adopting a Bayesian framework that considers both data and sociocultural knowledge. Under a Bayesian approach, knowledge about population density may be encoded through the process of expert elicitation. Because the PDT effort covers over 250 countries, spans 40 human activity categories, and includes numerous contributors, an elicitation tool is required that can be operationalized within an enterprise data collection and reporting system. Such a method would ideally require minimal statistical knowledge from the contributor, minimal input by a statistician or facilitator, consideration of human difficulties in expressing qualitative knowledge in a quantitative setting, and methods by which the contributor can appraise whether their understanding and associated uncertainty were well captured. This paper introduces an algorithm that transforms answers to simple, non-statistical questions into a bivariate Gaussian distribution as the prior for the Beta distribution. Based on geometric properties of the Beta distribution parameter feasibility space and the bivariate Gaussian distribution, an automated encoding method is developed that responds to these challenging enterprise requirements. Though created within the context of population density, this approach may be applicable to a wide array of problem domains requiring informative priors for the Beta distribution.
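
    The paper's bivariate-Gaussian encoding is more elaborate, but the basic step of turning an elicited best guess and uncertainty into Beta parameters can be shown with textbook moment matching (a standard technique, not the authors' algorithm):

        def beta_from_mean_sd(mean, sd):
            """Moment-match a Beta(a, b) to an elicited mean and standard deviation.

            Feasibility requires 0 < mean < 1 and sd^2 < mean * (1 - mean),
            mirroring the parameter-feasibility region the paper discusses.
            """
            var = sd ** 2
            if not (0.0 < mean < 1.0) or var >= mean * (1.0 - mean):
                raise ValueError("elicited moments outside the Beta-feasible region")
            common = mean * (1.0 - mean) / var - 1.0
            return mean * common, (1.0 - mean) * common

        # E.g. "about 20% of people are at work, give or take 5 points"
        a, b = beta_from_mean_sd(0.20, 0.05)
        print(f"Beta(a={a:.2f}, b={b:.2f})")   # Beta(a=12.60, b=50.40)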

  2. ADMS State of the Industry and Gap Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agalgaonkar, Yashodhan P.; Marinovici, Maria C.; Vadari, Subramanian V.

    2016-03-31

    An advanced distribution management system (ADMS) is a platform for optimized distribution system operational management. This platform comprises distribution management system (DMS) applications, supervisory control and data acquisition (SCADA), an outage management system (OMS), and a distributed energy resource management system (DERMS). One of the primary objectives of this work is to study and analyze several ADMS component and auxiliary systems. All the important component and auxiliary systems, SCADA, GIS, DMS, AMR/AMI, OMS, and DERMS, are discussed in this report. Their current-generation technologies are analyzed, and their integration with (or evolution toward) an ADMS technology is discussed. An ADMS technology state-of-the-art and gap analysis is also presented. Two technical gaps are observed. The integration challenge between the component operational systems is the single largest challenge for ADMS design and deployment. Another significant challenge concerns essential ADMS applications, for instance fault location, isolation, and service restoration (FLISR) and volt-var optimization (VVO): there are relatively few ADMS application developers because the ADMS software platforms are not open source. A third gap, while not technical in nature compared with the two above, is still important to consider: the data models currently residing in utility GIS systems are incomplete, inaccurate, or both. These data are essential for planning and operations because they are typically one of the primary sources from which power system models are created. To achieve the full potential of an ADMS, the ability to compute accurate power flow solutions is an important prerequisite. These critical gaps are hindering wider utility adoption of ADMS technology. The development of an open architecture platform can eliminate many of these barriers and also aid seamless integration of distribution utility legacy systems with an ADMS.

  3. The successes and challenges of open-source biopharmaceutical innovation.

    PubMed

    Allarakhia, Minna

    2014-05-01

    Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human-capacity-development challenges associated with skill development around big data usage; and the data and material access challenge associated with access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. It is the author's opinion that assessing when and how value creation will occur through open-source biopharmaceutical innovation is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of today's big-data-based open-source initiatives. A continued focus on early-stage value creation alone is not advisable; instead, stakeholders should transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.

  4. 75 FR 72686 - Express Mail Open and Distribute and Priority Mail Open and Distribute

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... POSTAL SERVICE 39 CFR Part 111 Express Mail Open and Distribute and Priority Mail Open and... ``DB'' prefix along with new Tag 257, Tag 267, or Label 257S, on all Express Mail® Open and Distribute containers. The Postal Service is also revising the service commitment for Express Mail Open and...

  5. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook), co-developed by Kitware and NASA Ames, is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets, and supports features such as Point, Line, and Polygon as well as advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas, wrapped in a unified API provided by Gaia (https://github.com/OpenDataAnalytics/gaia). In this presentation, we will discuss the core features of each of these tools and present lessons learned on handling large data in the context of data management, analyses and visualization.

  6. The Simple Concurrent Online Processing System (SCOPS) - An open-source interface for remotely sensed data processing

    NASA Astrophysics Data System (ADS)

    Warren, M. A.; Goult, S.; Clewley, D.

    2018-06-01

    Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to inform government decision making and to answer a number of research- and application-driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. The software required to process the data is often varied, and can be too technical for a novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain and allows users to submit jobs and process data remotely over a web interface. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free and open-source tools, taking radiometrically corrected data from sensor geometry into geocorrected form and generating simple or complex band-ratio products. The final processed data products are retrieved via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network through a simple-to-use web interface.
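
    The submit-then-download workflow could be driven by a client along the following lines. This is a minimal sketch under assumed conventions: the endpoint URLs and JSON fields are hypothetical, not SCOPS's actual API.

    ```python
    # Illustrative sketch: submit a job to a SCOPS-like web processing service,
    # poll its status, then fetch the product over HTTP.
    import time
    import requests

    BASE = "https://scops.example.org/api"  # hypothetical server

    job = requests.post(f"{BASE}/jobs", json={
        "level1_file": "f123051b.lev1",       # hypothetical input name
        "products": ["geocorrected", "ndvi"],
    }).json()

    while True:
        status = requests.get(f"{BASE}/jobs/{job['id']}").json()["status"]
        if status in ("done", "failed"):
            break
        time.sleep(30)

    if status == "done":
        with requests.get(f"{BASE}/jobs/{job['id']}/download", stream=True) as r:
            with open("products.zip", "wb") as out:
                for chunk in r.iter_content(chunk_size=1 << 20):
                    out.write(chunk)
    ```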

  7. A novel method for automated tracking and quantification of adult zebrafish behaviour during anxiety.

    PubMed

    Nema, Shubham; Hasan, Whidul; Bhargava, Anamika; Bhargava, Yogesh

    2016-09-15

    Behavioural neuroscience relies on software-driven methods for behavioural assessment, but the field lacks cost-effective, robust, open-source software for behavioural analysis. Here we propose a novel method, which we call ZebraTrack. It comprises a cost-effective imaging setup for distraction-free behavioural acquisition, automated tracking using the open-source ImageJ software, and a workflow for the extraction of behavioural endpoints. Our ImageJ algorithm gives users control at key steps while maintaining automated tracking, without requiring the installation of external plugins. We validated the method by testing novelty-induced anxiety behaviour in adult zebrafish. In agreement with established findings, our results show that during state anxiety zebrafish exhibit reduced distance travelled, increased thigmotaxis, and more freezing events. Furthermore, we propose a representation of both the spatial and the temporal distribution of choice-based behaviour, which simple videograms currently cannot convey. The ZebraTrack method is simple and economical, yet robust enough to give results comparable with those obtained from costly proprietary software such as EthoVision XT. In summary, we have developed and validated a novel, cost-effective method for the behavioural analysis of adult zebrafish using open-source ImageJ software. Copyright © 2016 Elsevier B.V. All rights reserved.
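
    The generic idea behind such tracking (background subtraction, thresholding, centroid extraction, then endpoint computation) can be sketched as follows. This illustrates the technique only; ZebraTrack itself runs inside ImageJ, and this is not its code.

    ```python
    # Illustrative sketch: threshold-and-centroid tracking plus one endpoint.
    import numpy as np

    def track_centroids(frames, background):
        """frames: iterable of 2-D grayscale arrays; background: 2-D array."""
        trajectory = []
        for frame in frames:
            diff = np.abs(frame.astype(float) - background.astype(float))
            mask = diff > 3 * diff.std()           # crude adaptive threshold
            ys, xs = np.nonzero(mask)
            if len(xs):                            # animal found in this frame
                trajectory.append((xs.mean(), ys.mean()))
            else:                                  # reuse last known position
                trajectory.append(trajectory[-1] if trajectory else (np.nan, np.nan))
        return np.array(trajectory)

    def total_distance(traj):
        """Sum of frame-to-frame displacements, a typical behavioural endpoint."""
        steps = np.diff(traj, axis=0)
        return np.nansum(np.hypot(steps[:, 0], steps[:, 1]))
    ```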

  8. Open for Business

    ERIC Educational Resources Information Center

    Voyles, Bennett

    2007-01-01

    People know about the Sakai Project (open source course management system); they may even know about Kuali (open source financials). So, what is the next wave in open source software? This article discusses business intelligence (BI) systems. Though open source BI may still be only a rumor in most campus IT departments, some brave early adopters…

  9. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca

    2013-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  10. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cinquini, Luca; Crichton, Daniel; Miller, Neill

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  11. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data

    NASA Technical Reports Server (NTRS)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark; et al.

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
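
    The shared search API mentioned in these records is exposed as a RESTful endpoint on each node. A minimal sketch of a federated query follows; the parameter names follow the public esg-search conventions, but treat the node URL and facet values as assumptions rather than a definitive recipe.

    ```python
    # Illustrative sketch: query ESGF's federated search REST API.
    import requests

    node = "https://esgf-node.llnl.gov/esg-search/search"
    params = {
        "project": "CMIP5",
        "variable": "tas",              # near-surface air temperature
        "distrib": "true",              # search the whole federation, not one node
        "format": "application/solr+json",
        "limit": 5,
    }
    docs = requests.get(node, params=params).json()["response"]["docs"]
    for d in docs:
        print(d["id"])                  # matching dataset identifiers
    ```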

  12. A Semantic Web Management Model for Integrative Biomedical Informatics

    PubMed Central

    Deus, Helena F.; Stanislaus, Romesh; Veiga, Diogo F.; Behrens, Carmen; Wistuba, Ignacio I.; Minna, John D.; Garner, Harold R.; Swisher, Stephen G.; Roth, Jack A.; Correa, Arlene M.; Broom, Bradley; Coombes, Kevin; Chang, Allen; Vogel, Lynn H.; Almeida, Jonas S.

    2008-01-01

    Background: Data, data everywhere. The diversity and magnitude of the data generated in the life sciences defy automated articulation among complementary efforts. The additional need in this field to manage property and access permissions compounds the difficulty significantly. This is particularly the case when integration spans multiple domains and disciplines, and even more so when it includes clinical and high-throughput molecular data. Methodology/Principal Findings: The emergence of Semantic Web technologies brings the promise of meaningful interoperation between data and analysis resources. In this report we identify a core model for biomedical knowledge-engineering applications and demonstrate how this new technology can be used to weave a management model in which multiple intertwined data structures are hosted and managed by multiple authorities in a distributed management infrastructure. Specifically, the demonstration links data sources associated with the Lung Cancer SPORE awarded to The University of Texas MD Anderson Cancer Center at Houston and the Southwestern Medical Center at Dallas. A software prototype, available as open source at www.s3db.org, was developed, and its design has been made publicly available as an open-source instrument for shared, distributed data management. Conclusions/Significance: Semantic Web technologies have the potential to address the need for distributed and evolvable representations that are critical for systems biology and translational biomedical research. As this technology is incorporated into application development, we can expect both general-purpose productivity software and domain-specific software installed on our personal computers to become increasingly integrated with the relevant remote resources. In this scenario, the acquisition of a new dataset would automatically trigger the delegation of its analysis. PMID:18698353
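
    The underlying Semantic Web idea (linked records as RDF triples that multiple authorities can host, merge, and query) can be sketched with the rdflib library. The names and URIs below are hypothetical, for illustration only; they are not the S3DB schema.

    ```python
    # Illustrative sketch: biomedical records as RDF triples plus a SPARQL query.
    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/lungspore/")  # hypothetical namespace
    g = Graph()

    g.add((EX.sample42, RDF.type, EX.TumorSample))
    g.add((EX.sample42, EX.histology, Literal("adenocarcinoma")))
    g.add((EX.sample42, EX.fromPatient, EX.patient7))
    g.add((EX.patient7, EX.enrolledAt, EX.MDAnderson))

    # A SPARQL query that would span whatever graphs have been merged:
    q = "SELECT ?s WHERE { ?s a <http://example.org/lungspore/TumorSample> }"
    for row in g.query(q):
        print(row.s)
    ```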

  13. Numerical analysis of the flow field in a sloshing tank with a horizontal perforated plate

    NASA Astrophysics Data System (ADS)

    Jin, Heng; Liu, Yong; Li, Huajun; Fu, Qiang

    2017-08-01

    Liquid sloshing is a type of free-surface flow inside a partially filled tank. Sloshing significantly affects the safety of liquid transport systems; in particular, it may cause large hydrodynamic loads when the frequency of the tank motion is close to a natural frequency of the tank. Perforated plates have recently been used to suppress the violent movement of liquid in a sloshing tank under resonant conditions. In this study, a numerical model based on OpenFOAM (Open Source Field Operation and Manipulation), an open-source computational fluid dynamics code, is used to investigate resonant sloshing in a swaying tank with a submerged horizontal perforated plate. The numerical results for the free-surface elevations are first verified against experimental data; the flow characteristics around the perforated plate and the fluid velocity distribution in the entire tank are then examined through numerical examples. The results clearly show differences between sloshing motions at the first-order and third-order resonant frequencies. This study provides a better understanding of the energy dissipation mechanism of a horizontal perforated plate in a swaying tank.
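
    For context, the resonant frequencies referred to above follow, in linear potential-flow theory, the textbook dispersion relation for a rectangular tank (a standard result, stated here for completeness, not taken from the paper):

    $$\omega_n^2 = g\,k_n \tanh(k_n h), \qquad k_n = \frac{n\pi}{L}, \quad n = 1, 2, 3, \ldots$$

    where L is the tank length, h the liquid fill depth, and g the gravitational acceleration; the first-order and third-order resonances correspond to n = 1 and n = 3.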

  14. Experimental verification of isotropic radiation from a coherent dipole source via electric-field-driven LC resonator metamaterials.

    PubMed

    Tichit, Paul-Henri; Burokur, Shah Nawaz; Qiu, Cheng-Wei; de Lustrac, André

    2013-09-27

    It has long been conjectured that isotropic radiation from a simple coherent source is impossible due to changes in polarization. Though hypothetical, the isotropic source is usually taken as the reference for determining a radiator's gain and directivity. Here, we demonstrate both theoretically and experimentally that an isotropic radiator can be made of a simple, finite source surrounded by electric-field-driven LC resonator metamaterials designed by space manipulation. As a proof-of-concept demonstration, we show the first isotropic source with omnidirectional radiation from a dipole source (applicable to all distributed sources), which can open up possibilities in axion electrodynamics, optical illusion, novel transformation-optics devices, wireless communication, and antenna engineering. Owing to the electric-field-driven LC resonator realization scheme, this principle can readily be applied to higher frequency regimes where magnetism is usually absent.

  15. The Commercial Open Source Business Model

    NASA Astrophysics Data System (ADS)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: how can a firm earn money if it makes its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than is possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  16. Collecting a better water-quality sample: Reducing vertical stratification bias in open and closed channels

    USGS Publications Warehouse

    Selbig, William R.

    2017-01-01

    Collection of water-quality samples that accurately characterize average particle concentrations and distributions in channels can be complicated by large sources of variability. The U.S. Geological Survey (USGS) developed a fully automated Depth-Integrated Sample Arm (DISA) as a way to reduce bias and improve accuracy in water-quality concentration data. The DISA was designed to integrate with the autosampler configurations commonly used for collecting water-quality samples in vertical profile, thereby providing a better representation of average suspended-sediment and sediment-associated pollutant concentrations and distributions than traditional fixed-point samplers. In controlled laboratory experiments, known concentrations of suspended sediment ranging from 596 to 1,189 mg/L were injected into a 3-foot-diameter closed channel (circular pipe) with regulated flows ranging from 1.4 to 27.8 ft³/s. Median suspended-sediment concentrations in water-quality samples collected using the DISA were within 7 percent of the known, injected value, compared with 96 percent for traditional fixed-point samplers. Field evaluation of this technology in open-channel fluvial systems showed median differences between paired DISA and fixed-point samples to be within 3 percent. The particle sizes measured in the open channel were generally those of clay and silt. Differences between the concentrations and distributions measured by the two sampler configurations could be much larger in open channels that transport larger particles, such as sand.

  17. Psynteract: A flexible, cross-platform, open framework for interactive experiments.

    PubMed

    Henninger, Felix; Kieslich, Pascal J; Hilbig, Benjamin E

    2017-10-01

    We introduce a novel platform for interactive studies, that is, any form of study in which participants' experiences depend not only on their own responses, but also on those of other participants who complete the same study in parallel, for example, a prisoner's dilemma or an ultimatum game. The software thus especially serves the rapidly growing field of strategic interaction research within psychology and behavioral economics. In contrast to all available software packages, our platform does not handle stimulus display and response collection itself. Instead, we provide a mechanism to extend existing experimental software to incorporate interactive functionality. This approach allows us to draw upon the capabilities already available, such as accuracy of temporal measurement, integration with auxiliary hardware such as eye-trackers or (neuro-)physiological apparatus, and recent advances in experimental software, for example capturing response dynamics through mouse-tracking. Through integration with OpenSesame, an open-source graphical experiment builder, studies can be assembled via a drag-and-drop interface requiring little or no further programming skills. In addition, by using the same communication mechanism across software packages, we also enable interoperability between systems. Our source code, which provides support for all major operating systems and several popular experimental packages, can be freely used and distributed under an open source license. The communication protocols underlying its functionality are also well documented and easily adapted to further platforms. Code and documentation are available at https://github.com/psynteract/.

  18. Development of indirect EFBEM for radiating noise analysis including underwater problems

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Wung; Hong, Suk-Yoon; Song, Jee-Hun

    2013-09-01

    For the analysis of radiating noise problems in medium-to-high frequency ranges, the Energy Flow Boundary Element Method (EFBEM) was developed. EFBEM is an analysis technique that applies the Boundary Element Method (BEM) to Energy Flow Analysis (EFA). Fundamental solutions representing the spherical-wave behaviour of radiating noise problems in the open field, and accounting for the free-surface effect underwater, are developed. A directivity factor is also developed to express wave directivity patterns in medium-to-high frequency ranges. Indirect EFBEM, using the fundamental solutions and fictitious sources, was successfully applied to open-field and underwater noise problems. In numerical applications, the acoustic energy density distributions due to the vibration of a simple plate model and a sphere model were compared with those of a commercial code, and the comparison showed good agreement in both the level and the pattern of the energy density distributions.

  19. Correction to: Association of adiposity with hemoglobin levels in patients with chronic kidney disease not on dialysis.

    PubMed

    Honda, Hirokazu; Ono, Kota; Akizawa, Tadao; Nitta, Kosaku; Hishida, Akira

    2018-02-06

    The article "Association of adiposity with hemoglobin levels in patients with chronic kidney disease not on dialysis", written by Hirokazu Honda, Kota Ono, Tadao Akizawa, Kosaku Nitta and Akira Hishida, was originally published electronically on the publisher's internet portal (currently SpringerLink) on November 4, 2017 without open access. With the authors' decision to opt for Open Choice, the copyright of the article changed on February 6, 2018 to © The Author(s) [2017], and the article is forthwith distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, duplication, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made. The original article has been corrected.

  20. Frapbot: An open-source application for FRAP data.

    PubMed

    Kohze, Robin; Dieteren, Cindy E J; Koopman, Werner J H; Brock, Roland; Schmidt, Samuel

    2017-08-01

    We introduce Frapbot, a free-of-charge, open-source web application written in R that provides manual and automated analysis of fluorescence recovery after photobleaching (FRAP) datasets. In automated operation, starting from data tables containing columns of time-dependent intensity values for various regions of interest within the images, a pattern recognition algorithm identifies the relevant columns, the presence or absence of prebleach values, and the time point of photobleaching. Raw data, residuals, normalization, and boxplots indicating the distribution of recovery half-times (t1/2) across all uploaded files are visualized instantly in a batch-wise manner, using a variety of user-definable fitting options. The fitted results are provided as a .zip file containing .csv-formatted output tables. Alternatively, the user can control any of the options described above manually. © 2017 International Society for Advancement of Cytometry.
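
    As background, FRAP fitting of this kind is often done with a single-exponential recovery model. The sketch below shows that technique in Python with SciPy; it is one common model choice, not Frapbot's source code (Frapbot itself is written in R and offers several fitting options).

    ```python
    # Illustrative sketch: single-exponential FRAP recovery fit.
    import numpy as np
    from scipy.optimize import curve_fit

    def recovery(t, A, k, c):
        """I(t) = c + A * (1 - exp(-k t)); half-time t_1/2 = ln(2) / k."""
        return c + A * (1.0 - np.exp(-k * t))

    t = np.linspace(0, 60, 121)                       # seconds after bleach
    I = recovery(t, 0.6, 0.15, 0.2) + np.random.normal(0, 0.02, t.size)

    (A, k, c), _ = curve_fit(recovery, t, I, p0=(0.5, 0.1, 0.1))
    # With intensities normalized to a prebleach value of 1, the mobile
    # fraction is (plateau - postbleach) / (1 - postbleach) = A / (1 - c).
    print(f"t_1/2 = {np.log(2)/k:.2f} s, mobile fraction ~ {A/(1-c):.2f}")
    ```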

  1. CHRONOS architecture: Experiences with an open-source services-oriented architecture for geoinformatics

    USGS Publications Warehouse

    Fils, D.; Cervato, C.; Reed, J.; Diver, P.; Tang, X.; Bohling, G.; Greer, D.

    2009-01-01

    CHRONOS's purpose is to transform Earth history research by seamlessly integrating stratigraphic databases and tools into a virtual on-line stratigraphic record. In this paper, we describe the components of CHRONOS's distributed data system, including the encoding of semantic and descriptive data in a service-based architecture. We give examples of how we have integrated well-tested resources from the open-source and geoinformatics communities, such as the GeoSciML schema and the Simple Knowledge Organization System (SKOS), into the services-oriented architecture to encode timescale and phylogenetic synonymy data. We also describe ongoing efforts to use geospatially enhanced data syndication and to include semantic information informally by embedding it directly into the XHTML Document Object Model (DOM), which allows machine-discoverable descriptive data, such as licensing and citation information, to be incorporated directly into data sets retrieved by users. © 2008 Elsevier Ltd. All rights reserved.

  2. PB-AM: An open-source, fully analytical linear Poisson-Boltzmann solver.

    PubMed

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

    We present the open-source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files suitable for visualization with Visual Molecular Dynamics; a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics; the ability to specify docking criteria; and two different kinetics schemes for evaluating biomolecular association rate constants. Because PB-AM describes mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore two- and three-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it accessible to the larger group of scientists, educators, and students who are familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.
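
    For reference, the linearized PB problem being solved has the standard textbook form (written here in Gaussian units; conventions vary, and this statement is background rather than a quotation from the paper): in the screened solvent region outside the cavities,

    $$\nabla^2 \phi(\mathbf{r}) = \kappa^2\,\phi(\mathbf{r}),$$

    while inside each low-dielectric spherical cavity containing fixed charge density \(\rho\),

    $$\nabla^2 \phi(\mathbf{r}) = -\frac{4\pi}{\epsilon_{\mathrm{in}}}\,\rho(\mathbf{r}),$$

    where \(\kappa\) is the inverse Debye screening length; PB-AM's analytical solution expands \(\phi\) in multipoles about each sphere.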

  3. Methods for the behavioral, educational, and social sciences: an R package.

    PubMed

    Kelley, Ken

    2007-11-01

    Methods for the Behavioral, Educational, and Social Sciences (MBESS; Kelley, 2007b) is an open source package for R (R Development Core Team, 2007b), an open source statistical programming language and environment. MBESS implements methods that are not widely available elsewhere yet are especially helpful for the idiosyncratic techniques used within the behavioral, educational, and social sciences. The major categories of functions are those that relate to confidence interval formation for noncentral t, F, and χ² parameters; confidence intervals for standardized effect sizes (which require noncentral distributions); and sample size planning from the power-analytic and accuracy-in-parameter-estimation perspectives. In addition, MBESS contains collections of other functions that should be helpful to substantive researchers and methodologists. MBESS is a long-term project that will continue to be updated and expanded so that important methods can continue to be made available to researchers in the behavioral, educational, and social sciences.
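
    MBESS itself is an R package; the sketch below shows, in Python with SciPy, the noncentral-t inversion technique that underlies such effect-size confidence intervals (a one-sample Cohen's d). It illustrates the method, not MBESS's own functions.

    ```python
    # Illustrative sketch: CI for Cohen's d by inverting the noncentral t CDF.
    import numpy as np
    from scipy.stats import nct
    from scipy.optimize import brentq

    def ci_cohens_d(t_obs, n, alpha=0.05):
        df = n - 1
        # Find noncentrality parameters that place t_obs at the alpha/2 tails.
        lo = brentq(lambda lam: nct.cdf(t_obs, df, lam) - (1 - alpha / 2), -50, 50)
        hi = brentq(lambda lam: nct.cdf(t_obs, df, lam) - alpha / 2, -50, 50)
        # For a one-sample design, delta = lambda / sqrt(n).
        return lo / np.sqrt(n), hi / np.sqrt(n)

    print(ci_cohens_d(t_obs=2.5, n=25))   # roughly (0.1, 0.9) around d = 0.5
    ```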

  4. RINGMesh: A programming library for developing mesh-based geomodeling applications

    NASA Astrophysics Data System (ADS)

    Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume

    2017-07-01

    RINGMesh is an open-source C++ programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing package. RINGMesh implements functionality to read discretized surface-based or volumetric structural models and to check their validity; the models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows new geomodeling methods to be developed and external software to be plugged in. The goal of RINGMesh is to help researchers focus on the implementation of their specific methods rather than on tedious tasks common to many applications. The documented code is open source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.

  5. NWChem: A comprehensive and scalable open-source solution for large scale molecular simulations

    NASA Astrophysics Data System (ADS)

    Valiev, M.; Bylaska, E. J.; Govind, N.; Kowalski, K.; Straatsma, T. P.; Van Dam, H. J. J.; Wang, D.; Nieplocha, J.; Apra, E.; Windus, T. L.; de Jong, W. A.

    2010-09-01

    The latest release of NWChem delivers an open-source computational chemistry package with extensive capabilities for large scale simulations of chemical and biological systems. Utilizing a common computational framework, diverse theoretical descriptions can be used to provide the best solution for a given scientific problem. Scalable parallel implementations and modular software design enable efficient utilization of current computational architectures. This paper provides an overview of NWChem focusing primarily on the core theoretical modules provided by the code and their parallel performance.
    Program summary
    Program title: NWChem
    Catalogue identifier: AEGI_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGI_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Open Source Educational Community License
    No. of lines in distributed program, including test data, etc.: 11,709,543
    No. of bytes in distributed program, including test data, etc.: 680,696,106
    Distribution format: tar.gz
    Programming language: Fortran 77, C
    Computer: all Linux-based workstations and parallel supercomputers, Windows and Apple machines
    Operating system: Linux, OS X, Windows
    Has the code been vectorised or parallelized?: Code is parallelized
    Classification: 2.1, 2.2, 3, 7.3, 7.7, 16.1, 16.2, 16.3, 16.10, 16.13
    Nature of problem: Large-scale atomistic simulations of chemical and biological systems require efficient and reliable methods for ground and excited solutions of the many-electron Hamiltonian, analysis of the potential energy surface, and dynamics.
    Solution method: Ground and excited solutions of the many-electron Hamiltonian are obtained utilizing density functional theory, many-body perturbation approaches, and coupled cluster expansions. These solutions, or a combination thereof with classical descriptions, are then used to analyze the potential energy surface and perform dynamical simulations.
    Additional comments: Full documentation is provided in the distribution file, including an INSTALL file giving details of how to build the package. A set of test runs is provided in the examples directory. The distribution file for this program is over 90 Mbytes and therefore is not delivered directly when download or e-mail is requested; instead an html file giving details of how the program can be obtained is sent.
    Running time: Running time depends on the size of the chemical system, the complexity of the method, the number of CPUs, and the computational task. It ranges from several seconds for serial DFT energy calculations on a few atoms to several hours for parallel coupled cluster energy calculations on tens of atoms or ab initio molecular dynamics simulations on hundreds of atoms.

  6. Generalisation of the identity method for determination of high-order moments of multiplicity distributions with a software implementation

    NASA Astrophysics Data System (ADS)

    Maćkowiak-Pawłowska, Maja; Przybyła, Piotr

    2018-05-01

    Incomplete particle identification limits the experimentally available phase-space region for identified-particle analysis. This problem affects ongoing fluctuation and correlation studies, including the search for the critical point of strongly interacting matter performed at the SPS and RHIC accelerators. In this paper we provide a procedure to obtain nth-order moments of the multiplicity distribution using the identity method, generalising previously published solutions for n=2 and n=3. Moreover, we present an open-source software implementation of this computation, called Idhim, that allows one to obtain the true moments of identified-particle multiplicity distributions from the measured ones, provided the response function of the detector is known.
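
    For reference, the nth-order moments in question are those of the multiplicity distribution P(N) (the standard definition, stated here for completeness, not a formula specific to the paper):

    $$\langle N^n \rangle = \sum_{N=0}^{\infty} N^n\, P(N),$$

    and the identity method recovers these true moments for identified particles from the moments of the misidentification-smeared measured distributions, given the detector response function.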

  7. Lenstronomy: Multi-purpose gravitational lens modeling software package

    NASA Astrophysics Data System (ADS)

    Birrer, Simon; Amara, Adam

    2018-04-01

    Lenstronomy is a multi-purpose open-source gravitational lens modeling Python package. Lenstronomy reconstructs the lens mass and surface brightness distributions of strong lensing systems using forward modelling, and supports a wide range of analytic lens and light models in arbitrary combination. The software can also reconstruct complex extended sources as well as point sources. Lenstronomy is flexible and numerically accurate, with a clear user interface that can be deployed across different platforms. Lenstronomy has been used to derive constraints on dark matter properties in strong lenses, measure the expansion history of the universe with time-delay cosmography, measure cosmic shear with Einstein rings, and decompose quasar and host galaxy light.
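
    The forward-modelling step rests on ray shooting through the lens equation, beta = theta - alpha(theta). The NumPy sketch below illustrates that idea for a singular isothermal sphere (SIS) deflector; it is a generic illustration of the technique, not lenstronomy's API.

    ```python
    # Illustrative sketch: forward ray shooting for a SIS lens.
    import numpy as np

    theta_E = 1.2                                   # Einstein radius, arcsec

    def alpha_sis(tx, ty):
        """SIS deflection: magnitude theta_E, directed radially."""
        r = np.hypot(tx, ty)
        r = np.where(r == 0, 1e-12, r)              # avoid division by zero
        return theta_E * tx / r, theta_E * ty / r

    # Grid of image-plane angles (arcsec):
    tx, ty = np.meshgrid(np.linspace(-3, 3, 256), np.linspace(-3, 3, 256))
    ax, ay = alpha_sis(tx, ty)
    bx, by = tx - ax, ty - ay                       # source-plane positions

    # Surface brightness is conserved along rays: evaluate a small Gaussian
    # source at the mapped positions to render the lensed image.
    lensed = np.exp(-((bx - 0.1) ** 2 + by ** 2) / (2 * 0.05 ** 2))
    ```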

  8. Openly Published Environmental Sensing (OPEnS) | Advancing Open-Source Research, Instrumentation, and Dissemination

    NASA Astrophysics Data System (ADS)

    Udell, C.; Selker, J. S.

    2017-12-01

    The increasing availability and functionality of open-source software and hardware, along with 3D printing, low-cost electronics, and the proliferation of open-access resources for learning rapid prototyping, are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite the reevaluation of time-tested methodologies and devices in favour of more efficient, reusable, and inexpensive alternatives. Building upon open-source design facilitates community engagement and invites a do-it-together (DIT) collaborative framework for research in which solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping, including the requisite technical skill sets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university maker space staffed by engineering students to assist researchers is one proposed solution to many of these obstacles. This presentation investigates the unique capabilities that the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, at Oregon State and internationally, and the unique functions such initiatives support at the intersection of maker spaces, open-source academic research, and open-access dissemination.

  9. Simulation with Python on transverse modes of the symmetric confocal resonator

    NASA Astrophysics Data System (ADS)

    Wang, Qing Hua; Qi, Jing; Ji, Yun Jing; Song, Yang; Li, Zhenhua

    2017-08-01

    Python is a popular open-source programming language that can be used to simulate various optical phenomena. We have developed a suite of programs to support the teaching of a course on laser principles. The complicated transverse modes of the symmetric confocal resonator can be visualized on personal computers, which helps students understand the pattern distribution of a laser resonator.
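
    A short program in the spirit of such teaching tools might plot the transverse intensity pattern of a Hermite-Gaussian TEM_mn mode, the mode family of the symmetric confocal resonator. This is a generic sketch of the physics, not the authors' code.

    ```python
    # Illustrative sketch: intensity of a Hermite-Gaussian TEM_mn mode.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.special import eval_hermite

    def tem_intensity(m, n, w=1.0, half=3.0, N=400):
        """|u_mn(x, y)|^2 with beam radius w, over a (2*half)-wide window."""
        x = np.linspace(-half, half, N)
        X, Y = np.meshgrid(x, x)
        u = (eval_hermite(m, np.sqrt(2) * X / w) *
             eval_hermite(n, np.sqrt(2) * Y / w) *
             np.exp(-(X**2 + Y**2) / w**2))
        return np.abs(u) ** 2

    plt.imshow(tem_intensity(2, 1), cmap="inferno", origin="lower")
    plt.title("TEM$_{21}$ intensity")
    plt.show()
    ```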

  10. Adjusting to Social Change - A Multi-Level Analysis in Three Cultures

    DTIC Science & Technology

    2013-08-01

    … presence is often associated with the large-scale movement of civilian populations, and who need to better understand the … valuing openness to change (self-direction, stimulation and sometimes hedonism values) with valuing conservation (conformity, tradition and security) …

  11. Advanced Shutter Control for a Molecular Beam Epitaxy Reactor

    DTIC Science & Technology

    An open-source hardware- and software-based shutter controller solution was developed that communicates over Ethernet with our original equipment manufacturer (OEM) molecular beam epitaxy (MBE) reactor control software. An Arduino Mega microcontroller is used as the brain of the shutter controller, while a custom-designed circuit board distributes 24-V power to each of the 16 shutter solenoids available on the MBE. Using Ethernet…

  12. The Distributed Air Wing

    DTIC Science & Technology

    2014-06-01

    … Cruise Missile; LCS: Littoral Combat Ship; LEO: Low Earth Orbit; LER: Loss-Exchange-Ratio; LHA: Landing Helicopter Assault; LIDAR: Laser Imaging Detection and Ranging; LOC: Lines of Communication; LP: Linear Programming; LRASM: Long Range Anti-Ship Missile; LT: Long Ton; MANA: Map-Aware Non-uniform Automata; ME… … enemy's spy satellites. Based on open-source information, China currently has 25 satellites operating in Low Earth Orbit (LEO); each operates at an…

  13. Development of efficient and cost-effective distributed hydrological modeling tool MWEasyDHM based on open-source MapWindow GIS

    NASA Astrophysics Data System (ADS)

    Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao

    2011-09-01

    Many regions of China are still threatened by frequent floods and water shortages. Consequently, reproducing and predicting the hydrological processes in watersheds is a hard but unavoidable task for reducing the risk of damage and loss, and it is necessary to develop an efficient and cost-effective hydrological tool for China, as many areas need to be modeled. Well-developed hydrological tools such as MIKE SHE and ArcSWAT (Soil and Water Assessment Tool based on ArcGIS) have shown significant power in improving the precision of hydrological modeling in China by considering the spatial variability of both land cover and soil type. However, adopting such commercial tools in a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that can make simulation difficult for the user, lowering the efficiency of the modeling process. Besides, commercial hydrological models usually cannot be modified or improved to suit some of the special hydrological conditions in China. Some other hydrological models are open source but are integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed on the basis of the open-source MapWindow GIS; its purpose is to establish the first open-source, GIS-based distributed hydrological model tool in China, integrating modules for preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool. The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of the fully open-source GIS software MapWindow, which contains basic GIS functions. The preprocessing module comprises three submodules: a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module supports parallel computation, real-time computation, and visualization. The postprocessing module includes model calibration and the spatial visualization of model results in tabular form and on spatial grids. MWEasyDHM makes efficient modeling and calibration of EasyDHM possible, and promises the further development of cost-effective applications in various watersheds.
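
    A core task of the DEM-based preprocessing submodule mentioned above is flow-direction analysis. The sketch below shows the classic D8 algorithm (steepest downslope neighbour) in plain Python; it illustrates the general technique, not MWEasyDHM's implementation.

    ```python
    # Illustrative sketch: D8 flow directions on a small DEM.
    import numpy as np

    def d8_directions(dem):
        """Return, per interior cell, the index 0..7 of the steepest downslope
        neighbour (E, NE, N, NW, W, SW, S, SE), or -1 for pits."""
        offsets = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
                   (0, -1), (1, -1), (1, 0), (1, 1)]
        dist = np.array([1, np.sqrt(2)] * 4)       # unit cell size assumed
        out = -np.ones(dem.shape, dtype=int)
        for i in range(1, dem.shape[0] - 1):
            for j in range(1, dem.shape[1] - 1):
                drops = np.array([(dem[i, j] - dem[i + di, j + dj]) / d
                                  for (di, dj), d in zip(offsets, dist)])
                if drops.max() > 0:                # otherwise the cell is a pit
                    out[i, j] = int(drops.argmax())
        return out

    dem = np.array([[5., 4, 3], [4, 3, 2], [3, 2, 1]])
    print(d8_directions(dem))   # centre cell drains toward the lowest corner
    ```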

  14. Open-source software: not quite endsville.

    PubMed

    Stahl, Matthew T

    2005-02-01

    Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apte, A; Veeraraghavan, H; Oh, J

    Purpose: To present an open-source, free platform to facilitate radiomics research: the "Radiomics toolbox" in CERR. Method: There is a scarcity of open-source tools that support end-to-end modeling of image features to predict patient outcomes. The Radiomics toolbox strives to fill the need for such a software platform. The platform supports (1) the import of various image modalities, including CT, PET, MR, SPECT, and US; (2) contouring tools to delineate structures of interest; (3) the extraction and storage of image-based features, such as first-order statistics, gray-level co-occurrence and zone-size matrix-based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and data management are implemented in Matlab for ease of development and readability of the code. Open-source software developed in other programming languages is integrated to enhance various components of the toolbox, for example, the Java-based DCM4CHE for DICOM import and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open-source software under a GNU license. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC; that analysis will be presented in a separate paper. Conclusion: The Radiomics toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed its name from the "Computational Environment for Radiotherapy Research" to the "Computational Environment for Radiological Research".
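
    As an illustration of the co-occurrence texture features mentioned above, here is the same kind of computation sketched with scikit-image in Python rather than CERR's Matlab code. The random array stands in for a delineated region of interest.

    ```python
    # Illustrative sketch: gray-level co-occurrence (GLCM) texture features.
    # (In scikit-image < 0.19 the functions are spelled greycomatrix/greycoprops.)
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    roi = np.random.randint(0, 16, (64, 64), dtype=np.uint8)  # stand-in ROI
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=16, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, graycoprops(glcm, prop).mean())
    ```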

  16. Developing an Open Source Option for NASA Software

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Parks, John W. (Technical Monitor)

    2003-01-01

    We present arguments in favor of developing an Open Source option for NASA software; in particular, we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.

  17. LED lamp or bulb with remote phosphor and diffuser configuration with enhanced scattering properties

    DOEpatents

    Tong, Tao; Le Toquin, Ronan; Keller, Bernd; Tarsa, Eric; Youmans, Mark; Lowes, Theodore; Medendorp, Jr., Nicholas W; Van De Ven, Antony; Negley, Gerald

    2014-11-11

    An LED lamp or bulb is disclosed that comprises a light source, a heat sink structure and an optical cavity. The optical cavity comprises a phosphor carrier having a conversion material and arranged over an opening to the cavity. The phosphor carrier comprises a thermally conductive transparent material and is thermally coupled to the heat sink structure. An LED-based light source is mounted in the optical cavity remote from the phosphor carrier, with light from the light source passing through the phosphor carrier. A diffuser dome is mounted over the optical cavity, with light from the optical cavity passing through the diffuser dome. The properties of the diffuser, such as its geometry, the scattering properties of the scattering layer, surface roughness or smoothness, and the spatial distribution of the scattering-layer properties, may be used to control lamp properties such as color uniformity and light intensity distribution as a function of viewing angle.

  18. GLACiAR, an Open-Source Python Tool for Simulations of Source Recovery and Completeness in Galaxy Surveys

    NASA Astrophysics Data System (ADS)

    Carrasco, D.; Trenti, M.; Mutch, S.; Oesch, P. A.

    2018-06-01

    The luminosity function is a fundamental observable for characterising how galaxies form and evolve throughout cosmic history. One key ingredient for deriving this measurement from the number counts in a survey is the characterisation of the completeness and redshift selection functions of the observations. In this paper, we present GLACiAR, an open-source Python tool available on GitHub for estimating the completeness and selection functions in galaxy surveys. The code is tailored for multiband imaging surveys aimed at searching for high-redshift galaxies through the Lyman-break technique, but it can be applied broadly. The code generates artificial galaxies that follow Sérsic profiles with different indices and with customisable size, redshift, and spectral energy distribution properties; adds them to input images; and measures the recovery rate. To illustrate this new software tool, we apply it to quantify the completeness and redshift selection functions for J-dropout sources (redshift z ~ 10 galaxies) in the Hubble Space Telescope Brightest of Reionizing Galaxies Survey. Our comparison with a previous completeness analysis of the same dataset shows overall agreement, but also highlights how different modelling assumptions for the artificial sources can affect completeness estimates.
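
    The injection-recovery idea that GLACiAR automates can be sketched in a few lines: add fake sources to an image, rerun detection, and report the recovered fraction. For brevity, Gaussian blobs and a crude peak threshold stand in for the Sérsic profiles and the full detection pipeline GLACiAR uses.

    ```python
    # Illustrative sketch: completeness via source injection and recovery.
    import numpy as np

    rng = np.random.default_rng(1)

    def inject_and_recover(image, n_fake=200, flux=120.0, sigma=1.5, thresh=5.0):
        img = image.copy()
        yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
        pos = rng.uniform(10, img.shape[0] - 10, size=(n_fake, 2))
        for y0, x0 in pos:                       # add Gaussian fake sources
            img += flux / (2 * np.pi * sigma**2) * np.exp(
                -((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma**2))
        # Robust noise estimate from the median absolute deviation:
        noise = np.median(np.abs(img - np.median(img))) * 1.4826
        recovered = sum(img[int(y), int(x)] > thresh * noise for y, x in pos)
        return recovered / n_fake                # recovered fraction

    sky = rng.normal(0.0, 1.0, (512, 512))       # pure-noise background
    print(f"completeness at this flux: {inject_and_recover(sky):.2f}")
    ```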

  19. Open-Source Data and the Study of Homicide.

    PubMed

    Parkin, William S; Gruenewald, Jeff

    2015-07-20

    To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to serve as a valid and reliable data source for testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open sources could recreate the population of homicides and the variable responses collected in official data. Open-source data replicated the population of homicides identified in the official data, and, for every variable measured, the open sources captured as much, or more, of the information presented in the official data. Variables not available in official data, but potentially useful for testing theory, were also identified in open sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.

  20. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    PubMed Central

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H.A.; Hlavacek, William S.; Posner, Richard G.

    2016-01-01

    Summary: Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. Availability and implementation: BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary information: Supplementary data are available at Bioinformatics online. Contact: bionetgen.help@gmail.com. PMID:26556387

  1. An open data repository for steady state analysis of a 100-node electricity distribution network with moderate connection of renewable energy sources.

    PubMed

    Lazarou, Stavros; Vita, Vasiliki; Ekonomou, Lambros

    2018-02-01

    The data of this article represent a real twenty-kilovolt (20 kV), medium-voltage electricity distribution network of the Hellenic electricity distribution system [1]. This network has been chosen as suitable for smart grid analysis: it demonstrates moderate penetration of renewable sources and is capable, part of the time, of reverse power flows, making it suitable for studies of load aggregation, storage, and demand response. It represents a rural line of fifty-five kilometres (55 km) total length, typical for this type, serving forty-five (45) medium-to-low-voltage transformers and twenty-four (24) connections to photovoltaic plants. The total installed load capacity is twelve megavolt-amperes (12 MVA); however, the maximum observed load is lower. The data are ready for load-flow simulation in Matpower [2] for the maximum observed load with renewables at half production. The simulation results and the processed data for creating the source code are also provided in the database available at http://dx.doi.org/10.7910/DVN/1I6MKU.
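
    The dataset is prepared for Matpower (a Matlab package), but the same steady-state load-flow exercise can be sketched with the open-source Python tool pandapower. The tiny 20 kV feeder below is a made-up two-bus example, not the Hellenic network itself.

    ```python
    # Illustrative sketch: a 20 kV feeder with load and PV in-feed in pandapower.
    import pandapower as pp

    net = pp.create_empty_network()
    b0 = pp.create_bus(net, vn_kv=20.0, name="HV feed-in")
    b1 = pp.create_bus(net, vn_kv=20.0, name="MV/LV substation")

    pp.create_ext_grid(net, bus=b0, vm_pu=1.0)                   # slack bus
    pp.create_line(net, from_bus=b0, to_bus=b1, length_km=5.0,
                   std_type="NA2XS2Y 1x95 RM/25 12/20 kV")
    pp.create_load(net, bus=b1, p_mw=0.25, q_mvar=0.05)          # MV/LV load
    pp.create_sgen(net, bus=b1, p_mw=0.10, name="PV plant")      # renewable

    pp.runpp(net)                                                # AC load flow
    print(net.res_bus[["vm_pu", "va_degree"]])
    ```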

  2. 3Dmol.js: molecular visualization with WebGL.

    PubMed

    Rego, Nicholas; Koes, David

    2015-04-15

    3Dmol.js is a modern, object-oriented JavaScript library that uses the latest web technologies to provide interactive, hardware-accelerated three-dimensional representations of molecular data without the need to install browser plugins or Java. 3Dmol.js provides a full-featured API for developers as well as a straightforward declarative interface that lets users easily share and embed molecular data in websites. 3Dmol.js is distributed under the permissive BSD open source license. Source code and documentation can be found at http://3Dmol.csb.pitt.edu. Contact: dkoes@pitt.edu. © The Author 2014. Published by Oxford University Press.

  3. Web Mapping Architectures Based on Open Specifications and Free and Open Source Software in the Water Domain

    NASA Astrophysics Data System (ADS)

    Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today, information assets are often fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end users' informational needs. This paper first gives a brief overview of technologies used in the water domain, and then presents three examples of web mapping architectures based on free and open source software (FOSS) and open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulation, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with mature FOSS projects, facilitate the creation of sophisticated, interoperable web-based information systems in the water domain.
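
    A minimal sketch of the open-specification request pattern such architectures rely on is an OGC WMS GetMap call, shown here with the FOSS library OWSLib. The server URL and layer name are placeholders, not one of the paper's case studies.

    ```python
    # Illustrative sketch: fetch a map image from a WMS server with OWSLib.
    from owslib.wms import WebMapService

    wms = WebMapService("https://maps.example.org/wms", version="1.1.1")
    img = wms.getmap(layers=["watersheds"],
                     srs="EPSG:4326",
                     bbox=(-74.0, 40.0, -73.0, 41.0),  # lon/lat window
                     size=(512, 512),
                     format="image/png",
                     transparent=True)
    with open("watersheds.png", "wb") as f:
        f.write(img.read())
    ```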

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.

    VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves four primary roles: (1) a reference platform for researchers to quickly develop control applications for transactive energy; (2) a reference platform with flexible data store support for energy analytics applications, whether in academia or in commercial enterprise; (3) a platform from which commercial enterprises can develop products without license issues and easily integrate into their product lines; and (4) an accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy's Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. The VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that use the platform to perform specific functions (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through open-source device communication, control protocols, and integrated analytics.

  5. Combustion of Biofuel as a Renewable Energy Source in Sandia Flame Geometry

    NASA Astrophysics Data System (ADS)

    Rassoulinejad-Mousavi, Seyed Moein; Mao, Yijin; Zhang, Yuwen

    Energy security and climate change are two key drivers of the widespread adoption of biofuels, notwithstanding the problems associated with their use. In this research, the combustion of biofuel as a renewable energy source was numerically investigated in the well-known and practical Sandia flame geometry. Combustion performance was simulated by burning biodiesel (methyl decanoate, methyl 9-decenoate, and n-heptane), whose oxidation was described with a 118-species reduced/skeletal mechanism. The open-source code OpenFOAM was used to simulate turbulent biodiesel-air combustion in the cylindrical chamber using the standard k-epsilon model. To check the accuracy of the numerical results, the setup was first validated against the methane-air Sandia National Laboratories Flame D experimental results; excellent agreement between numerical and experimental results was observed at different cross sections. After ignition, the temperature distributions at different axial and radial locations, as well as the species mass fractions, were investigated. It is concluded that biofuel can be employed in a turbulent jet flame, a step forward in the promotion of sustainable energy technologies and applications.
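
    For reference, the standard k-epsilon model mentioned above closes the turbulent stresses with an eddy viscosity and two transport equations (the textbook form, stated here for completeness rather than quoted from the paper):

    $$\mu_t = \rho\, C_\mu \frac{k^2}{\varepsilon},$$

    $$\frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho k\,\mathbf{u}) = \nabla\cdot\Big[\Big(\mu + \frac{\mu_t}{\sigma_k}\Big)\nabla k\Big] + P_k - \rho\varepsilon,$$

    $$\frac{\partial(\rho\varepsilon)}{\partial t} + \nabla\cdot(\rho\varepsilon\,\mathbf{u}) = \nabla\cdot\Big[\Big(\mu + \frac{\mu_t}{\sigma_\varepsilon}\Big)\nabla\varepsilon\Big] + C_{1\varepsilon}\frac{\varepsilon}{k}P_k - C_{2\varepsilon}\rho\frac{\varepsilon^2}{k},$$

    with the standard constants C_mu = 0.09, sigma_k = 1.0, sigma_eps = 1.3, C_1eps = 1.44, and C_2eps = 1.92, where P_k is the turbulence production term.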

  6. Integrated genome browser: visual analytics platform for genomics.

    PubMed

    Freese, Nowlan H; Norris, David C; Loraine, Ann E

    2016-07-15

    Genome browsers that support fast navigation through vast datasets and provide interactive visual analytics functions can help scientists achieve deeper insight into biological systems. Toward this end, we developed Integrated Genome Browser (IGB), a highly configurable, interactive and fast open source desktop genome browser. Here we describe multiple updates to IGB, including all-new capabilities to display and interact with data from high-throughput sequencing experiments. To demonstrate, we describe example visualizations and analyses of datasets from RNA-Seq, ChIP-Seq and bisulfite sequencing experiments. Understanding results from genome-scale experiments requires viewing the data in the context of reference genome annotations and other related datasets. To facilitate this, we enhanced IGB's ability to consume data from diverse sources, including Galaxy, Distributed Annotation and IGB-specific Quickload servers. To support future visualization needs as new genome-scale assays enter wide use, we transformed the IGB codebase into a modular, extensible platform for developers to create and deploy all-new visualizations of genomic data. IGB is open source and freely available from http://bioviz.org/igb. Contact: aloraine@uncc.edu. © The Author 2016. Published by Oxford University Press.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly

    Typical engineering systems in applications with high failure consequences, such as nuclear reactor plants, often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
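
    The copula sampling notion can be sketched briefly: draw correlated failure times for two redundant components by pushing correlated Gaussians through their marginal CDFs. The paper works in R and WinBUGS; this Python/SciPy version, with made-up parameter values, only illustrates the idea.

    ```python
    # Illustrative sketch: dependent failure times via a Gaussian copula.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    rho = 0.7                                     # dependence parameter
    cov = [[1.0, rho], [rho, 1.0]]

    z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
    u = stats.norm.cdf(z)                         # uniforms with Gaussian copula

    lam = np.array([1e-3, 2e-3])                  # exponential failure rates (1/h)
    t_fail = stats.expon.ppf(u, scale=1.0 / lam)  # correlated marginal failure times

    both_fail_by_1000h = np.mean((t_fail < 1000.0).all(axis=1))
    print(f"P(both fail within 1000 h) ~ {both_fail_by_1000h:.4f}")
    ```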

  8. How Is Open Source Special?

    ERIC Educational Resources Information Center

    Kapor, Mitchell

    2005-01-01

    Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…

  9. Analysis of jet-airfoil interaction noise sources by using a microphone array technique

    NASA Astrophysics Data System (ADS)

    Fleury, Vincent; Davy, Renaud

    2016-03-01

    The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources by using microphone array data. The measurements were carried out in the anechoic open test section wind tunnel of Onera, Cepra19. The microphone array technique relies on the convected Lighthill's and Ffowcs-Williams and Hawkings' acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought. It is defined as the optimal solution to a minimal error equation using the measured microphone cross-spectra as reference. This inverse problem is, however, ill-posed. A penalty term based on a localization operator is therefore added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In underexpanded supersonic regime, a shock-associated noise source is clearly identified, too. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified. In particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is observed, too. Indications of acoustic reflections on the airfoil are also discerned.
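
    A minimal numerical sketch of the penalized-inversion idea (a deterministic least-squares stand-in for the paper's cross-spectral formulation; the propagation matrix, noise level and penalty weight are invented, and a genuine localization operator would replace the identity):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    G = rng.standard_normal((64, 40))   # stand-in propagation: sources -> mics
    q_true = np.zeros(40)
    q_true[18] = 1.0                    # one dominant source
    p = G @ q_true + 0.05 * rng.standard_normal(64)  # "measured" pressures

    lam = 0.1  # penalty weight regularizing the ill-posed inversion
    q_est = np.linalg.solve(G.T @ G + lam * np.eye(40), G.T @ p)
    print(int(np.argmax(np.abs(q_est))))  # recovered dominant source index: 18
    ```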

  10. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

    Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model, as well as the effect of patents.

  11. Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.

    PubMed

    Benson, Tim

    2016-07-04

    Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers and how it can work as a business model in the health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.

  12. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
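
    For flavor, fetching a forecast map over the Web Map Service standard can be sketched with the OWSLib Python library; the endpoint URL, layer name and time stamp below are hypothetical placeholders, not the Mission Support System's actual configuration.

    ```python
    from owslib.wms import WebMapService  # pip install OWSLib

    # Hypothetical WMS endpoint serving forecast layers.
    wms = WebMapService('https://example.org/mss/wms', version='1.1.1')
    print(list(wms.contents))  # inspect the advertised layers

    img = wms.getmap(layers=['temperature_2m'],       # assumed layer name
                     srs='EPSG:4326',
                     bbox=(-30.0, 30.0, 30.0, 70.0),  # lon/lat window
                     size=(600, 400),
                     format='image/png',
                     time='2013-04-01T12:00:00Z')
    with open('forecast.png', 'wb') as f:
        f.write(img.read())
    ```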

  13. Open Source Software Development

    DTIC Science & Technology

    2011-01-01

    Software, 2002, 149(1), 3-17. 3. DiBona, C., Cooper, D., and Stone, M. (Eds.), Open Sources 2.0, 2005, O'Reilly Media, Sebastopol, CA. Also see C. DiBona, S. Ockman, and M. Stone (Eds.), Open Sources: Voices from the Open Source Revolution, 1999, O'Reilly Media, Sebastopol, CA. 4. Ducheneaut, N

  14. A comparison of untagged gamma-ray and tagged-neutron yields from ²⁴¹AmBe and ²³⁸PuBe sources.

    PubMed

    Scherzinger, J; Al Jebali, R; Annand, J R M; Fissum, K G; Hall-Wilton, R; Koufigar, S; Mauritzson, N; Messi, F; Perrey, H; Rofors, E

    2017-09-01

    Untagged gamma-ray and tagged-neutron yields from ²⁴¹AmBe and ²³⁸PuBe mixed-field sources have been measured. Gamma-ray spectroscopy measurements from 1 to 5 MeV were performed in an open environment using a CeBr₃ detector and the same experimental conditions for both sources. The shapes of the distributions are very similar and agree well with previous data. Tagged-neutron measurements from 2 to 6 MeV were performed in a shielded environment using a NE-213 liquid-scintillator detector for the neutrons and a YAP(Ce) detector to tag the 4.44 MeV gamma-rays associated with the de-excitation of the first excited state of ¹²C. Again, the same experimental conditions were used for both sources. The shapes of these distributions are also very similar and agree well with previous data, each other, and the ISO recommendation. Our ²³⁸PuBe source provides approximately 2.6 times more 4.44 MeV gamma-rays and 2.4 times more neutrons over the tagged-neutron energy range, the latter in reasonable agreement with the original full-spectrum source-calibration measurements performed at the time of their acquisition.

  15. Web catalog of oceanographic data using GeoNetwork

    NASA Astrophysics Data System (ADS)

    Marinova, Veselka; Stefanov, Asen

    2017-04-01

    Most of the data collected, analyzed and used by the Bulgarian oceanographic data center (BgODC) from scientific cruises, Argo floats, ferry boxes and real-time operating systems are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers and scientists. In order to meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data as well as data from various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging and harvesting spatial data. There are many open source software solutions able to create such a spatial data infrastructure (SDI). GeoNetwork opensource was chosen, as it is already widespread; it is a free and effective solution for implementing an SDI at the organization level, is platform independent, and runs under many operating systems. Filling the catalog goes through these practical steps: • managing and storing data reliably within an MS SQL spatial database; • registering maps and data of various formats and sources in GeoServer (the most popular open source geospatial server, embedded with GeoNetwork); • adding metadata and publishing geospatial data through the GeoNetwork front end. GeoServer and GeoNetwork are based on Java, so they require a servlet engine such as Tomcat. The experience gained from the use of GeoNetwork opensource confirms that the catalog meets the requirements for data management and is flexible enough to customize. Building the catalog facilitates sustainable data exchange between end users. The catalog is a big step towards implementation of the INSPIRE directive due to the availability of many features necessary for producing "INSPIRE compliant" metadata records. The catalog now contains all available GIS data provided by BgODC for Internet access. Searching data within the catalog is based upon geographic extent, theme type and free-text search.
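
    Once published this way, the catalog can also be queried programmatically through the standard CSW interface that GeoNetwork exposes; a hedged sketch using the OWSLib Python library (the endpoint URL and search term are hypothetical):

    ```python
    from owslib.csw import CatalogueServiceWeb  # pip install OWSLib
    from owslib.fes import PropertyIsLike

    # Hypothetical GeoNetwork CSW endpoint for an oceanographic catalog.
    csw = CatalogueServiceWeb('https://example.org/geonetwork/srv/eng/csw')

    # Full-text search across all queryable metadata fields.
    query = PropertyIsLike('csw:AnyText', '%oceanographic%')
    csw.getrecords2(constraints=[query], maxrecords=10)
    for rec_id, rec in csw.records.items():
        print(rec_id, rec.title)
    ```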

  16. Open Data and Open Science for better Research in the Geo and Space Domain

    NASA Astrophysics Data System (ADS)

    Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.

    2015-12-01

    Main open data principles had been worked out in the run-up to, and were finally adopted in, the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland in June 2013. Important principles are also valid for science data, such as Open Data by Default; Quality and Quantity; Usable by All; Releasing Data for Improved Governance; and Releasing Data for Innovation. There is also an explicit relationship to such high-value areas as earth observation, education and geospatial data. The European Union implementation plan of the Open Data Charter identifies, among other things, objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and, where appropriate, reconciliation across different data sources, implementing software solutions allowing easy management, publication or visualization of datasets, and simplifying clearance of intellectual property rights. Open Science is not just a list of long-established principles but stands for a range of initiatives and projects around better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and collection of data, availability and reuse of scientific data, public accessibility of scientific communication and the use of social media to facilitate scientific collaboration. Some projects concentrate on open sharing of free and open source software, and even hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and about open peer review processes are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects are presented here. The semantic Web based approach for the mashup focuses on the design and implementation of a common but still distributed data catalog based on semantic interoperability, including transparent access to data in relational databases. References: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/207772/Open_Data_Charter.pdf; http://www.openscience.org/blog/wp-content/uploads/2013/06/OpenSciencePoster.pdf

  17. The case for open-source software in drug discovery.

    PubMed

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  18. Choosing Open Source ERP Systems: What Reasons Are There For Doing So?

    NASA Astrophysics Data System (ADS)

    Johansson, Björn; Sudzina, Frantisek

    Enterprise resource planning (ERP) systems attract considerable attention, as does open source software. The question is then whether, and if so when, open source ERP systems will take off. The paper describes the status of open source ERP systems. Based on a literature review of ERP system selection criteria in Web of Science articles, it discusses reported reasons for choosing open source or proprietary ERP systems. Last but not least, the article presents some conclusions that could act as input for future research. The paper aims at building a foundation for the basic question: what are the reasons for an organization to adopt open source ERP systems?

  19. A Survey of Techniques for Security Architecture Analysis

    DTIC Science & Technology

    2003-05-01

    to be corrected immediately. A software phenomenon is the "user innovation network", examples of such networks being "free" and "open source" software projects. These networks have innovation development, production, distribution and consumption all being performed by users/self-manufacturers. "User innovation networks can function entirely independently of manufacturers because (1) at least some users have sufficient incentive to

  20. PAL: A Positional Astronomy Library

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Berry, D. S.

    2013-10-01

    PAL is a new positional astronomy library written in C that attempts to retain the SLALIB API but is distributed with an open source GPL license. The library depends on the IAU SOFA library wherever a SOFA routine exists and uses the most recent nutation and precession models. Currently about 100 of the 200 SLALIB routines are available. Interfaces are also available from Perl and Python. PAL is freely available via github.

  1. CAGE IIIA Distributed Simulation Design Methodology

    DTIC Science & Technology

    2014-05-01

    VHF: Very High Frequency; VLC: Video LAN Codec, an open-source cross-platform multimedia player and framework; VM: Virtual Machine; VOIP: Voice Over… Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are with understanding how to design it and define the… operation and to be available in the other nation's simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this

  2. Modeling Surfzone/Inner-shelf Exchange

    DTIC Science & Technology

    2013-09-30

    The goal here is to use a wave-resolving Boussinesq model to figure out how to parameterize the vorticity generation due to short-crested breaking of individual waves. The Boussinesq model funwaveC used here, developed by the PI and distributed as open-source software, has been validated in ONR funded… [figure caption fragment: shading of bottom bathymetry, mooring locations (green squares) and the local co-ordinate system (black arrows); positive x is directed towards the…]

  3. Introducing FNCS: Framework for Network Co-Simulation

    ScienceCinema

    None

    2018-06-07

    This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.

  4. Exciton Hybridisation in Organic-Inorganic Semiconductor Microcavities

    DTIC Science & Technology

    2002-02-01

    hybridizing organic and inorganic semiconductors in microcavities to produce a highly efficient light source that could be either a laser or a very efficient… such a process may also have an important effect on the spectral distribution of photoluminescence from the microcavity and can be considered as a… [figure caption fragment: absorption (solid dots) and photoluminescence emission (open circles) of a thin film of J-aggregated cyanine dyes in a PVA matrix]

  5. Distributed Kernelized Locality-Sensitive Hashing for Faster Image Based Navigation

    DTIC Science & Technology

    2015-03-26

    Facebook, Google, and Yahoo!. Current methods for image retrieval become problematic when implemented on image datasets that can easily reach billions of… correlations. Tech industry leaders like Facebook, Google, and Yahoo! sort and index even larger volumes of "big data" daily. When attempting to process… open source implementation of Google's MapReduce programming paradigm [13], which has been used for many different things. Using Apache Hadoop, Yahoo

  6. Introducing FNCS: Framework for Network Co-Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-10-23

    This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.

  7. Study of traffic noise levels at various heights of a 39-story building

    Treesearch

    Norman L. Meyerson

    1977-01-01

    Comparative measurements of exterior noise levels made at floors 3, 14, 26, and 37 of a high-rise apartment tower, when presented as a statistical distribution of percent exceedance vs. decibels, show the nature of the influence of local traffic at the low floors compared to the influence of an area source at the high floors. The open window penalty to interior noise...

  8. Development and Validation of a Low Cost, Flexible, Open Source Robot for Use as a Teaching and Research Tool across the Educational Spectrum

    ERIC Educational Resources Information Center

    Howell, Abraham L.

    2012-01-01

    In the high-tech factories of today, robots can be used to perform tasks spanning a wide spectrum, from the high-speed, automated assembly of cell phones, laptops and other electronic devices to the compounding, filling, packaging and distribution of life-saving pharmaceuticals. As robot usage continues to…

  9. Enabling Incremental Iterative Development at Scale: Quality Attribute Refinement and Allocation in Practice

    DTIC Science & Technology

    2015-06-01

    abstract constraints along six dimensions for expansion: user, actions, data, business rules, interfaces, and quality attributes [Gottesdiener 2010]… relevant open source systems. For example, the CONNECT and Hadoop Distributed File System (HDFS) projects have many user stories that deal with… Iteration Zero involves architecture planning before writing any code. An overly long Iteration Zero is equivalent to the dysfunctional "Big Up-Front

  10. Extending Supernova Spectral Templates for Next Generation Space Telescope Observations

    NASA Astrophysics Data System (ADS)

    Roberts-Pierel, Justin; Rodney, Steven A.

    2018-01-01

    Widely used empirical supernova (SN) Spectral Energy Distributions (SEDs) have not historically extended meaningfully into the ultraviolet (UV) or the infrared (IR). However, both are critical for current and future aspects of SN research, including UV spectra as probes of poorly understood SN Ia physical properties, and expanding our view of the universe with high-redshift James Webb Space Telescope (JWST) IR observations. We therefore present a comprehensive set of SN SED templates that have been extended into the UV and IR, as well as an open-source software package written in Python that enables a user to generate their own extrapolated SEDs. We have taken a sampling of core-collapse (CC) and Type Ia SNe to obtain a time-dependent distribution of UV and IR colors (U-B, r'-[JHK]), and the generated color curves are then used to extrapolate SEDs into the UV and IR. The SED extrapolation process is now easily duplicated using a user's own data and parameters via our open-source Python package, SNSEDextend. This work develops the tools necessary to explore the JWST's ability to discriminate between CC and Type Ia SNe, and provides a repository of SN SEDs that will be invaluable to future JWST and WFIRST SN studies.
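
    A toy sketch of the color-anchored extrapolation idea (this is not the SNSEDextend implementation; the wavelengths, the assumed r'-J color and the interpolation shape are all invented for illustration):

    ```python
    import numpy as np

    # Toy optical SED: wavelength in Angstroms, arbitrary flux, ending at 9000 A.
    wave = np.linspace(4000.0, 9000.0, 50)
    flux = np.exp(-0.5 * ((wave - 6000.0) / 1500.0) ** 2)

    r_minus_j = 0.8                 # assumed r'-J color (mag) from a color curve
    r_eff, j_eff = 6250.0, 12350.0  # approximate effective band wavelengths

    # Anchor the IR extension so the extended SED reproduces the color:
    # a flux ratio of 10**(-0.4 * color) relative to the r'-band flux level.
    j_flux = np.interp(r_eff, wave, flux) * 10.0 ** (-0.4 * r_minus_j)
    ir_wave = np.linspace(9000.0, 18000.0, 30)
    ir_flux = np.interp(ir_wave, [9000.0, j_eff, 18000.0],
                        [flux[-1], j_flux, 0.5 * j_flux])

    sed_wave = np.concatenate([wave, ir_wave[1:]])
    sed_flux = np.concatenate([flux, ir_flux[1:]])
    print(sed_wave.min(), sed_wave.max())  # SED now spans 4000-18000 A
    ```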

  11. Using WNTR to Model Water Distribution System Resilience ...

    EPA Pesticide Factsheets

    The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of disruptive events, including earthquakes, contamination incidents, floods, climate change, and fires. The software includes the EPANET solver as well as a WNTR solver with the ability to model pressure-driven demand hydraulics, pipe breaks, component degradation and failure, changes to supply and demand, and cascading failure. Damage to individual components in the network (i.e. pipes, tanks) can be selected probabilistically using fragility curves. WNTR can also simulate different types of resilience-enhancing actions, including scheduled pipe repair or replacement, water conservation efforts, addition of back-up power, and use of contamination warning systems. The software can be used to estimate potential damage in a network, evaluate preparedness, prioritize repair strategies, and identify worst-case scenarios. As a Python package, WNTR takes advantage of many existing Python capabilities, including parallel processing of scenarios and graphics capabilities. This presentation will outline the modeling components in WNTR, demonstrate their use, give the audience information on how to get started using the code, and invite others to participate in this open source project.
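
    WNTR is distributed on PyPI, so a minimal hydraulic run looks roughly like the sketch below; the EPANET input file path is hypothetical, and real analyses would layer disruption scenarios and resilience metrics on top.

    ```python
    import wntr  # pip install wntr

    # Build a network model from an EPANET INP file (path is a placeholder).
    wn = wntr.network.WaterNetworkModel('networks/Net3.inp')

    # Simulate hydraulics with the WNTR solver, which supports
    # pressure-driven demand in addition to the EPANET approach.
    sim = wntr.sim.WNTRSimulator(wn)
    results = sim.run_sim()

    # Time-indexed pressures for every node in the network.
    print(results.node['pressure'].head())
    ```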

  12. A new free and open source tool for space plasma modeling.

    NASA Astrophysics Data System (ADS)

    Honkonen, I. J.

    2014-12-01

    I will present a new distributed-memory parallel, free and open source computational model for studying space plasma. The model is written in C++ with emphasis on good software development practices and code readability without sacrificing serial or parallel performance. As such, the model could be especially useful for education, for learning both (magneto)hydrodynamics (MHD) and computational model development. By using the latest features of the C++ standard (2011) it has been possible to develop a very modular program, which improves not only the readability of the code but also the testability of the model, and decreases the effort required to make changes to various parts of the program. Major parts of the model, i.e. functionality not directly related to (M)HD, have been outsourced to other freely available libraries, which has reduced the development time of the model significantly. I will present an overview of the code architecture as well as details of different parts of the model, and will show examples of using the model, including preparing input files and plotting results. A multitude of 1-, 2- and 3-dimensional test cases are included in the software distribution, and the results of, for example, Kelvin-Helmholtz, bow shock, blast wave and reconnection tests will be presented.

  13. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    ERIC Educational Resources Information Center

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  14. Implementing Open Source Platform for Education Quality Enhancement in Primary Education: Indonesia Experience

    ERIC Educational Resources Information Center

    Kisworo, Marsudi Wahyu

    2016-01-01

    Information and Communication Technology (ICT)-supported learning using a free and open source platform has drawn little attention, as open source initiatives have focused on secondary or tertiary education. This study investigates possibilities of ICT-supported learning using an open source platform for primary education. The data of this study is taken…

  15. An Analysis of Open Source Security Software Products Downloads

    ERIC Educational Resources Information Center

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  16. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this research, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can be used to deploy a university computer-room cloud efficiently and conveniently, with stable performance and good functional value.

  17. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster at 14 Gb RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.
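
    The aggregate-in-parallel pattern can be sketched with the open-source frameworks named above; a minimal PySpark example (the file name, column names and 0.1-degree bin size are hypothetical) that bins point observations into grid cells across a cluster:

    ```python
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName('point-aggregation').getOrCreate()

    # Hypothetical CSV of point observations: id, lon, lat, timestamp.
    df = spark.read.csv('positions.csv', header=True, inferSchema=True)

    # Aggregate points into 0.1-degree grid cells, computed in parallel
    # across the cluster -- a simplified stand-in for the distributed
    # point-aggregation tools described above.
    binned = (df.withColumn('lon_bin', F.floor(F.col('lon') / 0.1))
                .withColumn('lat_bin', F.floor(F.col('lat') / 0.1))
                .groupBy('lon_bin', 'lat_bin')
                .count())
    binned.show(10)
    ```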

  18. Sources and mass inventory of sedimentary polycyclic aromatic hydrocarbons in the Gulf of Thailand: Implications for pathways and energy structure in SE Asia.

    PubMed

    Hu, Limin; Shi, Xuefa; Qiao, Shuqing; Lin, Tian; Li, Yuanyuan; Bai, Yazhi; Wu, Bin; Liu, Shengfa; Kornkanitnan, Narumol; Khokiattiwong, Somkiat

    2017-01-01

    Surface sediments obtained from a matrix of 92 sample sites in the Gulf of Thailand (GOT) were analyzed for a comprehensive study of the distribution, sources, and mass inventory of polycyclic aromatic hydrocarbons (PAHs) to assess their input pathways and the impacts of the regional land-based energy structure on the deposition of PAHs on the adjacent continental margins. The concentration of 16 PAHs in the GOT ranged from 2.6 to 78.1 ng/g (dry weight), with a mean concentration of 19.4 ± 15.1 ng/g. The spatial distribution pattern of the 16 PAHs was generally consistent with that of sediment grain size, suggesting the influence of regional hydrodynamic conditions. Correlation and principal component analysis of the PAHs indicated that direct land-based inputs were dominantly responsible for the occurrence of PAHs in the upper GOT, and the low molecular weight (LMW) PAHs in the coastal region could be from petrogenic sources. A positive matrix factorization (PMF) model apportioned five contributors: petroleum residues (~44%), biomass burning (~13%), vehicular emissions (~11%), coal combustion (~6%), and air-water exchange (~25%). Gas absorption may be a significant external input pathway for the volatile PAHs in the open GOT, which further implies that atmospheric loading could be important for the sink of PAHs in the open sea of Southeast Asia (SE Asia). The different PAH source patterns obtained and a significant disparity in PAH mass inventory in the sediments along the East and Southeast Asia continental margins can be ascribed mainly to different land-based PAH emission features under the varied regional energy structure, in addition to the depositional environment and climatic conditions.

  19. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model, as well as the effect of patents. PMID:21955914

  20. The 2017 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year’s theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest. PMID:29118973

  1. The 2017 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year's theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest.

  2. The Efficient Utilization of Open Source Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baty, Samuel R.

    These are a set of slides on the efficient utilization of open source information. Open source information consists of a vast set of information from a variety of sources. Not only does the quantity of open source information pose a problem, the quality of such information can hinder efforts. To show this, two case studies are mentioned: Iran and North Korea, in order to see how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information has no guarantee of accuracy. Open source information can provide key insights either directly or indirectly: looking at supporting factors (flow of scientists, products and waste from mines, government budgets, etc.); direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows for a more complete picture to be formed. Overlapping sources allow for more precise bounds on times, weights, temperatures, yields or other issues of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual, but rather requires the utilization of a wide range of skill sets held by a team of people.

  3. Metazoan meiofauna in deep-sea canyons and adjacent open slopes: A large-scale comparison with focus on the rare taxa

    NASA Astrophysics Data System (ADS)

    Bianchelli, S.; Gambi, C.; Zeppilli, D.; Danovaro, R.

    2010-03-01

    Metazoan meiofaunal abundance, total biomass, nematode size and the richness of taxa were investigated along bathymetric gradients (from the shelf break down to ca. 5000-m depth) in six submarine canyons and on five adjacent open slopes of three deep-sea regions. The investigated areas were distributed along >2500 km, from the Portuguese to the Catalan and South Adriatic margins. The Portuguese and Catalan margins displayed the highest abundances, biomass and richness of taxa, while the lowest values were observed in the Central Mediterranean Sea. The comparison between canyons and the nearby open slopes showed the lack of significant differences in terms of meiofaunal abundance and biomass at any sampling depth. In most canyons and on most slopes, meiofaunal variables did not display consistent bathymetric patterns. Conversely, we found that the different topographic features were apparently responsible for significant differences in the abundance and distribution of the rare meiofaunal taxa (i.e. taxa accounting for <1% of total meiofaunal abundance). Several taxa belonging to the temporary meiofauna, such as larvae/juveniles of Priapulida, Holothuroidea, Ascidiacea and Cnidaria, were encountered exclusively on open slopes, while others (including the Tanaidacea and Echinoidea larvae) were found exclusively in canyon sediments. Results reported here indicate that, at large spatial scales, differences in deep-sea meiofaunal abundance and biomass are not only controlled by the available food sources, but also by region- or habitat-specific topographic features, which apparently play a key role in the distribution of rare benthic taxa.

  4. An open source multivariate framework for n-tissue segmentation with evaluation on public data.

    PubMed

    Avants, Brian B; Tustison, Nicholas J; Wu, Jue; Cook, Philip A; Gee, James C

    2011-12-01

    We introduce Atropos, an ITK-based multivariate n-class open source segmentation algorithm distributed with ANTs ( http://www.picsl.upenn.edu/ANTs). The Bayesian formulation of the segmentation problem is solved using the Expectation Maximization (EM) algorithm with the modeling of the class intensities based on either parametric or non-parametric finite mixtures. Atropos is capable of incorporating spatial prior probability maps (sparse), prior label maps and/or Markov Random Field (MRF) modeling. Atropos has also been efficiently implemented to handle large quantities of possible labelings (in the experimental section, we use up to 69 classes) with a minimal memory footprint. This work describes the technical and implementation aspects of Atropos and evaluates its performance on two different ground-truth datasets. First, we use the BrainWeb dataset from Montreal Neurological Institute to evaluate three-tissue segmentation performance via (1) K-means segmentation without use of template data; (2) MRF segmentation with initialization by prior probability maps derived from a group template; (3) Prior-based segmentation with use of spatial prior probability maps derived from a group template. We also evaluate Atropos performance by using spatial priors to drive a 69-class EM segmentation problem derived from the Hammers atlas from University College London. These evaluation studies, combined with illustrative examples that exercise Atropos options, demonstrate both performance and wide applicability of this new platform-independent open source segmentation tool.

  5. An Open Source Multivariate Framework for n-Tissue Segmentation with Evaluation on Public Data

    PubMed Central

    Tustison, Nicholas J.; Wu, Jue; Cook, Philip A.; Gee, James C.

    2012-01-01

    We introduce Atropos, an ITK-based multivariate n-class open source segmentation algorithm distributed with ANTs (http://www.picsl.upenn.edu/ANTs). The Bayesian formulation of the segmentation problem is solved using the Expectation Maximization (EM) algorithm with the modeling of the class intensities based on either parametric or non-parametric finite mixtures. Atropos is capable of incorporating spatial prior probability maps (sparse), prior label maps and/or Markov Random Field (MRF) modeling. Atropos has also been efficiently implemented to handle large quantities of possible labelings (in the experimental section, we use up to 69 classes) with a minimal memory footprint. This work describes the technical and implementation aspects of Atropos and evaluates its performance on two different ground-truth datasets. First, we use the BrainWeb dataset from Montreal Neurological Institute to evaluate three-tissue segmentation performance via (1) K-means segmentation without use of template data; (2) MRF segmentation with initialization by prior probability maps derived from a group template; (3) Prior-based segmentation with use of spatial prior probability maps derived from a group template. We also evaluate Atropos performance by using spatial priors to drive a 69-class EM segmentation problem derived from the Hammers atlas from University College London. These evaluation studies, combined with illustrative examples that exercise Atropos options, demonstrate both performance and wide applicability of this new platform-independent open source segmentation tool. PMID:21373993
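
    The core EM finite-mixture step that Atropos builds on can be sketched with scikit-learn (this illustrates only the Gaussian-mixture segmentation idea; Atropos adds MRF regularization, spatial priors and large label sets on top, and the intensities below are synthetic):

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    # Synthetic single-channel "image": three tissue classes with
    # different mean intensities (e.g. CSF, gray matter, white matter).
    intensities = np.concatenate(
        [rng.normal(mu, 5.0, 2000) for mu in (30.0, 80.0, 140.0)])

    # EM fit of a 3-class finite Gaussian mixture, then per-voxel labeling.
    gmm = GaussianMixture(n_components=3, random_state=0)
    labels = gmm.fit_predict(intensities.reshape(-1, 1))
    print(np.sort(gmm.means_.ravel()).round(1))  # recovered class means
    ```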

  6. A simple theoretical model for ⁶³Ni betavoltaic battery.

    PubMed

    Zuo, Guoping; Zhou, Jianliang; Ke, Guotu

    2013-12-01

    A numerical simulation of the energy deposition distribution in semiconductors is performed for ⁶³Ni beta particles. Results show that the energy deposition distribution exhibits an approximate exponential decay law. A simple theoretical model is developed for the ⁶³Ni betavoltaic battery based on the distribution characteristics. The correctness of the model is validated against two literature experiments. Results show that the theoretical short-circuit current agrees well with the experimental results, while the open-circuit voltage deviates from the experimental results owing to the influence of PN-junction defects and the simplified treatment of the source. The theoretical model can be applied to ⁶³Ni and ¹⁴⁷Pm betavoltaic batteries.
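
    A back-of-the-envelope sketch of such a model in Python (every parameter value below is invented for illustration; in the paper the deposition profile comes from the numerical simulation and the collection length from the device):

    ```python
    import numpy as np

    q = 1.602e-19             # elementary charge, C
    e_pair = 3.6 * 1.602e-19  # J per electron-hole pair in Si (approximate)
    alpha = 5.0e3             # 1/cm, assumed exponential attenuation coefficient
    p_in = 1.0e-6             # W/cm^2 of beta power entering the device (assumed)
    depth = np.linspace(0.0, 20e-4, 2000)        # cm
    dep = p_in * alpha * np.exp(-alpha * depth)  # deposited power, W/cm^3

    # Assume carriers generated within a collection length L of the junction
    # contribute to the short-circuit current.
    L = 5e-4  # cm (assumed)
    mask = depth <= L
    collected = np.trapz(dep[mask], depth[mask])  # W/cm^2 collected
    j_sc = q * collected / e_pair                 # A/cm^2
    print(f"J_sc ~ {j_sc:.2e} A/cm^2")
    ```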

  7. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    NASA Astrophysics Data System (ADS)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.

  8. Barista: A Framework for Concurrent Speech Processing by USC-SAIL

    PubMed Central

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G.; Narayanan, Shrikanth S.

    2016-01-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista lets demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0. PMID:27610047

  9. Barista: A Framework for Concurrent Speech Processing by USC-SAIL.

    PubMed

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G; Narayanan, Shrikanth S

    2014-05-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista lets demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0.
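
    The actor-style message passing that Barista borrows from libcppa can be imitated in miniature with Python's multiprocessing (purely illustrative of the design, not Barista's C++ implementation): each stage is an isolated process that communicates only through queued messages.

    ```python
    import multiprocessing as mp

    def feature_extractor(inbox: mp.Queue, outbox: mp.Queue) -> None:
        # Toy "actor": consumes raw frames, emits one feature per frame.
        while (frame := inbox.get()) is not None:
            outbox.put(sum(frame) / len(frame))
        outbox.put(None)  # propagate the shutdown message downstream

    def decoder(inbox: mp.Queue) -> None:
        while (feature := inbox.get()) is not None:
            print(f"decoded feature {feature:.2f}")

    if __name__ == '__main__':
        q1, q2 = mp.Queue(), mp.Queue()
        actors = [mp.Process(target=feature_extractor, args=(q1, q2)),
                  mp.Process(target=decoder, args=(q2,))]
        for a in actors:
            a.start()
        for frame in ([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]):
            q1.put(frame)
        q1.put(None)  # shutdown message
        for a in actors:
            a.join()
    ```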

  10. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    NASA Astrophysics Data System (ADS)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from standard 2003) have been stressed in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc…) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel frameworks supported: the development of OFF has also been targeted at maximizing computational efficiency; the code is designed to run on shared-memory multi-core workstations and distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, maintenance and enhancement: the documentation has been carefully taken into account; it is built upon comprehensive comments placed directly into the source files (no external documentation files needed), which are parsed by the free doxygen software to produce high-quality html and latex documentation pages; the distributed versioning system git has been adopted in order to facilitate collaborative maintenance and improvement of the code. Copyright: OFF is free software that anyone can use, copy, distribute, study, change and improve under the GNU Public License version 3. The present paper is a manifesto of the OFF code and presents the currently implemented features and ongoing developments. This work is focused on the computational techniques adopted, and a detailed description of the main API characteristics is reported. OFF capabilities are demonstrated by means of one- and two-dimensional examples and a three-dimensional real application.

  11. The open sea as the main source of methylmercury in the water column of the Gulf of Lions (Northwestern Mediterranean margin)

    NASA Astrophysics Data System (ADS)

    Cossa, Daniel; Durrieu de Madron, Xavier; Schäfer, Jörg; Lanceleur, Laurent; Guédron, Stéphane; Buscail, Roselyne; Thomas, Bastien; Castelle, Sabine; Naudin, Jean-Jacques

    2017-02-01

    Despite the ecological and economic importance of coastal areas, the fluxes of the neurotoxic, bioaccumulable monomethylmercury (MMHg) within ocean margins and its exchanges with the open sea remain unassessed. The aim of this paper is to address the questions of the abundance, distribution, production and exchanges of methylated mercury species (MeHgT), including MMHg and dimethylmercury (DMHg), in the waters, atmosphere and sediments of the Northwestern Mediterranean margin, including the Rhône River delta, the continental shelf and its slope (Gulf of Lions) and the adjacent open sea (North Gyre). Concentrations of MeHgT ranged from <0.02 to 0.48 pmol L⁻¹, with the highest values associated with the oxygen-deficient zone of the open sea. The methylated mercury to total mercury proportion (MeHgT/HgT) increased from 2-4% in the Rhône River to up to 30% (averaging 18%) in the North Gyre waters, whereas within the shelf waters MeHgT/HgT proportions were the lowest (1-3%). We calculate that the open sea is the major source of MeHgT for the shelf waters, with an annual flux estimated at 0.68 ± 0.12 kmol a⁻¹ (i.e., equivalent to 12% of the HgT flux). This MeHgT influx is more than 80 times the direct atmospheric deposition or the in situ net production, more than 40 times the estimated "maximum potential" annual efflux from shelf sediment, and more than 7 times that of the continental sources. In the open sea, ratios of MMHg/DMHg in waters were always <1 and minimal in the oxygen-deficient zones of the water column, where MeHg concentrations are maximal. This observation supports the idea that MMHg could be a degradation product of DMHg produced from inorganic divalent Hg.

  12. The 2015 Bioinformatics Open Source Conference (BOSC 2015).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica

    2016-02-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.

  13. Annotation and visualization of endogenous retroviral sequences using the Distributed Annotation System (DAS) and eBioX

    PubMed Central

    Martínez Barrio, Álvaro; Lagercrantz, Erik; Sperber, Göran O; Blomberg, Jonas; Bongcam-Rudloff, Erik

    2009-01-01

    Background The Distributed Annotation System (DAS) is a widely used network protocol for sharing biological information. The distributed aspects of the protocol enable the use of various reference and annotation servers for connecting biological sequence data to pertinent annotations in order to depict an integrated view of the data for the final user. Results An annotation server has been devised to provide information about the endogenous retroviruses detected and annotated by a specialized in silico tool called RetroTector. We describe the procedure to implement the DAS 1.5 protocol commands necessary for constructing the DAS annotation server. We use our server to exemplify those steps. Data distribution is kept separated from visualization, which is carried out by eBioX, an easy to use open source program incorporating multiple bioinformatics utilities. Some well characterized endogenous retroviruses are shown in two different DAS clients. A rapid analysis of areas free from retroviral insertions could be facilitated by our annotations. Conclusion The DAS protocol has been shown to be advantageous in the distribution of endogenous retrovirus data. The distributed nature of the protocol is also found to aid in combining annotation and visualization along a genome in order to enhance the understanding of ERV contribution to its evolution. Reference and annotation servers are conjointly used by eBioX to provide visualization of ERV annotations as well as other data sources. Our DAS data source can be found in the central public DAS service repository, , or at . PMID:19534743
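
    A DAS 1.5 features request is a plain HTTP GET returning XML, so a minimal client fits in a few lines; the server URL and data source name below are hypothetical placeholders, not the paper's actual endpoints.

    ```python
    import requests
    import xml.etree.ElementTree as ET

    # Hypothetical DAS 1.5 annotation server; real servers use the same scheme:
    # /das/<data source>/features?segment=<chromosome>:<start>,<stop>
    url = 'https://example.org/das/hs_ervs/features'
    resp = requests.get(url, params={'segment': '1:1,1000000'}, timeout=30)
    resp.raise_for_status()

    # DAS responses are DASGFF XML with one FEATURE element per annotation.
    root = ET.fromstring(resp.content)
    for feature in root.iter('FEATURE'):
        print(feature.get('id'), feature.get('label'))
    ```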

  14. Open Source, Openness, and Higher Education

    ERIC Educational Resources Information Center

    Wiley, David

    2006-01-01

    In this article David Wiley provides an overview of how the general expansion of open source software has affected the world of education in particular. In doing so, Wiley not only addresses the development of open source software applications for teachers and administrators, he also discusses how the fundamental philosophy of the open source…

  15. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…

  16. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare. OpenID Connect, which combines OpenID and OAuth, is an emerging representational state transfer (REST)-based federated identity solution. It is among the most widely adopted open standards for securing cloud computing and mobile applications, and has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for the secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The aim is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model.
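
    The core of OpenID Connect is the OAuth 2.0 authorization-code token exchange. A minimal sketch in Python, assuming a hypothetical identity provider endpoint, client credentials, and redirect URI (none of these are from the paper):

        import requests

        TOKEN_ENDPOINT = "https://idp.example.org/connect/token"  # hypothetical provider
        resp = requests.post(TOKEN_ENDPOINT, data={
            "grant_type": "authorization_code",
            "code": "AUTH_CODE_FROM_REDIRECT",       # returned by the authorize step
            "redirect_uri": "https://pacs.example.org/callback",
            "client_id": "pacs-client",
            "client_secret": "SECRET",
        })
        tokens = resp.json()
        # id_token identifies the user; access_token authorizes DI-r/PACS API calls
        print(tokens["token_type"], sorted(tokens.keys()))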

  17. A statistical analysis of North East Atlantic (submicron) aerosol size distributions

    NASA Astrophysics Data System (ADS)

    Dall'Osto, M.; Monahan, C.; Greaney, R.; Beddows, D. C. S.; Harrison, R. M.; Ceburnis, D.; O'Dowd, C. D.

    2011-08-01

    The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75 % throughout the year. By applying the Hartigan-Wong k-means method, 12 clusters were identified as systematically occurring, and these could be further combined into 4 categories with similar characteristics, namely: coastal nucleation (occurring 21.3 % of the time), open ocean nucleation (occurring 32.6 % of the time), background clean marine (occurring 26.1 % of the time) and anthropogenic (occurring 20 % of the time) aerosol size distributions. The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm, while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine characteristic is a clear bimodality in the size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. By contrast, the continentally-influenced size distributions are generally more mono-modal, albeit with traces of bi-modality. The open ocean category occurs more often during May, June and July, corresponding with the N. E. Atlantic high biological period. Combined with the relatively high percentage frequency of occurrence (32.6 %), this suggests that the marine biota is an important source of new aerosol particles in N. E. Atlantic air.
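
    As a rough illustration of this clustering step, the sketch below applies k-means to a matrix of normalised size spectra. Note that scikit-learn's Lloyd-style k-means stands in here for the Hartigan-Wong variant used in the study, and all data are placeholders:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        spectra = rng.random((500, 30))                # rows = hours, cols = size bins
        spectra /= spectra.sum(axis=1, keepdims=True)  # normalise each spectrum

        km = KMeans(n_clusters=12, n_init=20, random_state=0).fit(spectra)
        # cluster centroids approximate the systematically occurring
        # characteristic size distributions, later merged into broader categories
        print(np.bincount(km.labels_))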

  18. TU-H-BRC-05: Stereotactic Radiosurgery Optimized with Orthovoltage Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fagerstrom, J; Culberson, W; Bender, E

    2016-06-15

    Purpose: To achieve improved stereotactic radiosurgery (SRS) dose distributions using orthovoltage energy fluence modulation with inverse planning optimization techniques. Methods: A pencil beam model was used to calculate dose distributions from the institution’s orthovoltage unit at 250 kVp. Kernels for the model were derived using Monte Carlo methods as well as measurements with radiochromic film. The orthovoltage photon spectra, modulated by varying thicknesses of attenuating material, were approximated using open-source software. A genetic algorithm search heuristic routine was used to optimize added tungsten filtration thicknesses to approach rectangular function dose distributions at depth. Optimizations were performed for depths of 2.5, 5.0, and 7.5 cm, with cone sizes of 8, 10, and 12 mm. Results: Circularly-symmetric tungsten filters were designed based on the results of the optimization, to modulate the orthovoltage beam across the aperture of an SRS cone collimator. For each depth and cone size combination examined, the beam flatness and 80–20% and 90–10% penumbrae were calculated for both standard, open cone-collimated beams as well as for the optimized, filtered beams. For all configurations tested, the modulated beams were able to achieve improved penumbra widths and flatness statistics at depth, with flatness improving between 33 and 52%, and penumbrae improving between 18 and 25% for the modulated beams compared to the unmodulated beams. Conclusion: A methodology has been described that may be used to optimize the spatial distribution of added filtration material in an orthovoltage SRS beam to result in dose distributions at depth with improved flatness and penumbrae compared to standard open cones. This work provides the mathematical foundation for a novel, orthovoltage energy fluence-modulated SRS system.
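
    As a toy illustration of the optimization step, the following sketch uses a simple genetic algorithm to pick ring-by-ring filter thicknesses that flatten a beam profile toward a rectangular function. The Beer-Lambert attenuation model and every number are illustrative assumptions, not the authors' pencil beam model:

        import numpy as np

        rng = np.random.default_rng(1)
        n_rings, pop_size, n_gen = 8, 60, 200
        radius = np.linspace(0, 1, n_rings)
        target = np.ones(n_rings)                      # rectangular profile at depth
        mu = 3.0                                       # toy attenuation coeff (1/mm)
        base = 1.0 + 0.4 * radius**2                   # toy unflattened beam profile

        def fitness(thickness):                        # higher = closer to rectangle
            profile = base * np.exp(-mu * thickness)   # Beer-Lambert attenuation
            profile /= profile.max()
            return -np.sum((profile - target) ** 2)

        pop = rng.uniform(0, 0.5, (pop_size, n_rings))
        for _ in range(n_gen):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[-pop_size // 2:]]      # keep best half
            children = parents + rng.normal(0, 0.01, parents.shape) # mutate
            pop = np.vstack([parents, np.clip(children, 0, 0.5)])
        best = pop[np.argmax([fitness(ind) for ind in pop])]
        print(np.round(best, 3))                       # thickness per ring (mm)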

  19. On the power output of some idealized source configurations with one or more characteristic dimensions

    NASA Technical Reports Server (NTRS)

    Levine, H.

    1982-01-01

    The calculation of power output from a (finite) linear array of equidistant point sources is investigated with allowance for a relative phase shift and particular focus on the circumstances of small/large individual source separation. A key role is played by the estimates found for a twin-parameter definite integral that involves the Fejér kernel functions F_N, where N denotes a (positive) integer; these results also permit a quantitative accounting of energy partition between the principal and secondary lobes of the array pattern. Continuously distributed sources along a finite line segment or an open-ended circular cylindrical shell are considered, and estimates for the relatively lower output in the latter configuration are made explicit when the shell radius is small compared to the wavelength. A systematic reduction of diverse integrals which characterize the energy output from specific line and strip sources is investigated.
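
    For reference, the Fejér kernel underlying these estimates has the standard form (a textbook identity stated here for the reader's convenience, not reproduced from the paper):

        F_N(\theta) = \frac{1}{N}\left(\frac{\sin(N\theta/2)}{\sin(\theta/2)}\right)^{2}
                    = \sum_{k=-(N-1)}^{N-1}\left(1 - \frac{|k|}{N}\right)e^{ik\theta}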

  20. Applying representational state transfer (REST) architecture to archetype-based electronic health record systems

    PubMed Central

    2013-01-01

    Background The openEHR project and the closely related ISO 13606 standard have defined structures supporting the content of Electronic Health Records (EHRs). However, there is not yet any finalized openEHR specification of a service interface to aid application developers in creating, accessing, and storing the EHR content. The aim of this paper is to explore how the Representational State Transfer (REST) architectural style can be used as a basis for a platform-independent, HTTP-based openEHR service interface. Associated benefits and tradeoffs of such a design are also explored. Results The main contribution is the formalization of the openEHR storage, retrieval, and version-handling semantics and related services into an implementable HTTP-based service interface. The modular design makes it possible to prototype, test, replicate, distribute, cache, and load-balance the system using ordinary web technology. Other contributions are approaches to query and retrieval of the EHR content that take caching, logging, and distribution into account. Triggering on EHR change events is also explored. A final contribution is an open source openEHR implementation using the above-mentioned approaches to create LiU EEE, an educational EHR environment intended to help newcomers and developers experiment with and learn about the archetype-based EHR approach and enable rapid prototyping. Conclusions Using REST addressed many architectural concerns in a successful way, but an additional messaging component was needed to address some architectural aspects. Many of our approaches are likely of value to other archetype-based EHR implementations and may contribute to associated service model specifications. PMID:23656624

  1. Applying representational state transfer (REST) architecture to archetype-based electronic health record systems.

    PubMed

    Sundvall, Erik; Nyström, Mikael; Karlsson, Daniel; Eneling, Martin; Chen, Rong; Örman, Håkan

    2013-05-09

    The openEHR project and the closely related ISO 13606 standard have defined structures supporting the content of Electronic Health Records (EHRs). However, there is not yet any finalized openEHR specification of a service interface to aid application developers in creating, accessing, and storing the EHR content. The aim of this paper is to explore how the Representational State Transfer (REST) architectural style can be used as a basis for a platform-independent, HTTP-based openEHR service interface. Associated benefits and tradeoffs of such a design are also explored. The main contribution is the formalization of the openEHR storage, retrieval, and version-handling semantics and related services into an implementable HTTP-based service interface. The modular design makes it possible to prototype, test, replicate, distribute, cache, and load-balance the system using ordinary web technology. Other contributions are approaches to query and retrieval of the EHR content that take caching, logging, and distribution into account. Triggering on EHR change events is also explored. A final contribution is an open source openEHR implementation using the above-mentioned approaches to create LiU EEE, an educational EHR environment intended to help newcomers and developers experiment with and learn about the archetype-based EHR approach and enable rapid prototyping. Using REST addressed many architectural concerns in a successful way, but an additional messaging component was needed to address some architectural aspects. Many of our approaches are likely of value to other archetype-based EHR implementations and may contribute to associated service model specifications.
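
    Both records describe the same REST design. The sketch below shows what retrieving one immutable, version-identified composition over HTTP might look like; the service root, identifiers, and path scheme are hypothetical illustrations, not a finalized openEHR interface:

        import requests

        BASE = "https://ehr.example.org/rest"           # hypothetical service root
        ehr_id = "7d44b88c-4199-4bad-97dc-d78268e01398"
        comp_uid = "8849182c-82ad-4088-a07f-48ead4180515::example.org::1"

        resp = requests.get(f"{BASE}/ehr/{ehr_id}/composition/{comp_uid}",
                            headers={"Accept": "application/xml"})
        resp.raise_for_status()
        # because each version is immutable and uniquely identified, responses
        # like this can be cached, replicated, and load-balanced by ordinary
        # web infrastructure, which is the property the paper builds on
        print(resp.headers.get("Content-Type"), len(resp.content))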

  2. Sensing Slow Mobility and Interesting Locations for Lombardy Region (italy): a Case Study Using Pointwise Geolocated Open Data

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Oxoli, D.; Zurbarán, M. A.

    2016-06-01

    During the past years, Web 2.0 technologies have caused the emergence of platforms where users can share data related to their activities, which in some cases are then publicly released with open licenses. Popular categories for this include community platforms where users can upload GPS tracks collected during slow travel activities (e.g. hiking, biking and horse riding) and platforms where users share their geolocated photos. However, due to the high heterogeneity of the information available on the Web, the sole use of these user-generated contents makes it an ambitious challenge to understand slow mobility flows as well as to detect the most visited locations in a region. Exploiting the data available on community sharing websites makes it possible to collect near real-time open data streams and enables rigorous spatial-temporal analysis. This work presents an approach for collecting, unifying and analysing pointwise geolocated open data available from different sources with the aim of identifying the main locations and destinations of slow mobility activities. For this purpose, we collected pointwise open data from the Wikiloc platform, Twitter, Flickr and Foursquare. The analysis was confined to the data uploaded in the Lombardy Region (Northern Italy), corresponding to millions of data points. Collected data were processed using Free and Open Source Software (FOSS) and organized into a suitable database. This allowed us to run statistical analyses on the data distribution in both time and space, enabling the detection of users' slow mobility preferences as well as places of interest at a regional scale.

  3. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments.

    PubMed

    Thomas, Brandon R; Chylek, Lily A; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H A; Hlavacek, William S; Posner, Richard G

    2016-03-01

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary data are available at Bioinformatics online.

  4. Ultrabroadband direct detection of nonclassical photon statistics at telecom wavelength

    PubMed Central

    Wakui, Kentaro; Eto, Yujiro; Benichi, Hugo; Izumi, Shuro; Yanagida, Tetsufumi; Ema, Kazuhiro; Numata, Takayuki; Fukuda, Daiji; Takeoka, Masahiro; Sasaki, Masahide

    2014-01-01

    Broadband light sources play essential roles in diverse fields, such as high-capacity optical communications, optical coherence tomography, optical spectroscopy, and spectrograph calibration. Although a nonclassical state from spontaneous parametric down-conversion may serve as a quantum counterpart, its detection and characterization have been a challenging task. Here we demonstrate the direct detection of photon numbers of an ultrabroadband (110 nm FWHM) squeezed state in the telecom band centred at 1535 nm wavelength, using a superconducting transition-edge sensor. The observed photon-number distributions violate Klyshko's criterion for the nonclassicality. From the observed photon-number distribution, we evaluate the second- and third-order correlation functions, and characterize a multimode structure, which implies that several tens of orthonormal modes of squeezing exist in the single optical pulse. Our results and techniques open up a new possibility to generate and characterize frequency-multiplexed nonclassical light sources for quantum info-communications technology. PMID:24694515

  5. Ultrabroadband direct detection of nonclassical photon statistics at telecom wavelength.

    PubMed

    Wakui, Kentaro; Eto, Yujiro; Benichi, Hugo; Izumi, Shuro; Yanagida, Tetsufumi; Ema, Kazuhiro; Numata, Takayuki; Fukuda, Daiji; Takeoka, Masahiro; Sasaki, Masahide

    2014-04-03

    Broadband light sources play essential roles in diverse fields, such as high-capacity optical communications, optical coherence tomography, optical spectroscopy, and spectrograph calibration. Although a nonclassical state from spontaneous parametric down-conversion may serve as a quantum counterpart, its detection and characterization have been a challenging task. Here we demonstrate the direct detection of photon numbers of an ultrabroadband (110 nm FWHM) squeezed state in the telecom band centred at 1535 nm wavelength, using a superconducting transition-edge sensor. The observed photon-number distributions violate Klyshko's criterion for the nonclassicality. From the observed photon-number distribution, we evaluate the second- and third-order correlation functions, and characterize a multimode structure, which implies that several tens of orthonormal modes of squeezing exist in the single optical pulse. Our results and techniques open up a new possibility to generate and characterize frequency-multiplexed nonclassical light sources for quantum info-communications technology.
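
    From a measured photon-number distribution P(n), the second- and third-order correlation functions referred to in both records can be computed directly from the factorial moments. A minimal sketch with a placeholder distribution (the numbers are illustrative, not the paper's data):

        import numpy as np

        p = np.array([0.55, 0.18, 0.15, 0.07, 0.04, 0.01])   # placeholder P(n)
        n = np.arange(len(p))

        mean = np.sum(n * p)
        g2 = np.sum(n * (n - 1) * p) / mean**2               # <n(n-1)> / <n>^2
        g3 = np.sum(n * (n - 1) * (n - 2) * p) / mean**3     # <n(n-1)(n-2)> / <n>^3
        print(f"g2(0) = {g2:.3f}, g3(0) = {g3:.3f}")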

  6. HELIOS: A new open-source radiative transfer code

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Lukas Grimm, Simon; Mendonça, João; Kitzmann, Daniel; Heng, Kevin

    2015-12-01

    I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS uses the GPU's potential for massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results on the following: (i) various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers; (ii) comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra; (iii) benchmarks of TP-profiles & spectra for various elemental abundances; (iv) benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b; (v) comparison with secondary eclipse data for HD189733b, XO-1b & CoRoT-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], an open-source project aimed at providing community tools to model exoplanetary atmospheres. [1] Grimm & Heng 2015, arXiv:1503.03806. [2] Heng, Lyons & Tsai, arXiv:1506.05501; Heng & Lyons, arXiv:1507.01944. [3] e.g. Heng, Mendonça & Lee 2014, ApJS, 215, 4. [4] exoclime.net

  7. Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on

    PubMed Central

    2011-01-01

    Background The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to development of many useful resources freely available to the chemistry community. PMID:21999342

  8. A High-Speed, Real-Time Visualization and State Estimation Platform for Monitoring and Control of Electric Distribution Systems: Implementation and Field Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake; Gotseff, Peter; Giraldez, Julieta

    Continued deployment of renewable and distributed energy resources is fundamentally changing the way that electric distribution systems are controlled and operated; more sophisticated active system control and greater situational awareness are needed. Real-time measurements and distribution system state estimation (DSSE) techniques enable more sophisticated system control and, when combined with visualization applications, greater situational awareness. This paper presents a novel demonstration of a high-speed, real-time DSSE platform and related control and visualization functionalities, implemented using existing open-source software and distribution system monitoring hardware. Live scrolling strip charts of meter data and intuitive annotated map visualizations of the entire state (obtained via DSSE) of a real-world distribution circuit are shown. The DSSE implementation is validated to demonstrate provision of accurate voltage data. This platform allows for enhanced control and situational awareness using only a minimum quantity of distribution system measurement units and modest data and software infrastructure.
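
    State estimation of this kind is commonly posed as a weighted least squares problem. The sketch below shows the classic formulation on a toy linearised 3-bus model; the measurement matrix, readings, and variances are illustrative assumptions, not the platform's actual implementation:

        import numpy as np

        H = np.array([[1.0,  0.0,  0.0],     # measurement matrix: each row maps
                      [1.0, -1.0,  0.0],     # the state (bus voltages) to one
                      [0.0,  1.0, -1.0],     # real-time meter reading
                      [0.0,  0.0,  1.0]])
        z = np.array([1.02, 0.03, 0.02, 0.97])                  # meter readings
        W = np.diag(1.0 / np.array([1e-4, 4e-4, 4e-4, 1e-4]))   # 1/variance weights

        # weighted least squares: x_hat = (H^T W H)^{-1} H^T W z
        x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
        print(np.round(x_hat, 4))   # estimated bus voltages for visualization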

  9. Simulations of negative hydrogen ion sources

    NASA Astrophysics Data System (ADS)

    Demerdjiev, A.; Goutev, N.; Tonev, D.

    2018-05-01

    The development and optimisation of negative hydrogen/deuterium ion sources go hand in hand with modelling. In this paper a brief introduction is given to the physics and types of different sources, and to the kinetic and fluid theories of plasma description. Examples of some recent models are considered, with the main emphasis on the model behind the concept and design of a matrix source of negative hydrogen ions. At the Institute for Nuclear Research and Nuclear Energy of the Bulgarian Academy of Sciences a new cyclotron centre is under construction, which opens new opportunities for research. One of them is the development of plasma sources for additional proton beam acceleration. We have applied the modelling technique implemented in the aforementioned model of the matrix source to a microwave plasma source exemplifying a plasma-filled array of cavities made of a dielectric material with high permittivity. Preliminary results for the distribution of the plasma parameters and the φ component of the electric field in the plasma are obtained.

  10. Open Genetic Code: on open source in the life sciences.

    PubMed

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach of genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  11. The Open Source Teaching Project (OSTP): Research Note.

    ERIC Educational Resources Information Center

    Hirst, Tony

    The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…

  12. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  13. Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness

    ERIC Educational Resources Information Center

    Committee for Economic Development, 2006

    2006-01-01

    Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…

  14. Ray Meta: scalable de novo metagenome assembly and profiling

    PubMed Central

    2012-01-01

    Voluminous parallel sequencing datasets, especially metagenomic experiments, require distributed computing for de novo assembly and taxonomic profiling. Ray Meta is a massively distributed metagenome assembler that is coupled with Ray Communities, which profiles microbiomes based on uniquely-colored k-mers. It can accurately assemble and profile a three billion read metagenomic experiment representing 1,000 bacterial genomes of uneven proportions in 15 hours with 1,024 processor cores, using only 1.5 GB per core. The software will facilitate the processing of large and complex datasets, and will help in generating biological insights for specific environments. Ray Meta is open source and available at http://denovoassembler.sf.net. PMID:23259615
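
    As a rough sketch of the uniquely-colored k-mer idea behind Ray Communities: assign each k-mer a color per reference genome and keep only k-mers seen in exactly one genome. The names, value of k, and sequences below are illustrative, not the project's implementation:

        from collections import defaultdict

        def kmers(seq, k):
            return (seq[i:i + k] for i in range(len(seq) - k + 1))

        genomes = {"genomeA": "ACGTACGTGACCTGAAACGTACGGT",
                   "genomeB": "TTGACCTGAAACGTATTTTGCACGT"}   # placeholder sequences

        colors = defaultdict(set)
        for name, seq in genomes.items():
            for km in kmers(seq, k=11):
                colors[km].add(name)

        unique = {km: next(iter(who)) for km, who in colors.items() if len(who) == 1}
        # reads whose k-mers hit uniquely-colored entries vote for that genome
        print(len(unique), "uniquely-colored k-mers")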

  15. Livermore Big Artificial Neural Network Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Essen, Brian Van; Jacobs, Sam; Kim, Hyojin

    2016-07-01

    LBANN is a toolkit that is designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantage of key High Performance Computing features to accelerate neural network training. Specifically, it is optimized for low-latency, high-bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high-bandwidth parallel file systems. It is built on top of the open source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library, which is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.

  16. MzJava: An open source library for mass spectrometry data processing.

    PubMed

    Horlacher, Oliver; Nikitin, Frederic; Alocci, Davide; Mariethoz, Julien; Müller, Markus; Lisacek, Frederique

    2015-11-03

    Mass spectrometry (MS) is a widely used and evolving technique for the high-throughput identification of molecules in biological samples. The need for sharing and reuse of code among bioinformaticians working with MS data prompted the design and implementation of MzJava, an open-source Java Application Programming Interface (API) for MS related data processing. MzJava provides data structures and algorithms for representing and processing mass spectra and their associated biological molecules, such as metabolites, glycans and peptides. MzJava includes functionality to perform mass calculation, peak processing (e.g. centroiding, filtering, transforming), spectrum alignment and clustering, protein digestion, fragmentation of peptides and glycans as well as scoring functions for spectrum-spectrum and peptide/glycan-spectrum matches. For data import and export MzJava implements readers and writers for commonly used data formats. For many classes support for the Hadoop MapReduce (hadoop.apache.org) and Apache Spark (spark.apache.org) frameworks for cluster computing was implemented. The library has been developed applying best practices of software engineering. To ensure that MzJava contains code that is correct and easy to use, the library's API was carefully designed and thoroughly tested. MzJava is an open-source project distributed under the AGPL v3.0 licence. MzJava requires Java 1.7 or higher. Binaries, source code and documentation can be downloaded from http://mzjava.expasy.org and https://bitbucket.org/sib-pig/mzjava.

  17. The 2015 Bioinformatics Open Source Conference (BOSC 2015)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J. A.; Lapp, Hilmar

    2016-01-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included “Data Science;” “Standards and Interoperability;” “Open Science and Reproducibility;” “Translational Bioinformatics;” “Visualization;” and “Bioinformatics Open Source Project Updates”. In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled “Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community,” that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653

  18. Autocorrel I: A Neural Network Based Network Event Correlation Approach

    DTIC Science & Technology

    2005-05-01

    which concern any component of the network. 2.1.1 Existing Intrusion Detection Systems EMERALD [8] is a distributed, scalable, hierarchical, customizable...writing this paper, the updaters of this system had not released their correlation unit to the public. EMERALD explicitly divides statistical analysis...EMERALD, NetSTAT is scalable and composable. QuidSCOR [12] is an open-source IDS, though it requires a subscription from its publisher, Qualys Inc

  19. Integrating Distributed Interactive Simulations With the Project Darkstar Open-Source Massively Multiplayer Online Game (MMOG) Middleware

    DTIC Science & Technology

    2009-09-01

    be complete MMOG solutions such as Multiverse are not within the scope of this thesis, though it is recommended that readers compare this type of...software to the middleware described here (Multiverse, 2009). 1. University of Munster: Real-Time Framework The Real-Time Framework (RTF) project is...10, 2009, from http://wiki.secondlife.com/wiki/MMOX Multiverse. (2009). Multiverse platform architecture. Retrieved September 9, 2009, from http

  20. Proceedings of the Very Low Frequency (VLF) Ambient Noise Workshop Held in Stennis, Mississippi on 20-21 September 1988

    DTIC Science & Technology

    1988-12-01

    20-21 September 1988 (This document contains the unclassified presentations only.) Compiled by: Ronald A. Wagstaff. Contents include: IV. Summary; V. Presentations; "Opening Remarks," R. Wagstaff; "Influence of Noise Source Distributions and Propagation Mechanisms on Noise Field Directionality," M. Bradley/R. Wagstaff; "Distant Storm

  1. China Report, Economic Affairs, Energy: Status and Development -- XXVI

    DTIC Science & Technology

    1984-04-09

    better technology for open-cut mining; renovating 9 railway trunk lines while building 8 new ones, bringing the total number in the area up to 20 and...CHINA'S SOLAR ENERGY RESOURCE DISTRIBUTION ANALYZED, Beijing TAIYANGNENG XUEBAO [ACTA ENERGIAE SOLARIS SINICA]...Calculation in Our Country, TAIYANGNENG XUEBAO [ACTA ENERGIAE SOLARIS SINICA], Vol 1, 1980, pp 1-9.

  2. Database Entity Persistence with Hibernate for the Network Connectivity Analysis Model

    DTIC Science & Technology

    2014-04-01

    time savings in the Java coding development process. Appendices A and B describe setup procedures for installing the MySQL database...development environment is required: the open source MySQL Database Management System (DBMS) from Oracle, which is a Java Database Connectivity (JDBC)-compliant DBMS; the MySQL JDBC Driver library that comes as a plug-in with the NetBeans distribution; the latest Java Development Kit with the latest

  3. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721

  4. Parallel, Distributed Scripting with Python

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, P J

    2002-05-24

    Parallel computers used to be, for the most part, one-of-a-kind systems which were extremely difficult to program portably. With SMP architectures, the advent of the POSIX thread API and OpenMP gave developers ways to portably exploit on-the-box shared memory parallelism. Since these architectures didn't scale cost-effectively, distributed memory clusters were developed. The associated MPI message passing libraries gave these systems a portable paradigm too. Having programmers effectively use this paradigm is a somewhat different question. Distributed data has to be explicitly transported via the messaging system in order for it to be useful. In high level languages, the MPI library gives access to data distribution routines in C, C++, and FORTRAN. But we need more than that. Many reasonable and common tasks are best done in (or as extensions to) scripting languages. Consider sysadmin tools such as password crackers, file purgers, etc. These are simple to write in a scripting language such as Python (an open source, portable, and freely available interpreter). But these tasks beg to be done in parallel. Consider a password checker that checks an encrypted password against a 25,000 word dictionary. This can take around 10 seconds in Python (6 seconds in C). It is trivial to parallelize if you can distribute the information and coordinate the work.
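
    A minimal sketch of that dictionary check, parallelised here with the standard library's multiprocessing module rather than MPI; the hash scheme and word list are placeholder assumptions:

        import hashlib
        from multiprocessing import Pool

        TARGET = hashlib.sha256(b"hunter2").hexdigest()   # placeholder "encrypted" pw
        WORDS = ["password", "letmein", "hunter2", "qwerty"] * 6250  # ~25,000 words

        def check_chunk(words):
            return [w for w in words
                    if hashlib.sha256(w.encode()).hexdigest() == TARGET]

        if __name__ == "__main__":
            n = 4
            chunks = [WORDS[i::n] for i in range(n)]      # distribute the dictionary
            with Pool(n) as pool:
                hits = [w for part in pool.map(check_chunk, chunks) for w in part]
            print(sorted(set(hits)))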

  5. A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data

    DOE PAGES

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...

    2016-01-01

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
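
    As a toy illustration of the kind of Bayesian fusion PDT performs, the sketch below runs a conjugate normal update of an expert-judgment prior with placeholder survey observations; every number is an illustrative assumption, not PDT's actual model:

        import numpy as np

        mu0, var0 = 2.5, 1.0**2      # prior: people per 1000 ft^2 for one building type
        obs = np.array([3.1, 2.2, 2.8, 3.4])   # placeholder survey measurements
        var_obs = 0.5**2                        # assumed measurement variance

        n = len(obs)
        var_post = 1.0 / (1.0 / var0 + n / var_obs)
        mu_post = var_post * (mu0 / var0 + obs.sum() / var_obs)
        # the posterior retains the uncertainty: both the estimate and its spread
        print(f"posterior: {mu_post:.2f} +/- {np.sqrt(var_post):.2f} people/1000 ft^2")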

  6. The 2016 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.

  7. Beyond Open Source: According to Jim Hirsch, Open Technology, Not Open Source, Is the Wave of the Future

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    This article presents an interview with Jim Hirsch, an associate superintendent for technology at Plano Independent School District in Plano, Texas. Hirsch serves as a liaison for the open technologies committee of the Consortium for School Networking. In this interview, he shares his opinion on the significance of open source in K-12.

  8. High abundances of oxalic, azelaic, and glyoxylic acids and methylglyoxal in the open ocean with high biological activity: Implication for secondary OA formation from isoprene

    NASA Astrophysics Data System (ADS)

    Bikkina, Srinivas; Kawamura, Kimitaka; Miyazaki, Yuzo; Fu, Pingqing

    2014-05-01

    Atmospheric dicarboxylic acids (DCA) are a ubiquitous water-soluble component of secondary organic aerosols (SOA), which can act as cloud condensation nuclei (CCN), affecting the Earth's climate. Despite the high abundances of oxalic acid and related compounds in the marine aerosols, there is no consensus on what controls their distributions over the open ocean. Marine biological productivity could play a role in the production of DCA, but there is no substantial evidence to support this hypothesis. Here we present latitudinal distributions of DCA, oxoacids and α-dicarbonyls in the marine aerosols from the remote Pacific. Their concentrations were found several times higher in more biologically influenced aerosols (MBA) than less biologically influenced aerosols. We propose isoprene and unsaturated fatty acids as sources of DCA as inferred from significantly higher abundances of isoprene-SOA tracers and azelaic acid in MBA. These results have implications toward the reassessment of climate forcing feedbacks of marine-derived SOA.

  9. Final Report for the Development of the NASA Technical Report Server (NTRS)

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.

    2005-01-01

    The author performed a variety of research, development and consulting tasks for NASA Langley Research Center in the area of digital libraries (DLs) and supporting technologies, such as the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). In particular, the development focused on the NASA Technical Report Server (NTRS) and its transition from a distributed searching model to one that uses the OAI-PMH. The Open Archives Initiative (OAI) is an international consortium focused on furthering the interoperability of DLs through the use of "metadata harvesting". The OAI-PMH version of NTRS went into public production on April 28, 2003. Since that time, it has been extremely well received. In addition to providing the NTRS user community with a higher level of service than the previous, distributed searching version of NTRS, it has provided more insight into how the user community uses NTRS in a variety of deployment scenarios. This report details the design, implementation and maintenance of the NTRS. Source code is included in the appendices.

  10. Integrating Socioeconomic and Earth Science Data Using Geobrowsers and Web Services: A Demonstration

    NASA Astrophysics Data System (ADS)

    Schumacher, J. A.; Yetman, G. G.

    2007-12-01

    The societal benefit areas identified as the focus for the Global Earth Observing System of Systems (GEOSS) 10-year implementation plan are an indicator of the importance of integrating socioeconomic data with earth science data to support decision makers. To aid this integration, CIESIN is delivering its global and U.S. demographic data to commercial and open source Geobrowsers and providing open standards based services for data access. Currently, data on population distribution, poverty, and detailed census data for the U.S. are available for visualization and access in Google Earth, NASA World Wind, and a browser-based 2-dimensional mapping client. The mapping client allows for the creation of web map documents that pull together layers from distributed servers and can be saved and shared. Visualization tools with Geobrowsers, user-driven map creation and sharing via browser-based clients, and a prototype for characterizing populations at risk to predicted precipitation deficits will be demonstrated.

  11. Open-cavity fiber laser with distributed feedback based on externally or self-induced dynamic gratings.

    PubMed

    Lobach, Ivan A; Drobyshev, Roman V; Fotiadi, Andrei A; Podivilov, Evgeniy V; Kablukov, Sergey I; Babin, Sergey A

    2017-10-15

    Dynamic population inversion gratings induced in an active medium by counter-propagating optical fields may have a reverse effect on the writing laser radiation via the feedback they provide. In this Letter we report, to the best of our knowledge, on the first demonstration of an open-cavity fiber laser in which the distributed feedback is provided by a dynamic grating "written" in a Yb-doped active fiber, either by an external source or self-induced via a weak (∼0.1%) reflection from an angle-cleaved fiber end. It has been shown that a meters-long dynamic grating is formed with a narrow bandwidth (<50 MHz) and a relatively high reflection coefficient (>7%) securing single-frequency operation, but the subsequent hole-burning effects, accompanied by new grating formation, lead to switching from one longitudinal mode to another, providing regular pulse-mode dynamics. As a result, periodically generated pulse trains cover a spectral range of several terahertz, delivering millions of cavity modes in subsequent pulses.

  12. PAPARA(ZZ)I: An open-source software interface for annotating photographs of the deep-sea

    NASA Astrophysics Data System (ADS)

    Marcon, Yann; Purser, Autun

    PAPARA(ZZ)I is a lightweight and intuitive image annotation program developed for the study of benthic megafauna. It offers functionalities such as free, grid and random point annotation. Annotations may be made following existing classification schemes for marine biota and substrata or with the use of user defined, customised lists of keywords, which broadens the range of potential application of the software to other types of studies (e.g. marine litter distribution assessment). If Internet access is available, PAPARA(ZZ)I can also query and use standardised taxa names directly from the World Register of Marine Species (WoRMS). Program outputs include abundances, densities and size calculations per keyword (e.g. per taxon). These results are written into text files that can be imported into spreadsheet programs for further analyses. PAPARA(ZZ)I is open-source and is available at http://papara-zz-i.github.io. Compiled versions exist for most 64-bit operating systems: Windows, Mac OS X and Linux.

  13. Apis - a Digital Inventory of Archaeological Heritage Based on Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Doneus, M.; Forwagner, U.; Liem, J.; Sevara, C.

    2017-08-01

    Heritage managers are in need of dynamic spatial inventories of archaeological and cultural heritage that provide them with multipurpose tools to interactively understand information about archaeological heritage within its landscape context. Specifically, linking site information with the respective non-invasive prospection data is of increasing importance, as it allows for the assessment of inherent uncertainties related to the use and interpretation of remote sensing data by the educated and knowledgeable heritage manager. APIS, the archaeological prospection information system of the Aerial Archive of the University of Vienna, is specifically designed to meet these needs. It provides storage and easy access to all data concerning aerial photographs and archaeological sites through a single GIS-based application. Furthermore, APIS has been developed in an open source environment, which allows it to be freely distributed and modified. This combination in one single open source system facilitates an easy workflow for data management, interpretation, storage, and retrieval. APIS and a sample dataset will be released free of charge under a Creative Commons license in the near future.

  14. HERAFitter: Open source QCD fit project

    DOE PAGES

    Alekhin, S.; Behnke, O.; Belov, P.; ...

    2015-07-01

    HERAFitter is an open-source package that provides a framework for the determination of the parton distribution functions (PDFs) of the proton and for many different kinds of analyses in Quantum Chromodynamics (QCD). It encodes results from a wide range of experimental measurements in lepton-proton deep inelastic scattering and proton-proton (proton-antiproton) collisions at hadron colliders. These are complemented with a variety of theoretical options for calculating PDF-dependent cross section predictions corresponding to the measurements. The framework covers a large number of the existing methods and schemes used for PDF determination. The data and theoretical predictions are brought together through numerous methodological options for carrying out PDF fits and plotting tools to help visualise the results. While primarily based on the approach of collinear factorisation, HERAFitter also provides facilities for fits of dipole models and transverse-momentum dependent PDFs. The package can be used to study the impact of new precise measurements from hadron colliders. This paper describes the general structure of HERAFitter and its wide choice of options.

  15. ScreenMasker: An Open-source Gaze-contingent Screen Masking Environment.

    PubMed

    Orlov, Pavel A; Bednarik, Roman

    2016-09-01

    The moving-window paradigm, based on gaze-contingent techniques, is traditionally used in studies of the visual perceptual span. There is a strong demand for new environments that can be employed by non-technical researchers. We have developed an easy-to-use tool with a graphical user interface (GUI) allowing both execution and control of visual gaze-contingency studies. This work describes ScreenMasker, an environment for creating gaze-contingent textured displays used together with stimulus presentation software. ScreenMasker has an architecture that meets the requirements of low-latency real-time eye-movement experiments. It also provides a variety of settings and functions. Effective rendering times and performance are ensured by means of GPU processing under CUDA technology. Performance tests show ScreenMasker's latency to be 67-74 ms on a typical office computer, and high-end 144-Hz screen latencies of about 25-28 ms. ScreenMasker is an open-source system distributed under the GNU Lesser General Public License and is available at https://github.com/PaulOrlov/ScreenMasker.

  16. EMISSIONS OF ORGANIC AIR TOXICS FROM OPEN ...

    EPA Pesticide Factsheets

    A detailed literature search was performed to collect and collate available data reporting emissions of toxic organic substances into the air from open burning sources. Availability of data varied according to the source and the class of air toxics of interest. Volatile organic compound (VOC) and polycyclic aromatic hydrocarbon (PAH) data were available for many of the sources. Data on semivolatile organic compounds (SVOCs) that are not PAHs were available for several sources. Carbonyl and polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofuran (PCDD/F) data were available for only a few sources. There were several sources for which no emissions data were available at all. Several observations were made including: 1) Biomass open burning sources typically emitted less VOCs than open burning sources with anthropogenic fuels on a mass emitted per mass burned basis, particularly those where polymers were concerned; 2) Biomass open burning sources typically emitted less SVOCs and PAHs than anthropogenic sources on a mass emitted per mass burned basis. Burning pools of crude oil and diesel fuel produced significant amounts of PAHs relative to other types of open burning. PAH emissions were highest when combustion of polymers was taking place; and 3) Based on very limited data, biomass open burning sources typically produced higher levels of carbonyls than anthropogenic sources on a mass emitted per mass burned basis, probably due to oxygenated structures r

  17. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes, including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will engender the development of additional model drivers by 3rd party scientists.
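
    The driver pattern described above can be sketched in a few lines. The following is a conceptual Python analogue of the MEF-based plug-in design; the class and method names are invented for illustration and are not MAD-GIS's actual interface:

        from abc import ABC, abstractmethod

        class ForwardModelDriver(ABC):
            """Contract the MAD core uses to talk to any external forward model."""

            @abstractmethod
            def write_inputs(self, parameter_field): ...

            @abstractmethod
            def run(self): ...

            @abstractmethod
            def read_outputs(self): ...

        class ModflowDriver(ForwardModelDriver):
            """Illustrative adapter; a real driver would call the MODFLOW executable."""

            def write_inputs(self, parameter_field):
                print("writing input files for", len(parameter_field), "cells")

            def run(self):
                print("launching forward model")   # e.g. via subprocess.run(...)

            def read_outputs(self):
                return {"heads": []}               # simulated observations

        def mad_iteration(driver, field):
            driver.write_inputs(field)
            driver.run()
            return driver.read_outputs()

        print(mad_iteration(ModflowDriver(), [0.1, 0.2, 0.3]))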

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borisova, Elena; Lilly, Simon J.; Cantalupo, Sebastiano

    A toy model is developed to understand how the spatial distribution of fluorescent emitters in the vicinity of bright quasars could be affected by the geometry of the quasar bi-conical radiation field and by its lifetime. The model is then applied to the distribution of high-equivalent-width Ly α emitters (with rest-frame equivalent widths above 100 Å, a threshold used in, e.g., Trainor and Steidel) identified in a deep narrow-band 36 × 36 arcmin² image centered on the luminous quasar Q0420–388. These emitters are found near the edge of the field and show some evidence of an azimuthal asymmetry on the sky of the type expected if the quasar is radiating in a bipolar cone. If these sources are being fluorescently illuminated by the quasar, the two most distant objects require a lifetime of at least 15 Myr for an opening angle of 60° or more, increasing to more than 40 Myr if the opening angle is reduced to a minimum of 30°. However, some other expected signatures of boosted fluorescence are not seen at the current survey limits, e.g., a fall off in Ly α brightness, or equivalent width, with distance. Furthermore, for most of the Ly α emission of the two distant sources to be fluorescently boosted would require the quasar to have been significantly brighter in the past. This suggests that these particular sources may not be fluorescent, invalidating the above lifetime constraints. This would cast doubt on the use of this relatively low equivalent width threshold and thus also on the lifetime analysis in Trainor and Steidel.

  19. VoIP attacks detection engine based on neural network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Slachta, Jiri

    2015-05-01

Security is crucial for any system nowadays, especially communications. One of the most successful protocols in the field of communication over IP networks is the Session Initiation Protocol (SIP). It is an open protocol used by many kinds of applications, both open-source and proprietary. High penetration and its text-based principle have made SIP the number one target in IP telephony infrastructure, so the security of SIP servers is essential. To keep up with hackers and to detect potential malicious attacks, a security administrator needs to monitor and evaluate SIP traffic in the network. But monitoring and subsequent evaluation can easily overwhelm the security administrator, typically in networks with many SIP servers and users and with logically or geographically separated segments. The proposed solution lies in automatic attack detection systems. The article covers the detection of VoIP attacks through a distributed network of nodes; the gathered data are then analyzed by an aggregation server with an artificial neural network, namely a multilayer perceptron trained with a set of collected attacks. Attack data can also be preprocessed and verified with a self-organizing map. The source data are collected by the distributed network of detection nodes, each of which contains a honeypot application and a traffic monitoring mechanism. Aggregating the data from each node creates the input for the neural network. Automatic classification on a centralized server with a low false positive rate reduces the cost of attack detection resources. The detection system uses a modular design for easy deployment in the final infrastructure. The centralized server collects and processes the detected traffic, and also maintains all detection nodes.
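
    As a toy illustration of the classification stage, the following Python sketch trains a multilayer perceptron on synthetic node-level traffic features (the feature set and its statistics are invented for the example; the article's system trains on features gathered from honeypots and traffic monitors):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-node features aggregated from honeypots and traffic
# monitors: [requests/s, distinct source IPs, REGISTER fraction, error rate].
n = 600
benign = rng.normal([5, 10, 0.2, 0.01], [2, 4, 0.05, 0.01], size=(n, 4))
attack = rng.normal([80, 3, 0.9, 0.30], [20, 2, 0.05, 0.10], size=(n, 4))
X = np.vstack([benign, attack])
y = np.array([0] * n + [1] * n)  # 0 = benign, 1 = attack

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# A multilayer perceptron, as used by the aggregation server in the article.
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```

    A self-organizing map, as mentioned in the abstract, could sit in front of such a classifier to pre-cluster and sanity-check the training set.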

  20. Breakthrough in 4π ion emission mechanism understanding in plasma focus devices.

    PubMed

    Sohrabi, Mehdi; Zarinshad, Arefe; Habibi, Morteza

    2016-12-12

Ion emission angular distribution mechanisms in plasma focus devices (PFDs) have not yet been well understood, due to the lack of an efficient wide-angle ion distribution image detection system able to characterize a PFD space in detail. The present belief is that the acceleration of ions points upwards from the "anode top" in the forward direction within a small solid angle. A breakthrough is reported in this study, made possible by newly invented mega-size position-sensitive polycarbonate ion image detection systems: the discovery of 4π ion emission from the "anode top" in a PFD space after plasma pinch instability, and of radial run-away of ions from the "anode-cathodes array" during axial acceleration of the plasma sheaths before the radial phase. These two ion emission source mechanisms behave respectively as a "Point Ion Source" and a "Line Ion Source", forming "Ion Cathode Shadows" on the mega-size detectors. We believe that the inventions and discoveries made here will open new horizons for advanced ion emission studies towards a better understanding of these mechanisms and, in particular, will promote efficient applications of PFDs in medicine, science and technology.

  1. The hobbyist phenomenon in physical security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michaud, E. C.

Pro-Ams (professional amateurs) are groups of people who work on a problem as amateurs or unpaid persons in a given field at professional levels of competence. Astronomy is a good example of Pro-Am activity. At Galaxy Zoo, Pro-Ams evaluate data generated by professional observatories and are able to classify the millions of galaxies that have been observed but not yet classified, reporting their findings at professional levels for fun. To allow the archiving of these galaxies, the website has been engineered so that the public can view and classify galaxies even if they are not professional astronomers. In this endeavor, it has been found that amateurs can easily outperform automated vision systems. Today in the world of physical security, Pro-Ams are playing an ever-increasing role. Traditionally, locksmiths, corporations, and government organizations have been largely responsible for developing standards, uncovering vulnerabilities, and devising best security practices. Increasingly, however, nonprofit sporting organizations and clubs are doing this. They can be found all over the world, from Europe to the US and now South East Asia. Examples include TOOOL (The Open Organization of Lockpickers), the Longhorn Lockpicking Club, and Sportsfreunde der Sperrtechnik - Deutschland e.V., though there are many others. Members of these groups get together weekly to discuss many elements of security, with some groups specializing in specific areas. When members are asked why they participate in these hobbyist groups, they usually reply (with gusto) that they do it for fun, and that they view defeating locks and other security devices as an interesting and entertaining puzzle. A lot of what happens at these clubs would not be possible if it weren't for 'Super Abundance', the ability to easily acquire (at little or no cost) the products, security tools, technologies, and intellectual resources traditionally limited to corporations, government organizations, or wealthy individuals. With this new access come new discoveries. For example, hobbyist sport lockpicking groups discovered - and publicized - a number of new vulnerabilities between 2004 and 2009 that resulted in the majority of high-security lock manufacturers having to make changes and improvements to their products. A decade ago, amateur physical security discoveries were rare, at least those discussed publicly. In the interim, Internet sites such as lockpicking.org, lockpicking101.com and others have provided an online meeting place for people to trade tips, find friends with similar interests, and develop tools. The open, public discussion of software vulnerabilities, in contrast, has been going on for a long time. These two industries, physical security and software, have very different upgrade mechanisms. With software, a patch can typically be deployed quickly to fix a serious vulnerability, whereas a hardware fix for a physical security device or system can take upwards of months to implement in the field, especially if (as is often the case) hardware integrators are involved. Even when responding to publicly announced security vulnerabilities, manufacturers of physical security devices such as locks, intrusion detectors, or access control devices rarely view hobbyists as a positive resource. This is most unfortunate. In the field of software, it is common to speak of Open Source versus Closed Source.
An Open Source software company may choose to distribute its software under a particular license and give it away openly, with full details and all the lines of source code made available. Linux is a very popular example of this. A Closed Source company, in contrast, chooses not to reveal its source code and licenses its software products in a restrictive manner. Slowly, the idea of Open Source is now coming to the world of physical security. In the case of locks, it provides an alternative to the traditional Closed Source world of locksmiths. Locks are physical objects and can therefore be disassembled; as such, they have always been Open Source in a limited sense. Secrecy, in fact, is very difficult to maintain for a lock that is widely distributed. Having direct access to the lock design provides the hobbyist with a very open environment for finding security flaws, even if the lock manufacturer attempts to follow a Closed Source model. It is clear that the field of physical security is going the digital route, with companies such as Medeco, Mul-T-Lock, and Abloy manufacturing electromechanical locks. Various companies have already begun to add microcontrollers, cryptographic chip sets, solid-state sensors, and a number of other high-tech improvements to their product lineups in an effort to thwart people from defeating their security products.

  2. Open-Source 3D-Printable Optics Equipment

    PubMed Central

    Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.

    2013-01-01

Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science and expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost, public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs in an open-source computer-aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated for the control of optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted, customizable designs. The cost reductions are in general over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods. PMID:23544104

  3. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science and expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost, public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs in an open-source computer-aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated for the control of optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted, customizable designs. The cost reductions are in general over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods.

  4. Aerostat-Lofted Instrument Platform and Sampling Method for Determination of Emissions from Open Area Sources

    EPA Science Inventory

    Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...

  5. Learning from Heterogeneous Data Sources: An Application in Spatial Proteomics

    PubMed Central

    Breckels, Lisa M.; Holden, Sean B.; Wojnar, David; Mulvey, Claire M.; Christoforou, Andy; Groen, Arnoud; Trotter, Matthew W. B.; Kohlbacher, Oliver; Lilley, Kathryn S.; Gatto, Laurent

    2016-01-01

Sub-cellular localisation of proteins is an essential post-translational regulatory mechanism that can be assayed using high-throughput mass spectrometry (MS). These MS-based spatial proteomics experiments enable us to pinpoint the sub-cellular distribution of thousands of proteins in a specific system under controlled conditions. Recent advances in high-throughput MS methods have yielded a plethora of experimental spatial proteomics data for the cell biology community. Yet there are many third-party data sources, such as immunofluorescence microscopy or protein annotations and sequences, which represent a rich and vast source of complementary information. We present a unique transfer learning classification framework, utilising a nearest-neighbour or support vector machine system, that integrates heterogeneous data sources to considerably improve both the quantity and quality of sub-cellular protein assignment. We demonstrate the utility of our algorithms through the evaluation of five experimental datasets, from four different species, in conjunction with four different auxiliary data sources, classifying proteins to tens of sub-cellular compartments with high generalisation accuracy. We further apply the method to an experiment on pluripotent mouse embryonic stem cells to classify a set of previously unknown proteins, and validate our findings against a recent high-resolution map of the mouse stem cell proteome. The methodology is distributed as part of the open-source Bioconductor pRoloc suite for spatial proteomics data analysis. PMID:27175778
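
    The pRoloc transfer-learning algorithms are distributed in R/Bioconductor; the Python sketch below only illustrates the underlying idea of weighting a primary and an auxiliary data source in a nearest-neighbour classifier (the data are synthetic, and the single global weight theta stands in for the per-class weights the framework actually optimizes):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier

# Two hypothetical views of the same proteins: primary MS profiles and an
# auxiliary source (e.g. GO-derived features). Rows are aligned per protein.
X_primary, y = make_classification(n_samples=400, n_features=20,
                                   n_informative=8, n_classes=4,
                                   n_clusters_per_class=1, random_state=1)
X_aux = X_primary[:, :10] + np.random.default_rng(1).normal(0, 2.0, (400, 10))

train, test = np.arange(300), np.arange(300, 400)

knn_p = KNeighborsClassifier(5).fit(X_primary[train], y[train])
knn_a = KNeighborsClassifier(5).fit(X_aux[train], y[train])

theta = 0.7  # weight on the primary source; pRoloc optimizes this per class
proba = theta * knn_p.predict_proba(X_primary[test]) \
      + (1 - theta) * knn_a.predict_proba(X_aux[test])
pred = proba.argmax(axis=1)
print("combined accuracy:", (pred == y[test]).mean())
```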

  6. The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software

    PubMed Central

    Ackerman, Michael J.; Yoo, Terry S.

    2003-01-01

    From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278

  7. FORM version 4.0

    NASA Astrophysics Data System (ADS)

    Kuipers, J.; Ueda, T.; Vermaseren, J. A. M.; Vollinga, J.

    2013-05-01

We present version 4.0 of the symbolic manipulation system FORM. The most important new features are the manipulation of rational polynomials and the factorization of expressions. Many other new functions and commands have also been added; some of them are very general, while others are designed for building specific high-level packages, such as one for Gröbner bases. Also new is the checkpoint facility, which allows for periodic backups during long calculations. Finally, FORM 4.0 has become available as open source under the GNU General Public License version 3. Program summary: Program title: FORM. Catalogue identifier: AEOT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 151 599. No. of bytes in distributed program, including test data, etc.: 1 078 748. Distribution format: tar.gz. Programming language: the FORM language; FORM itself is programmed in a mixture of C and C++. Computer: all. Operating system: UNIX, Linux, Mac OS, Windows. Classification: 5. Nature of problem: FORM defines a symbolic manipulation language in which the emphasis lies on fast processing of very large formulas. It has been used successfully for many calculations in quantum field theory and mathematics. In speed and in the size of formulas that can be handled, it outperforms other systems typically by an order of magnitude. Special features in this version: version 4.0 contains many new features, most importantly factorization and rational arithmetic; the program has also become open source under the GPL. Solution method: see "Nature of problem", above. Additional comments: NOTE: the code in CPC is for reference. You are encouraged to download the most recent sources from www.nikhef.nl/form/formcvs.php because of frequent bug fixes.

  8. Coordinated Control Method of Voltage and Reactive Power for Active Distribution Networks Based on Soft Open Point

    DOE PAGES

    Li, Peng; Ji, Haoran; Wang, Chengshan; ...

    2017-03-22

The increasing penetration of distributed generators (DGs) exacerbates the risk of voltage violations in active distribution networks (ADNs). Conventional voltage regulation devices, limited by their physical constraints, have difficulty meeting the requirement of real-time voltage and VAR control (VVC) with high precision when DGs fluctuate frequently. A soft open point (SOP), however, is a flexible power electronic device that can be used as a continuous reactive power source to realize fast voltage regulation. Considering the cooperation of the SOP and multiple regulation devices, this paper proposes a coordinated VVC method based on SOP for ADNs. First, a time-series model of coordinated VVC is developed to minimize operation costs and eliminate voltage violations in ADNs. Then, by applying linearization and conic relaxation, the original nonconvex mixed-integer nonlinear optimization model is converted into a mixed-integer second-order cone programming (MISOCP) model that can be solved efficiently enough to meet the rapidity requirement of voltage regulation. Case studies on the IEEE 33-node and IEEE 123-node systems illustrate the effectiveness of the proposed method.
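
    To give a flavour of the conic formulation, the sketch below solves a heavily simplified continuous relaxation in Python with cvxpy: SOP reactive-power set-points are chosen to minimize voltage deviations under a second-order cone converter-capacity constraint. The voltage-sensitivity matrix and network data are invented for the example, and the integer devices (tap changers, capacitor banks) that make the paper's full model an MISOCP are omitted:

```python
import cvxpy as cp
import numpy as np

# Toy continuous relaxation: two SOP terminals inject reactive power q to pull
# node voltages toward 1.0 p.u. The sensitivity matrix S (dV/dQ) is invented.
nbus, nsop = 6, 2
rng = np.random.default_rng(0)
S = np.abs(rng.normal(0.02, 0.005, (nbus, nsop)))    # p.u. volts per unit VAr
v0 = np.array([1.06, 1.05, 0.93, 0.95, 1.00, 0.94])  # voltages before control
p_fixed = np.array([0.5, -0.5])                      # scheduled active transfer
s_rated = 1.0                                        # converter MVA rating

q = cp.Variable(nsop)
v = v0 + S @ q                                       # linearized voltage model

# Converter capacity ||(p_i, q_i)||_2 <= s_rated: a second-order cone per SOP.
capacity = [cp.norm(cp.hstack([p_fixed[i], q[i]])) <= s_rated
            for i in range(nsop)]

prob = cp.Problem(cp.Minimize(cp.sum_squares(v - 1.0)), capacity)
prob.solve()
print("SOP reactive set-points:", np.round(q.value, 3))
print("voltages after control: ", np.round(v.value, 3))
```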

  9. Coordinated Control Method of Voltage and Reactive Power for Active Distribution Networks Based on Soft Open Point

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Peng; Ji, Haoran; Wang, Chengshan

The increasing penetration of distributed generators (DGs) exacerbates the risk of voltage violations in active distribution networks (ADNs). Conventional voltage regulation devices, limited by their physical constraints, have difficulty meeting the requirement of real-time voltage and VAR control (VVC) with high precision when DGs fluctuate frequently. A soft open point (SOP), however, is a flexible power electronic device that can be used as a continuous reactive power source to realize fast voltage regulation. Considering the cooperation of the SOP and multiple regulation devices, this paper proposes a coordinated VVC method based on SOP for ADNs. First, a time-series model of coordinated VVC is developed to minimize operation costs and eliminate voltage violations in ADNs. Then, by applying linearization and conic relaxation, the original nonconvex mixed-integer nonlinear optimization model is converted into a mixed-integer second-order cone programming (MISOCP) model that can be solved efficiently enough to meet the rapidity requirement of voltage regulation. Case studies on the IEEE 33-node and IEEE 123-node systems illustrate the effectiveness of the proposed method.

  10. An Inverse Modeling Plugin for HydroDesktop using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio, C.; Over, M. W.; Rubin, Y.

    2011-12-01

The CUAHSI Hydrologic Information System (HIS) software stack is based on an open and extensible architecture that facilitates the addition of new functions and capabilities at both the server side (using HydroServer) and the client side (using HydroDesktop). The HydroDesktop client plugin architecture is used here to expose a new scripting-based plugin that makes use of the R statistics software as a means for conducting inverse modeling using the Method of Anchored Distributions (MAD). MAD is a Bayesian inversion technique for conditioning computational model parameters on relevant field observations, yielding probabilistic distributions of the model parameters, related to the spatial random variable of interest, by assimilating multi-type and multi-scale data. The implementation of a desktop software tool for using the MAD technique is expected to significantly lower the barrier to the use of inverse modeling in education, research, and resource management. The HydroDesktop MAD plugin is being developed following a community-based, open-source approach that will help both its adoption and its long-term sustainability as a user tool. This presentation will briefly introduce MAD, HydroDesktop, and the MAD plugin and software development effort.

  11. The 2016 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science. PMID:27781083

  12. SiGN-SSM: open source parallel software for estimating gene networks with state space models.

    PubMed

    Tamada, Yoshinori; Yamaguchi, Rui; Imoto, Seiya; Hirose, Osamu; Yoshida, Ryo; Nagasaki, Masao; Miyano, Satoru

    2011-04-15

SiGN-SSM is open-source gene network estimation software able to run in parallel on PCs and massively parallel supercomputers. The software estimates a state space model (SSM), a statistical dynamic model suitable for analyzing short time-series and/or replicated time-series gene expression profiles. SiGN-SSM implements a novel parameter constraint that is effective in stabilizing the estimated models. In addition, by using a supercomputer, it is able to determine the gene network structure by a statistical permutation test in a practical time. SiGN-SSM is applicable not only to analyzing temporal regulatory dependencies between genes, but also to extracting differentially regulated genes from time-series expression profiles. SiGN-SSM is distributed under the GNU Affero General Public License (GNU AGPL) version 3 and can be downloaded at http://sign.hgc.jp/signssm/. Pre-compiled binaries for some architectures are available in addition to the source code; pre-installed binaries are also available on the Human Genome Center supercomputer system. The online manual and the supplementary information for SiGN-SSM are available on our web site. Contact: tamada@ims.u-tokyo.ac.jp.
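
    For readers unfamiliar with state space models, the Python sketch below simulates and filters a minimal linear-Gaussian SSM with a textbook Kalman filter; it is a conceptual illustration only, not SiGN-SSM's estimation procedure (which fits the model parameters themselves, in parallel), and all dimensions are illustrative:

```python
import numpy as np

# Linear-Gaussian SSM: x_t = F x_{t-1} + w_t,  y_t = H x_t + v_t.
rng = np.random.default_rng(0)
k, p, T = 2, 4, 30          # hidden states, observed genes, time points
F = np.array([[0.9, 0.1], [0.0, 0.8]])
H = rng.normal(size=(p, k))
Q, R = 0.05 * np.eye(k), 0.2 * np.eye(p)

# Simulate a short expression time series.
x, y = np.zeros((T, k)), np.zeros((T, p))
for t in range(1, T):
    x[t] = F @ x[t - 1] + rng.multivariate_normal(np.zeros(k), Q)
    y[t] = H @ x[t] + rng.multivariate_normal(np.zeros(p), R)

# Standard Kalman filter over the simulated observations.
m, P = np.zeros(k), np.eye(k)
for t in range(T):
    m, P = F @ m, F @ P @ F.T + Q                        # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)         # Kalman gain
    m, P = m + K @ (y[t] - H @ m), (np.eye(k) - K @ H) @ P  # update
print("final filtered hidden state:", np.round(m, 3))
```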

  13. ENKI - An Open Source environmental modelling platfom

    NASA Astrophysics Data System (ADS)

    Kolberg, S.; Bruland, O.

    2012-04-01

The ENKI software framework for implementing spatio-temporal models is now released under the LGPL license. Originally developed for the evaluation and comparison of distributed hydrological model compositions, ENKI can be used to simulate any time-evolving process over a spatial domain. The core approach is to connect a set of user-specified subroutines into a complete simulation model, and to provide all the administrative services needed to calibrate and run that model. This includes functionality for geographical region setup, all file I/O, calibration, uncertainty estimation, etc. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines and various model compositions within a fixed framework. The open-source license and modular design of ENKI will also facilitate the rapid dissemination of new methods to institutions engaged in operational water resource management. ENKI uses a plug-in structure to invoke separately compiled subroutines, built as dynamic-link libraries (DLLs). The source code of an ENKI routine is highly compact, with a narrow framework-routine interface that allows the main program to recognise the number, types, and names of the routine's variables. The framework then exposes these variables to the user within the proper context, ensuring that distributed maps coincide spatially, that time series exist for input variables, that states are initialised, that GIS data sets exist for static map data, that parameters receive manually or automatically calibrated values, etc. By using function calls and in-memory data structures to invoke routines and facilitate information flow, ENKI provides good performance: for a typical distributed hydrological model setup over a spatial domain of 25 000 grid cells, 3-4 simulated time steps per second should be expected. Future adaptation to parallel processing may further increase this speed. Recent modifications to ENKI include a full separation of the API and the user interface, making it possible to run ENKI from GIS programs and other software environments. ENKI currently compiles under Windows and Visual Studio only, but there are ambitions to remove the platform and compiler dependencies.
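
    The sketch below mimics the framework-routine contract in Python rather than in C++/DLLs: a routine declares the names of its input, state, and output maps, and the framework allocates the grids and steps the routines through time. The snow routine and all variable names are hypothetical, and the real ENKI interface is richer than this:

```python
import numpy as np

# A routine declares its variables; the framework allocates the maps, wires
# routines together, and steps them through time. Names are hypothetical.
class SnowRoutine:
    inputs = ["precipitation", "temperature"]   # forcing maps
    states = ["swe"]                            # snow water equivalent
    outputs = ["melt"]

    def step(self, v):
        solid = np.where(v["temperature"] < 0.0, v["precipitation"], 0.0)
        melt = np.clip(3.0 * v["temperature"], 0.0, v["swe"] + solid)
        v["swe"] += solid - melt
        v["melt"] = melt

class Framework:
    def __init__(self, routines, ncells):
        self.routines, self.v = routines, {}
        for r in routines:                      # allocate every declared map
            for name in r.inputs + r.states + r.outputs:
                self.v.setdefault(name, np.zeros(ncells))

    def run(self, forcing_series):
        for forcing in forcing_series:          # one dict of maps per step
            self.v.update(forcing)
            for r in self.routines:
                r.step(self.v)

fw = Framework([SnowRoutine()], ncells=25000)
rng = np.random.default_rng(0)
fw.run([{"precipitation": rng.exponential(2.0, 25000),
         "temperature": rng.normal(0.0, 5.0, 25000)} for _ in range(10)])
print("mean SWE after 10 steps:", fw.v["swe"].mean().round(2))
```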

  14. Bayesian models for comparative analysis integrating phylogenetic uncertainty.

    PubMed

    de Villemereuil, Pierre; Wells, Jessie A; Edwards, Robert D; Blomberg, Simon P

    2012-06-28

    Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language.
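
    A minimal numerical sketch of the central idea, pooling a regression posterior over an empirical prior of trees, is given below in Python. The phylogenetic covariance matrices are random stand-ins (in practice they are derived from each posterior tree's branch lengths), the residual variance is fixed at one, and the paper's actual analyses are run in BUGS/JAGS rather than by this direct GLS shortcut:

```python
import numpy as np

rng = np.random.default_rng(1)
n, ntrees, ndraws = 20, 50, 40   # species, posterior trees, draws per tree

# Stand-ins for the phylogenetic covariance matrix implied by each tree.
def random_cov():
    A = rng.normal(size=(n, n))
    return A @ A.T / n + np.eye(n)

x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + np.linalg.cholesky(random_cov()) @ rng.normal(size=n)

pooled = []
for _ in range(ntrees):          # empirical prior: iterate over tree samples
    Vi = np.linalg.inv(random_cov())
    # GLS posterior for beta under this tree (flat prior, known sigma = 1).
    cov_b = np.linalg.inv(X.T @ Vi @ X)
    mean_b = cov_b @ X.T @ Vi @ y
    pooled.append(rng.multivariate_normal(mean_b, cov_b, size=ndraws))

pooled = np.vstack(pooled)       # mixture of posteriors over phylogenies
print("slope estimate:", pooled[:, 1].mean().round(3),
      "+/-", pooled[:, 1].std().round(3))
```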

  15. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    PubMed Central

    2012-01-01

Background: Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods: We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results: We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions: Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language. PMID:22741602

  16. Radiative flux from a planar multiple point source within a cylindrical enclosure reaching a coaxial circular plane

    NASA Astrophysics Data System (ADS)

    Tryka, Stanislaw

    2007-04-01

A general formula and some special integral formulas are presented for calculating radiative fluxes incident on a circular plane from a planar multiple point source within a coaxial cylindrical enclosure perpendicular to the source. These formulas were obtained for radiation propagating in a homogeneous isotropic medium, assuming that the lateral surface of the enclosure completely absorbs the incident radiation. Exemplary results were computed numerically and illustrated with three-dimensional surface plots. The formulas presented are suitable for determining fluxes of radiation reaching planar circular detectors, collectors or other planar circular elements from systems of laser diodes, light-emitting diodes and fiber lamps within cylindrical enclosures, as well as from small biological emitters (bacteria, fungi, yeast, etc.) distributed on the planar bases of open nontransparent cylindrical containers.
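
    The underlying geometry is easy to reproduce numerically. The Python sketch below integrates the inverse-square, cosine-weighted irradiance from a single isotropic point source over a coaxial disk, counting direct rays only (consistent with a fully absorbing lateral wall). It is a brute-force check, not the paper's closed-form formulas, and a multiple point source is simply a sum of such terms:

```python
import numpy as np

def flux_on_disk(I, h, a, R, nr=400, nphi=400):
    """Direct radiative flux on a coaxial disk of radius R from an isotropic
    point source of radiant intensity I at height h, offset a from the axis
    (homogeneous medium; absorbing lateral wall, so direct rays only)."""
    r = np.linspace(0, R, nr + 1)[1:] - R / (2 * nr)    # ring midpoints
    phi = np.linspace(0, 2 * np.pi, nphi, endpoint=False)
    rr, pp = np.meshgrid(r, phi)
    # in-plane distance from the source's foot point to each area element
    d2 = rr**2 + a**2 - 2 * rr * a * np.cos(pp)
    s2 = d2 + h**2                                      # squared source distance
    irradiance = I * h / s2**1.5                        # I * cos(theta) / s^2
    dA = rr * (R / nr) * (2 * np.pi / nphi)
    return np.sum(irradiance * dA)

# Sanity check: a very large disk intercepts half the sphere around a
# unit-intensity isotropic source, so the flux tends to 2*pi*I ~ 6.283.
print(flux_on_disk(I=1.0, h=1.0, a=0.0, R=200.0))
print(flux_on_disk(I=1.0, h=2.0, a=0.5, R=1.0))         # an enclosed case
```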

  17. OpenMx: An Open Source Extended Structural Equation Modeling Framework

    ERIC Educational Resources Information Center

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-01-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…

  18. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; ongoing education and training therefore play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have increasing significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative, and the growth and maturity of geospatial Open Source software, prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies-of-knowledge concepts, i.e., the NCGIA Core Curriculum, the URISA Body of Knowledge, the USGIF Essential Body of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry, and essentially shaped the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways: an online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide, and 15 interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework with three main categories and in total eleven sub-categories: "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", and "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and methods of examination. Examinations can be flanked by proof of professional career paths and achievements, which requires a peer qualification evaluation; after a couple of years a recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support from a group of geospatial scientific institutions, in order to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.

  19. A statistical analysis of North East Atlantic (submicron) aerosol size distributions

    NASA Astrophysics Data System (ADS)

    Dall'Osto, M.; Monahan, C.; Greaney, R.; Beddows, D. C. S.; Harrison, R. M.; Ceburnis, D.; O'Dowd, C. D.

    2011-12-01

The Global Atmosphere Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted, for the year 2008. Data coverage of 75% was achieved throughout the year. By applying the Hartigan-Wong k-means method, 12 clusters were identified as systematically occurring. These 12 clusters could be further combined into 4 categories of aerosol size distribution with similar characteristics: a coastal nucleation category (occurring 21.3% of the time), an open ocean nucleation category (32.6%), a background clean marine category (26.1%) and an anthropogenic category (20%). The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm, while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine aerosol exhibited a clear bimodality in the sub-micron size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration; peculiar background clean marine size distributions with coarser accumulation modes are also observed during winter months. By contrast, the continentally influenced size distributions are generally more monomodal (accumulation), albeit with traces of bimodality. The open ocean category occurs more often during May, June and July, corresponding with the North East (NE) Atlantic high biological period. Combined with its relatively high frequency of occurrence (32.6%), this suggests that the marine biota is an important source of new nano aerosol particles in NE Atlantic air.
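
    The clustering step is straightforward to reproduce. The Python sketch below applies k-means to synthetic number-size distributions, normalised so that clustering keys on spectral shape rather than total number. Note that scikit-learn implements Lloyd's algorithm rather than the Hartigan-Wong variant used in the study, and the three synthetic regimes are invented for the example:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
dp = np.logspace(np.log10(8), np.log10(500), 40)   # diameter bins, nm

def lognormal_mode(mu, sigma, amp):
    return amp * np.exp(-0.5 * ((np.log(dp) - np.log(mu)) / np.log(sigma))**2)

# Synthetic hourly spectra from three regimes: nucleation, marine, polluted.
spectra = np.vstack(
    [lognormal_mode(9, 1.5, 3000) + rng.normal(0, 50, 40) for _ in range(100)] +
    [lognormal_mode(40, 1.6, 400) + lognormal_mode(160, 1.5, 300)
     + rng.normal(0, 20, 40) for _ in range(100)] +
    [lognormal_mode(120, 1.7, 1500) + rng.normal(0, 30, 40) for _ in range(100)]
).clip(min=0)

# Normalise each spectrum so clustering keys on shape, not total number.
shapes = spectra / spectra.sum(axis=1, keepdims=True)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(shapes)
for c in range(3):
    mode = dp[km.cluster_centers_[c].argmax()]
    print(f"cluster {c}: {np.mean(km.labels_ == c):.0%} of hours, "
          f"modal diameter ~ {mode:.0f} nm")
```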

  20. The proton and helium anomalies in the light of the Myriad model

    NASA Astrophysics Data System (ADS)

    Salati, Pierre; Génolini, Yoann; Serpico, Pasquale; Taillet, Richard

    2017-03-01

A hardening of the proton and helium fluxes is observed above a few hundred GeV/nuc. The distribution of local sources of primary cosmic rays has been suggested as a potential solution to this puzzling behavior, and some authors even claim that a single source is responsible for the observed anomalies. But how probable are these explanations? To answer that question, our current description of cosmic-ray Galactic propagation needs to be replaced by the Myriad model. In the former approach, sources of protons and helium nuclei are treated as a jelly continuously spread over space and time. A more accurate description is provided by the Myriad model, in which sources are considered as point-like events. This leads to a probabilistic derivation of the fluxes of primary species, and opens the possibility that larger-than-average values may be observed at the Earth. For a long time, though, a major obstacle has been the infinite variance associated with the probability distribution function which the fluxes follow. Several suggestions have been made to cure this problem, but none is entirely satisfactory. We go a step further here and solve the infinite variance problem of the Myriad model by making use of the generalized central limit theorem. We find that primary fluxes are distributed according to a stable law with a heavy tail, well known to financial analysts. The probability that the proton and helium anomalies are sourced by local SNRs can then be calculated. The p-values associated with the CREAM measurements turn out to be small, unless somewhat unrealistic propagation parameters are assumed.
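
    The statistical core of the argument can be illustrated in a few lines of Python with SciPy. When single-source flux contributions have a power-law tail with index alpha < 2, their variance is infinite and the skewness of the summed flux does not vanish as sources are added; the generalized central limit theorem then gives an alpha-stable limit law, which can be sampled directly (the tail index and sample sizes below are illustrative, not the paper's propagation setup):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha = 1.5   # power-law tail index of single-source fluxes (< 2)

# Classical CLT fails: the skewness of the total flux does not die away as
# the number of contributing sources grows, because the variance is infinite.
for nsrc in (100, 5000):
    total = stats.pareto.rvs(alpha, size=(1000, nsrc),
                             random_state=rng).sum(axis=1)
    print(f"{nsrc:5d} sources -> sample skewness: {stats.skew(total):6.1f}")

# The generalized central limit theorem supplies the limit instead: a
# maximally skewed (beta = 1) alpha-stable law with a heavy upper tail.
limit = stats.levy_stable.rvs(alpha, 1.0, size=4000, random_state=rng)
print("sample skewness of the stable limit:",
      round(float(stats.skew(limit)), 1))
```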

  1. Building a Snow Data Management System using Open Source Software (and IDL)

    NASA Astrophysics Data System (ADS)

    Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.

    2012-12-01

At NASA's Jet Propulsion Laboratory, free and open source software is used every day to support a wide range of projects, from planetary science to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main points: - The design of the Snow Data System (illustrating how the collection of sub-systems is combined to create a complete data processing pipeline) - The challenges of moving from a single algorithm on a laptop to running hundreds of parallel algorithms on a cluster of servers (lessons learned): code changes, software-license-related challenges, and storage requirements - System evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps) - Roadmap for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory mission). Software in use and software licenses: IDL - used for pre- and post-processing of data; licensed under a proprietary software license held by Exelis. Apache OODT - used for data management and workflow processing; licensed under the Apache License Version 2. GDAL - geospatial data processing library, currently used for data re-projection; licensed under the X/MIT license. GeoServer - WMS server; licensed under the General Public License Version 2.0. Leaflet.js - JavaScript web mapping library; licensed under the Berkeley Software Distribution License. Python - glue code and miscellaneous data processing support; licensed under the Python Software Foundation License. Perl - script wrapper for running the SCAG algorithm; licensed under the General Public License Version 3. PHP - front-end web application programming; licensed under the PHP License Version 3.01.

  2. CellAnimation: an open source MATLAB framework for microscopy assays.

    PubMed

    Georgescu, Walter; Wikswo, John P; Quaranta, Vito

    2012-01-01

Advances in microscopy technology have led to the creation of high-throughput microscopes that are capable of generating several hundred gigabytes of images in a few days. Analyzing such a wealth of data manually is nearly impossible and requires an automated approach. There are at present a number of open-source and commercial software packages that allow the user to apply algorithms of different degrees of sophistication to the images and extract desired metrics. However, the types of metrics that can be extracted are severely limited by the specific image processing algorithms that the application implements, and by the expertise of the user. In most commercial software, code unavailability prevents implementation by the end user of newly developed algorithms better suited for a particular type of imaging assay. While it is possible to implement new algorithms in open-source software, rewiring an image processing application requires a high degree of expertise. To obviate these limitations, we have developed an open-source high-throughput application that allows implementation of different biological assays, such as cell tracking or ancestry recording, through the use of small, relatively simple image processing modules connected into sophisticated imaging pipelines. By connecting modules, non-expert users can apply the particular combination of well-established and novel algorithms, developed by us and others, that is best suited to each individual assay type. In addition, our data exploration and visualization modules make it easy to discover or select specific cell phenotypes from a heterogeneous population. CellAnimation is distributed under the Creative Commons Attribution-NonCommercial 3.0 Unported license (http://creativecommons.org/licenses/by-nc/3.0/). CellAnimation source code and documentation may be downloaded from www.vanderbilt.edu/viibre/software/documents/CellAnimation.zip. Sample data are available at www.vanderbilt.edu/viibre/software/documents/movies.zip. Contact: walter.georgescu@vanderbilt.edu. Supplementary data are available at Bioinformatics online.
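
    The module-pipeline idea can be sketched in a few lines. In the Python toy below, each module is a function that reads and writes a shared frame dictionary, and an assay is just an ordered list of modules; all module names are hypothetical, and this is not the CellAnimation MATLAB API:

```python
import numpy as np

# Each module takes and returns a frame dictionary; an assay is an ordered
# list of modules that can be rewired without touching the module code.
def gaussian_blur(frame, sigma=1.0):
    from scipy.ndimage import gaussian_filter
    frame["image"] = gaussian_filter(frame["image"], sigma)
    return frame

def threshold(frame, level=0.5):
    frame["mask"] = frame["image"] > level * frame["image"].max()
    return frame

def count_objects(frame):
    from scipy.ndimage import label
    frame["labels"], frame["n_cells"] = label(frame["mask"])
    return frame

pipeline = [gaussian_blur, threshold, count_objects]   # rewire per assay

rng = np.random.default_rng(0)
frame = {"image": rng.random((64, 64))}
for module in pipeline:
    frame = module(frame)
print("objects detected:", frame["n_cells"])
```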

  3. MIST: An Open Source Environmental Modelling Programming Language Incorporating Easy to Use Data Parallelism.

    NASA Astrophysics Data System (ADS)

    Bellerby, Tim

    2014-05-01

The Model Integration System (MIST) is an open-source environmental modelling programming language that directly incorporates data parallelism. The language is designed to enable straightforward programming structures, such as nested loops and conditional statements, to be directly translated into sequences of whole-array (or, more generally, whole-data-structure) operations. MIST thus enables the programmer to use well-understood constructs, directly relating to the mathematical structure of the model, without having to explicitly vectorize code or worry about the details of parallelization. A range of common modelling operations is supported by dedicated language structures operating on cell neighbourhoods rather than individual cells (e.g., the 3x3 local neighbourhood needed to implement an averaging image filter can be accessed simply from within a loop traversing all image pixels). This facility hides the details of inter-process communication behind more mathematically relevant descriptions of model dynamics. The MIST automatic vectorization/parallelization process serves both to distribute work among available nodes and, separately, to control storage requirements for intermediate expressions, enabling operations on very large domains for which memory availability may be an issue. MIST is designed to facilitate efficient interpreter-based implementations. A prototype open-source interpreter is available, coded in standard Fortran 95, with tools to rapidly integrate existing Fortran 77 or 95 code libraries. The language is formally specified and thus not limited to a Fortran implementation or to an interpreter-based approach. A MIST-to-Fortran compiler is under development, and volunteers are sought to create an ANSI C implementation. Parallel processing is currently implemented using OpenMP; however, the parallelization code is fully modularised and could be replaced with implementations using other libraries. A GPU implementation is potentially possible.
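
    The payoff of whole-array translation is easy to demonstrate. The Python/NumPy sketch below contrasts an explicit per-cell loop over 3x3 neighbourhoods, roughly what the modeller writes conceptually in MIST, with the single whole-array operation such a loop can be translated into (here SciPy's uniform_filter stands in for the generated code):

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(0)
image = rng.random((256, 256))

# Explicit per-cell loop over 3x3 neighbourhoods: the construct the modeller
# writes conceptually...
def mean_filter_loop(a):
    out = a.copy()
    for i in range(1, a.shape[0] - 1):
        for j in range(1, a.shape[1] - 1):
            out[i, j] = a[i-1:i+2, j-1:j+2].mean()
    return out

# ...and the single whole-array operation it can be translated into.
looped = mean_filter_loop(image)
whole_array = uniform_filter(image, size=3)

# Identical in the interior (the boundary is handled differently here).
assert np.allclose(looped[1:-1, 1:-1], whole_array[1:-1, 1:-1])
print("interior values agree; the whole-array form avoids the explicit loop")
```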

  4. National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents

    NASA Astrophysics Data System (ADS)

    Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.

    2014-12-01

The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from more than 60 data providers in all 50 states. Free and open source software is used in this federated system, in which data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing discovery of data and applications and providing access to the system's data feed. It also provides access to the NGDS specifications and the free and open source code base (on GitHub), a map-centric and library-style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and the USGIN site), and user-created tools and scripts. The user-friendly, map-centric web application supports finding, visualizing, mapping, and acquiring data by topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third-party software applications such as Google Earth, uDig, and ArcGIS. A redistributable, free and open source software package called GINstack (the USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, to upload data and documents, and to expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions, among others.

  5. The Case for Open Source: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…

  6. Neoproterozoic rift basins and their control on the development of hydrocarbon source rocks in the Tarim Basin, NW China

    NASA Astrophysics Data System (ADS)

    Zhu, Guang-You; Ren, Rong; Chen, Fei-Ran; Li, Ting-Ting; Chen, Yong-Quan

    2017-12-01

The Proterozoic is demonstrated to be an important period for global petroleum systems. Few exploration breakthroughs, however, have been made in this system in the Tarim Basin, NW China. Outcrop, drilling, and seismic data are integrated in this paper to focus on the Neoproterozoic rift basins and related hydrocarbon source rocks in the Tarim Basin. The basin consists of Cryogenian to Ediacaran rifts showing a distribution of N-S differentiation. Compared to the Cryogenian basins, those of the Ediacaran are characterized by deposits of small thickness and wide distribution. Thus, the rifts have a typical dual structure, namely Cryogenian rifting and Ediacaran depression phases that reveal distinct structural and sedimentary characteristics. The Cryogenian rifting basins are dominated by a series of grabens or half-grabens, which have a wedge-shaped, rapidly filled structure. The basins evolved into Ediacaran depressions as the rifting and magmatic activities diminished, and extensive overlapping sedimentation occurred. The distributions of the source rocks are controlled by the Neoproterozoic rifts as follows. The present outcrops lie mostly at the margins of the Cryogenian rifting basins, where rapid deposition dominates and the argillaceous rocks have low total organic carbon (TOC) contents; source rocks with high TOC contents, however, should develop in the centers of the basins. The Ediacaran source rocks formed in the deep-water environment of the stable depressions evolving from the previous rifting basins, and are thus more widespread in the Tarim Basin. The confirmation of the Cryogenian to Ediacaran source rocks would open up a new field for deep hydrocarbon exploration in the Tarim Basin.

  7. Visualization, documentation, analysis, and communication of large scale gene regulatory networks

    PubMed Central

    Longabaugh, William J.R.; Davidson, Eric H.; Bolouri, Hamid

    2009-01-01

Summary: Genetic regulatory networks (GRNs) are complex, large-scale, and spatially and temporally distributed. These characteristics impose challenging demands on computational GRN modeling tools, and there is a need for custom modeling tools. In this paper, we report on our ongoing development of BioTapestry, an open source, freely available computational tool designed specifically for GRN modeling. We also outline our future development plans, and give some examples of current applications of BioTapestry. PMID:18757046

  8. Refined Source Terms in WAVEWATCH III with Wave Breaking and Sea Spray Forecasts

    DTIC Science & Technology

    2015-09-30

young wind seas reported by Schwendeman et al. (2014) and for the open ocean cases reported by Sutherland and Melville (2015). These verifications ... modeled Λ(c) distributions shown in Figure 3 follow a very similar dependence to the Sutherland and Melville observations to about 1-2 m/s. The ... and 11) as well as Sutherland and Melville (2015), which show beff ~ O(10^-3). [Figure 4: modeled behavior of spectrally-integrated breaking.]

  9. The MAL: A Malware Analysis Lexicon

    DTIC Science & Technology

    2013-02-01

we feel that further exploration of the open source literature is a promising avenue for enlarging the corpus. 2.3 Publishing the MAL: Early in the ... MAL. We feel that the advantages of this format are well worth the small incremental cost. The distribution of the MAL in this format is under ... dictionary. We feel that moving to a richer format such as WordNet or WordVis would greatly improve the usability of the lexicon. 3.5 Improved Hosting: The

  10. JavaGenes Molecular Evolution

    NASA Technical Reports Server (NTRS)

    Lohn, Jason; Smith, David; Frank, Jeremy; Globus, Al; Crawford, James

    2007-01-01

JavaGenes is a general-purpose, evolutionary software system written in Java. It implements several versions of a genetic algorithm, simulated annealing, stochastic hill climbing, and other search techniques. This software has been used to evolve molecules, atomic force field parameters, digital circuits, Earth Observing Satellite schedules, and antennas. This version differs from version 0.7.28 in that it includes the molecule evolution code and other improvements. Except for the antenna code, JavaGenes is available for NASA Open Source distribution.

  11. AliEn—ALICE environment on the GRID

    NASA Astrophysics Data System (ADS)

    Saiz, P.; Aphecetche, L.; Bunčić, P.; Piskač, R.; Revsbech, J.-E.; Šego, V.; Alice Collaboration

    2003-04-01

AliEn ( http://alien.cern.ch) (ALICE Environment) is a Grid framework built on top of the latest Internet standards for information exchange and authentication (SOAP, PKI) and common Open Source components. AliEn provides a virtual file catalogue that allows transparent access to distributed datasets, together with a number of collaborating Web services which implement authentication, job execution, file transport, performance monitoring and event logging. In this paper we present the architecture and components of the system.

  12. Calculations of lattice vibrational mode lifetimes using Jazz: a Python wrapper for LAMMPS

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Wang, H.; Daw, M. S.

    2015-06-01

Jazz is a new Python wrapper for LAMMPS [1], implemented to calculate the lifetimes of vibrational normal modes based on forces as calculated with any interatomic potential available in that package. The anharmonic character of the normal modes is analyzed via the Monte Carlo-based moments approximation described in Gao and Daw [2]. It is distributed as open-source software and can be downloaded from http://jazz.sourceforge.net/.
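
    For context, the normal modes themselves come from diagonalizing the mass-weighted Hessian; the lifetimes Jazz computes then arise from anharmonic corrections on top of these modes. The Python sketch below does the harmonic part for a periodic chain, where the spectrum is known analytically; it is a conceptual illustration, not part of the Jazz API:

```python
import numpy as np

# Harmonic chain: frequencies from the mass-weighted Hessian (dynamical
# matrix). Mode lifetimes then come from the anharmonic terms, per Gao & Daw.
n, k, m = 8, 1.0, 1.0                      # atoms, spring constant, mass
H = np.zeros((n, n))
for i in range(n):                         # periodic nearest-neighbour springs
    H[i, i] = 2 * k
    H[i, (i + 1) % n] = H[i, (i - 1) % n] = -k

D = H / m                                  # mass-weighted Hessian
w2, modes = np.linalg.eigh(D)              # eigenvalues are omega^2
freqs = np.sqrt(np.clip(w2, 0, None))
print("normal mode frequencies:", np.round(freqs, 3))

# Analytic check: omega_q = 2*sqrt(k/m)*|sin(pi*q/n)| for q = 0..n-1.
q = np.arange(n)
analytic = np.sort(2 * np.sqrt(k / m) * np.abs(np.sin(np.pi * q / n)))
print("analytic dispersion:   ", np.round(analytic, 3))
```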

  13. Quality Attribute-Guided Evaluation of NoSQL Databases: A Case Study

    DTIC Science & Technology

    2015-01-16

evaluations of NoSQL databases specifically, and big data systems in general, that have become apparent during our study. Keywords: NoSQL, distributed ... technology, namely that of big data software systems [1]. At the heart of big data systems are a collection of database technologies that are more ... born organizations such as Google and Amazon [3][4], along with those of numerous other big data innovators, have created a variety of open source and

  14. Bayesian inference of spectral induced polarization parameters for laboratory complex resistivity measurements of rocks and soils

    NASA Astrophysics Data System (ADS)

    Bérubé, Charles L.; Chouteau, Michel; Shamsipour, Pejman; Enkin, Randolph J.; Olivo, Gema R.

    2017-08-01

    Spectral induced polarization (SIP) measurements are now widely used to infer mineralogical or hydrogeological properties from the low-frequency electrical properties of the subsurface in both mineral exploration and environmental sciences. We present an open-source program that performs fast multi-model inversion of laboratory complex resistivity measurements using Markov-chain Monte Carlo simulation. Using this stochastic method, SIP parameters and their uncertainties may be obtained from the Cole-Cole and Dias models, or from the Debye and Warburg decomposition approaches. The program is tested on synthetic and laboratory data to show that the posterior distribution of a multiple Cole-Cole model is multimodal in particular cases. The Warburg and Debye decomposition approaches yield unique solutions in all cases. It is shown that an adaptive Metropolis algorithm performs faster and is less dependent on the initial parameter values than the Metropolis-Hastings step method when inverting SIP data through the decomposition schemes. There are no advantages in using an adaptive step method for well-defined Cole-Cole inversion. Finally, the influence of measurement noise on the recovered relaxation time distribution is explored. We provide the geophysics community with an open-source platform that can serve as a base for further developments in stochastic SIP data inversion and that may be used to perform parameter analysis with various SIP models.
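
    For orientation, here is a minimal sketch of the kind of sampler involved, assuming a single Cole-Cole dispersion and a Gaussian likelihood with fixed noise (parameter names, starting values and step sizes are ours, not the program's):

      import numpy as np

      def cole_cole(w, rho0, m, tau, c):
          # complex resistivity of a single Cole-Cole dispersion
          return rho0 * (1 - m * (1 - 1 / (1 + (1j * w * tau) ** c)))

      def metropolis(data, w, n_iter=20000, step=0.01, noise=0.01):
          # random-walk Metropolis over (log rho0, m, log tau, c)
          theta = np.array([np.log(100.0), 0.5, np.log(0.01), 0.5])
          def loglike(t):
              pred = cole_cole(w, np.exp(t[0]), t[1], np.exp(t[2]), t[3])
              r = np.concatenate([(data - pred).real, (data - pred).imag])
              return -0.5 * np.sum((r / noise) ** 2)
          samples, ll = [], loglike(theta)
          for _ in range(n_iter):
              prop = theta + step * np.random.randn(4)
              if 0 < prop[1] < 1 and 0 < prop[3] < 1:   # keep m and c in (0, 1)
                  ll_p = loglike(prop)
                  if np.log(np.random.rand()) < ll_p - ll:
                      theta, ll = prop, ll_p
              samples.append(theta.copy())
          return np.array(samples)

      # synthetic test: recover parameters from noisy Cole-Cole data
      w = 2 * np.pi * np.logspace(-2, 3, 30)
      truth = cole_cole(w, 100.0, 0.4, 0.01, 0.6)
      noisy = truth + 0.01 * (np.random.randn(30) + 1j * np.random.randn(30))
      chain = metropolis(noisy, w)

    An adaptive Metropolis variant tunes the proposal covariance from the accumulating chain, which is what speeds up the decomposition-scheme inversions described above.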

  15. GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Kumbhare, Alok; Wickramaarachchi, Charith

    2014-08-25

    Large scale graph processing is a major research area for Big Data exploration. Vertex centric programming models like Pregel are gaining traction due to their simple abstraction that allows for scalable execution on distributed systems naturally. However, there are limitations to this approach which cause vertex centric algorithms to under-perform due to poor compute to communication overhead ratio and slow convergence of iterative supersteps. In this paper we introduce GoFFish, a scalable sub-graph centric framework co-designed with a distributed persistent graph storage for large scale graph analytics on commodity clusters. We introduce a sub-graph centric programming abstraction that combines the scalability of a vertex centric approach with the flexibility of shared memory sub-graph computation. We map Connected Components, SSSP and PageRank algorithms to this model to illustrate its flexibility. Further, we empirically analyze GoFFish using several real world graphs and demonstrate its significant performance improvement, orders of magnitude in some cases, compared to Apache Giraph, the leading open source vertex centric implementation.
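
    A toy sketch of the sub-graph centric idea (our illustration, not the GoFFish API; assumes networkx is available): each partition resolves its local connected components in one cheap shared-memory pass, and supersteps only reconcile labels across cut edges.

      import networkx as nx

      def subgraph_centric_cc(partitions, cut_edges):
          label = {}
          for part in partitions:                    # local phase: one pass per sub-graph
              for comp in nx.connected_components(part):
                  lead = min(comp)
                  for v in comp:
                      label[v] = lead
          changed = True
          while changed:                             # global phase: label-exchange supersteps
              changed = False
              for u, v in cut_edges:
                  low = min(label[u], label[v])
                  for x in (u, v):
                      if label[x] != low:
                          old = label[x]             # relabel x's whole component at once
                          for y, l in label.items():
                              if l == old:
                                  label[y] = low
                          changed = True
          return label

      g1, g2 = nx.path_graph([0, 1, 2]), nx.path_graph([3, 4])
      print(subgraph_centric_cc([g1, g2], [(2, 3)]))   # every vertex ends up labelled 0

    Because whole components flip in a single step, far fewer supersteps are needed than when every vertex exchanges messages individually.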

  16. 4 years of PM10 pollution in Poland - observations and modelling

    NASA Astrophysics Data System (ADS)

    Durka, Pawel; Struzewska, Joanna; Kaminski, Jacek W.

    2017-04-01

    Poor air quality is a health issue in Poland, especially during winter. In the central and northern parts of the country, the primary source is low-level domestic emissions. In larger cities and agglomerations, traffic emissions are also an issue. Quantification of the contribution of transboundary pollution sources is still an open issue. Analyses of 60 episodes with high PM10 concentrations for the period 2013-2016 were carried out under a contract from the Chief Inspectorate of Environmental Protection in Poland. Analyses of synoptic conditions and calculation of back trajectories were undertaken. The tropospheric chemistry model GEM-AQ was run at 10 km resolution to calculate contributions from surface, line and point sources. We will present trajectories for different types of episodes, maps with contributions for specific emission sources and transboundary pollution. Also, the mean distribution of PM10 concentrations during episodes will be shown.

  17. GNU Data Language (GDL) - a free and open-source implementation of IDL

    NASA Astrophysics Data System (ADS)

    Arabas, Sylwester; Schellens, Marc; Coulais, Alain; Gales, Joel; Messmer, Peter

    2010-05-01

    GNU Data Language (GDL) is developed with the aim of providing an open-source drop-in replacement for ITTVIS's Interactive Data Language (IDL). It is free software developed by an international team of volunteers led by Marc Schellens, the project's founder (a list of contributors is available on the project's website). The development is hosted on SourceForge, where GDL continuously ranks in the 99th percentile of most active projects. GDL with its library routines is designed as a tool for numerical data analysis and visualisation. Like its proprietary counterparts (IDL and PV-WAVE), GDL is used particularly in geosciences and astronomy. GDL is dynamically-typed, vectorized and has object-oriented programming capabilities. The library routines handle numerical calculations, data visualisation, signal/image processing, interaction with the host OS and data input/output. GDL supports several data formats such as netCDF, HDF4, HDF5, GRIB, PNG, TIFF, DICOM, etc. Graphical output is handled by X11, PostScript, SVG or z-buffer terminals, the last of which allows output to be saved in a variety of raster graphics formats. GDL is an incremental compiler with integrated debugging facilities. It is written in C++ using the ANTLR language-recognition framework. Most of the library routines are implemented as interfaces to open-source packages such as the GNU Scientific Library, PLplot, FFTW, ImageMagick, and others. GDL features a Python bridge (Python code can be called from GDL; GDL can be compiled as a Python module). Extensions to GDL can be written in C++, GDL, and Python. A number of open software libraries written in IDL, such as the NASA Astronomy Library, MPFIT, CMSVLIB and TeXtoIDL, are fully or partially functional under GDL. Packaged versions of GDL are available for several Linux distributions and Mac OS X. The source code compiles on some other UNIX systems, including BSD and OpenSolaris. The presentation will cover the current status of the project, the key accomplishments, and the weaknesses - areas where contributions and users' feedback are welcome! While still in the beta stage of development, GDL has proved to be a useful tool for classroom work on data analysis. Its usage for teaching meteorological-data processing at the University of Warsaw will serve as an example.

  18. EO/IR scene generation open source initiative for real-time hardware-in-the-loop and all-digital simulation

    NASA Astrophysics Data System (ADS)

    Morris, Joseph W.; Lowry, Mac; Boren, Brett; Towers, James B.; Trimble, Darian E.; Bunfield, Dennis H.

    2011-06-01

    The US Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Redstone Test Center (RTC) have formed the Scene Generation Development Center (SGDC) to support the Department of Defense (DoD) open source EO/IR Scene Generation initiative for real-time hardware-in-the-loop and all-digital simulation. Various branches of the DoD have invested significant resources in the development of advanced scene and target signature generation codes. The SGDC goal is to maintain unlimited government rights and controlled access to government open source scene generation and signature codes. In addition, the SGDC provides development support to a multi-service community of test and evaluation (T&E) users, developers, and integrators in a collaborative environment. The SGDC has leveraged the DoD Defense Information Systems Agency (DISA) ProjectForge (https://Project.Forge.mil), which provides a collaborative development and distribution environment for the DoD community. The SGDC will develop and maintain several codes for tactical and strategic simulation, such as the Joint Signature Image Generator (JSIG), the Multi-spectral Advanced Volumetric Real-time Imaging Compositor (MAVRIC), and Office of the Secretary of Defense (OSD) Test and Evaluation Science and Technology (T&E/S&T) thermal modeling and atmospherics packages, such as EOView, CHARM, and STAR. Other utility packages included are the ContinuumCore for real-time messaging and data management and IGStudio for run-time visualization and scenario generation.

  19. When Free Isn't Free: The Realities of Running Open Source in School

    ERIC Educational Resources Information Center

    Derringer, Pam

    2009-01-01

    Despite the last few years' growth in awareness of open-source software in schools and the potential savings it represents, its widespread adoption is still hampered. Randy Orwin, technology director of the Bainbridge Island School District in Washington State and a strong open-source advocate, cautions that installing an open-source…

  20. OpenID Connect as a security service in cloud-based medical imaging systems

    PubMed Central

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-01-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most adopted open standards and could become the de facto standard for securing cloud computing and mobile applications; it has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use an OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model. PMID:27340682
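
    As a hedged illustration of the protocol flow (the endpoints, client identifier and redirect URI below are hypothetical; real deployments discover their endpoints via the provider's /.well-known/openid-configuration document), an OpenID Connect authorization code exchange looks roughly like this:

      import secrets
      import urllib.parse
      import requests

      AUTH_ENDPOINT = "https://idp.example.org/authorize"
      TOKEN_ENDPOINT = "https://idp.example.org/token"
      CLIENT_ID = "pacs-viewer"
      REDIRECT_URI = "https://viewer.example.org/callback"

      # Step 1: send the user agent to the authorization endpoint.
      state = secrets.token_urlsafe(16)   # anti-CSRF value, checked on return
      params = {
          "response_type": "code",
          "client_id": CLIENT_ID,
          "redirect_uri": REDIRECT_URI,
          "scope": "openid profile",
          "state": state,
      }
      print(AUTH_ENDPOINT + "?" + urllib.parse.urlencode(params))

      # Step 2: after login the provider redirects back with ?code=...;
      # the client exchanges the code for an ID token and access token.
      def exchange_code(code, client_secret):
          resp = requests.post(TOKEN_ENDPOINT, data={
              "grant_type": "authorization_code",
              "code": code,
              "redirect_uri": REDIRECT_URI,
              "client_id": CLIENT_ID,
              "client_secret": client_secret,
          })
          return resp.json()   # contains id_token (a JWT), access_token, etc.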

  1. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
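
    To illustrate the kind of mapping such syntax adaptation must perform (a hand translation for illustration, not OMPC's generated code), compare a small MATLAB® fragment with its numpy rendering:

      import numpy as np

      # MATLAB source, for reference:
      #   A = zeros(3, 3);
      #   for i = 1:3
      #       A(i, i) = i^2;
      #   end
      #   s = sum(A(:));

      A = np.zeros((3, 3))
      for i in range(1, 4):            # MATLAB's inclusive 1-based loop
          A[i - 1, i - 1] = i ** 2     # shifted to 0-based indexing
      s = A.sum()                      # sum over all elements, like sum(A(:))
      print(s)                         # 14.0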

  2. Open Source Vision

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    Increasingly, colleges and universities are turning to open source as a way to meet their technology infrastructure and application needs. Open source has changed life for visionary CIOs and their campus communities nationwide. The author discusses what these technologists see as the benefits--and the considerations.

  3. SlicerRT: radiation therapy research toolkit for 3D Slicer.

    PubMed

    Pinter, Csaba; Lasso, Andras; Wang, An; Jaffray, David; Fichtinger, Gabor

    2012-10-01

    Interest in adaptive radiation therapy research is constantly growing, but software tools available for researchers are mostly either expensive, closed proprietary applications, or free open-source packages with limited scope, extensibility, reliability, or user support. To address these limitations, we propose SlicerRT, a customizable, free, and open-source radiation therapy research toolkit. SlicerRT aspires to be an open-source toolkit for RT research, providing fast computations, convenient workflows for researchers, and a general image-guided therapy infrastructure to assist clinical translation of experimental therapeutic approaches. It is a medium into which RT researchers can integrate their methods and algorithms, and conduct comparative testing. SlicerRT was implemented as an extension for the widely used 3D Slicer medical image visualization and analysis application platform. SlicerRT provides functionality specifically designed for radiation therapy research, in addition to the powerful tools that 3D Slicer offers for visualization, registration, segmentation, and data management. The feature set of SlicerRT was defined through consensus discussions with a large pool of RT researchers, including both radiation oncologists and medical physicists. The development processes used were similar to those of 3D Slicer to ensure software quality. Standardized mechanisms of 3D Slicer were applied for documentation, distribution, and user support. The testing and validation environment was configured to automatically launch a regression test upon each software change and to perform comparison with ground truth results provided by other RT applications. Modules have been created for importing and loading DICOM-RT data, computing and displaying dose volume histograms, creating accumulated dose volumes, comparing dose volumes, and visualizing isodose lines and surfaces. The effectiveness of using 3D Slicer with the proposed SlicerRT extension for radiation therapy research was demonstrated on multiple use cases. A new open-source software toolkit has been developed for radiation therapy research. SlicerRT can import treatment plans from various sources into 3D Slicer for visualization, analysis, comparison, and processing. The provided algorithms are extensively tested and they are accessible through a convenient graphical user interface as well as a flexible application programming interface.
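
    For orientation, the cumulative dose volume histogram that such modules display can be sketched in a few lines of numpy (a simplified illustration, not SlicerRT's implementation):

      import numpy as np

      def cumulative_dvh(dose, mask, bins=100):
          # fraction of the structure's voxels receiving at least each dose level
          d = dose[mask]
          levels = np.linspace(0.0, d.max(), bins)
          volume = np.array([(d >= lv).mean() * 100 for lv in levels])  # % volume
          return levels, volume

      # toy example: random dose grid and a spherical "structure" mask
      dose = np.random.gamma(2.0, 10.0, (32, 32, 32))
      zyx = np.indices(dose.shape) - 16
      mask = (zyx ** 2).sum(axis=0) < 8 ** 2
      levels, vol = cumulative_dvh(dose, mask)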

  4. The Open Source Snowpack modelling ecosystem

    NASA Astrophysics Data System (ADS)

    Bavay, Mathias; Fierz, Charles; Egger, Thomas; Lehning, Michael

    2016-04-01

    As a large number of numerical snow models are available, a few stand out as quite mature and widespread. One such model is SNOWPACK, the Open Source model that is developed at the WSL Institute for Snow and Avalanche Research SLF. Over the years, various tools have been developed around SNOWPACK in order to expand its use or to integrate additional features. Today, the model is part of a whole ecosystem that has evolved to both offer seamless integration and high modularity, so each tool can easily be used outside the ecosystem. Many of these Open Source tools experience their own, autonomous development and are successfully used in their own right in other models and applications. There is Alpine3D, the spatially distributed version of SNOWPACK, that forces it with terrain-corrected radiation fields and optionally with blowing and drifting snow. This model can be used on parallel systems (either with OpenMP or MPI) and has been used for applications ranging from climate change to reindeer herding. There is the MeteoIO pre-processing library that offers fully integrated data access, data filtering, data correction, data resampling and spatial interpolations. This library is now used by several other models and applications. There is the SnopViz snow profile visualization library and application that supports both measured and simulated snow profiles (relying on the CAAML standard) as well as time series. This JavaScript application can be used standalone without any internet connection or served on the web together with simulation results. There is the OSPER data platform effort with a data management service (built on the Global Sensor Network (GSN) platform) as well as a data documenting system (metadata management as a wiki). There are several distributed hydrological models for mountainous areas in ongoing development that require very little information about the soil structure, based on the assumption that in steep terrain the most relevant information is contained in the Digital Elevation Model (DEM). There is finally a set of tools making up the operational chain to automatically run, monitor and publish SNOWPACK simulations for operational avalanche warning purposes. This tool chain has been developed with the aim of offering very low maintenance operation and very fast deployment, and to easily adapt to other avalanche services.

  5. Hundred-watt-level high power random distributed feedback Raman fiber laser at 1150 nm and its application in mid-infrared laser generation.

    PubMed

    Zhang, Hanwei; Zhou, Pu; Wang, Xiong; Du, Xueyuan; Xiao, Hu; Xu, Xiaojun

    2015-06-29

    Two kinds of hundred-watt-level random distributed feedback Raman fiber lasers have been demonstrated. The optical efficiency can reach as high as 84.8%. To our knowledge, the reported power and efficiency of the random laser are the highest to date. We have also demonstrated that the developed random laser can be used to pump a Ho-doped fiber laser for mid-infrared laser generation; a 23 W laser output at 2050 nm is achieved. The presented laser can deliver high power output efficiently and conveniently and opens a new direction for high power laser sources at designed wavelengths.

  6. MAGIC: Model and Graphic Information Converter

    NASA Technical Reports Server (NTRS)

    Herbert, W. C.

    2009-01-01

    MAGIC is a software tool capable of converting highly detailed 3D models from an open, standard format, VRML 2.0/97, into the proprietary DTS file format used by the Torque Game Engine from GarageGames. MAGIC is used to convert 3D simulations from authoritative sources into the data needed to run the simulations in NASA's Distributed Observer Network. The Distributed Observer Network (DON) is a simulation presentation tool built by NASA to facilitate the simulation sharing requirements of the Data Presentation and Visualization effort within the Constellation Program. DON is built on top of the Torque Game Engine (TGE) and has chosen TGE's Dynamix Three Space (DTS) file format to represent 3D objects within simulations.

  7. Crystal plasticity simulation of Zirconium tube rolling using multi-grain representative volume element

    NASA Astrophysics Data System (ADS)

    Isaenkova, Margarita; Perlovich, Yuriy; Zhuk, Dmitry; Krymskaya, Olga

    2017-10-01

    The rolling of Zirconium tube is studied by means of crystal plasticity viscoplastic self-consistent (VPSC) constitutive modeling. The modeling is performed with a dislocation-based constitutive model and a spectral solver from the open-source simulation kit DAMASK. Multi-grain representative volume elements with periodic boundary conditions are used to predict the texture evolution and the distributions of strain and stress. Two models, for randomly textured and partially rolled material, are deformed to 30% reduction in tube wall thickness and 7% reduction in tube diameter. The resulting shapes of the models are shown and the distributions of strain are plotted. The evolution of grain shape during deformation is also shown.

  8. Ecosystem variability in the offshore northeastern Chukchi Sea

    NASA Astrophysics Data System (ADS)

    Blanchard, Arny L.; Day, Robert H.; Gall, Adrian E.; Aerts, Lisanne A. M.; Delarue, Julien; Dobbins, Elizabeth L.; Hopcroft, Russell R.; Questel, Jennifer M.; Weingartner, Thomas J.; Wisdom, Sheyna S.

    2017-12-01

    Understanding influences of cumulative effects from multiple stressors in marine ecosystems requires an understanding of the sources for and scales of variability. A multidisciplinary ecosystem study in the offshore northeastern Chukchi Sea during 2008-2013 investigated the variability of the study area's two adjacent sub-ecosystems: a pelagic system influenced by interannual and/or seasonal temporal variation at large, oceanographic (regional) scales, and a benthic-associated system more influenced by small-scale spatial variations. Variability in zooplankton communities reflected interannual oceanographic differences in waters advected northward from the Bering Sea, whereas variation in benthic communities was associated with seafloor and bottom-water characteristics. Variations in the planktivorous seabird community were correlated with prey distributions, whereas interaction effects in ANOVA for walruses were related to declines of sea-ice. Long-term shifts in seabird distributions were also related to changes in sea-ice distributions that led to more open water. Although characteristics of the lower trophic-level animals within sub-ecosystems result from oceanographic variations and interactions with seafloor topography, distributions of apex predators were related to sea-ice as a feeding platform (walruses) or to its absence (i.e., open water) for feeding (seabirds). The stability of prey resources appears to be a key factor in mediating predator interactions with other ocean characteristics. Seabirds reliant on highly-variable zooplankton prey show long-term changes as open water increases, whereas walruses taking benthic prey in biomass hotspots respond to sea-ice changes in the short-term. A better understanding of how variability scales up from prey to predators and how prey resource stability (including how critical prey respond to environmental changes over space and time) might be altered by climate and anthropogenic stressors is essential to predicting the future state of both the Chukchi and other arctic systems.

  9. Assessment of Groundwater Susceptibility to Non-Point Source Contaminants Using Three-Dimensional Transient Indexes.

    PubMed

    Zhang, Yong; Weissmann, Gary S; Fogg, Graham E; Lu, Bingqing; Sun, HongGuang; Zheng, Chunmiao

    2018-06-05

    Groundwater susceptibility to non-point source contamination is typically quantified by stable indexes, while groundwater quality evolution (or deterioration globally) can be a long-term process that may last for decades and exhibit strong temporal variations. This study proposes a three-dimensional (3-d), transient index map built upon physical models to characterize the complete temporal evolution of deep aquifer susceptibility. For illustration purposes, the backward travel time probability density (BTTPD) approach is extended to assess the 3-d deep groundwater susceptibility to non-point source contamination within a sequence stratigraphic framework observed in the Kings River fluvial fan (KRFF) aquifer. The BTTPD, which represents the complete age distribution underlying a single groundwater sample in a regional-scale aquifer, is used as a quantitative, transient measure of aquifer susceptibility. The resultant 3-d imaging of susceptibility using the simulated BTTPDs in KRFF reveals the strong influence of regional-scale heterogeneity on susceptibility. The regional-scale incised-valley fill deposits increase the susceptibility of aquifers by enhancing rapid downward solute movement and displaying relatively narrow and young age distributions. In contrast, the regional-scale sequence-boundary paleosols within the open-fan deposits "protect" deep aquifers by slowing downward solute movement and displaying relatively broad and old age distributions. Further comparison of the simulated susceptibility index maps to known contaminant distributions shows that these maps are generally consistent with the high concentration and quick evolution of 1,2-dibromo-3-chloropropane (DBCP) in groundwater around the incised-valley fill since the 1970s. This application demonstrates that BTTPDs can be used as quantitative and transient measures of deep aquifer susceptibility to non-point source contamination.
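
    One compact way to read such a transient index (our notation, not necessarily the paper's): if p(τ; x) is the BTTPD of groundwater age τ underlying a sample at location x, the susceptibility at time horizon t is the fraction of water younger than t,

      \[
        S(\mathbf{x}, t) \;=\; \int_{0}^{t} p(\tau; \mathbf{x}) \,\mathrm{d}\tau ,
      \]

    so narrow, young age distributions (incised-valley fills) drive S toward 1 within decades, while broad, old distributions (below sequence-boundary paleosols) keep it low over the same horizon.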

  10. Chaste: An Open Source C++ Library for Computational Physiology and Biology

    PubMed Central

    Mirams, Gary R.; Arthurs, Christopher J.; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G.; Harvey, Daniel G.; Marsh, Megan E.; Osborne, James M.; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J.

    2013-01-01

    Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352

  11. 76 FR 34634 - Federal Acquisition Regulation; Prioritizing Sources of Supplies and Services for Use by the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-14

    ... contracts before commercial sources in the open market. The proposed rule amends FAR 8.002 as follows: The... requirements for supplies and services from commercial sources in the open market. The proposed FAR 8.004 would... subpart 8.6). (b) Commercial sources (including educational and non-profit institutions) in the open...

  12. Bootstrap inversion technique for atmospheric trace gas source detection and quantification using long open-path laser measurements

    NASA Astrophysics Data System (ADS)

    Alden, Caroline B.; Ghosh, Subhomoy; Coburn, Sean; Sweeney, Colm; Karion, Anna; Wright, Robert; Coddington, Ian; Rieker, Gregory B.; Prasad, Kuldeep

    2018-03-01

    Advances in natural gas extraction technology have led to increased activity in the production and transport sectors in the United States and, as a consequence, an increased need for reliable monitoring of methane leaks to the atmosphere. We present a statistical methodology in combination with an observing system for the detection and attribution of fugitive emissions of methane from distributed potential source location landscapes such as natural gas production sites. We measure long (> 500 m), integrated open-path concentrations of atmospheric methane using a dual frequency comb spectrometer and combine measurements with an atmospheric transport model to infer leak locations and strengths using a novel statistical method, the non-zero minimum bootstrap (NZMB). The new statistical method allows us to determine whether the empirical distribution of possible source strengths for a given location excludes zero. Using this information, we identify leaking source locations (i.e., natural gas wells) through rejection of the null hypothesis that the source is not leaking. The method is tested with a series of synthetic data inversions with varying measurement density and varying levels of model-data mismatch. It is also tested with field observations of (1) a non-leaking source location and (2) a source location where a controlled emission of 3.1 × 10-5 kg s-1 of methane gas is released over a period of several hours. This series of synthetic data tests and outdoor field observations using a controlled methane release demonstrates the viability of the approach for the detection and sizing of very small leaks of methane across large distances (4+ km2 in synthetic tests). The field tests demonstrate the ability to attribute small atmospheric enhancements of 17 ppb to the emitting source location against a background of combined atmospheric (e.g., background methane variability) and measurement uncertainty of 5 ppb (1σ), when measurements are averaged over 2 min. The results of the synthetic and field data testing show that the new observing system and statistical approach greatly decreases the incidence of false alarms (that is, wrongly identifying a well site to be leaking) compared with the same tests that do not use the NZMB approach and therefore offers increased leak detection and sizing capabilities.
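
    A minimal sketch of the bootstrap test as described (our reading of the NZMB idea; the paper's exact resampling scheme may differ):

      import numpy as np

      def nzmb_detect(strengths, n_boot=2000, alpha=0.05):
          # Resample inferred source strengths and flag a leak only if the
          # bootstrap distribution of the minimum excludes zero.
          s = np.asarray(strengths)
          boot_min = np.array([np.random.choice(s, s.size, replace=True).min()
                               for _ in range(n_boot)])
          # reject "not leaking" if even the alpha-quantile of the minima is positive
          return np.quantile(boot_min, alpha) > 0.0

      # toy usage: noisy strength estimates around a true 3.1e-5 kg/s leak
      est = 3.1e-5 + 1.0e-5 * np.random.randn(200)
      print(nzmb_detect(est))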

  13. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.

  14. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.

    PubMed

    Desai, Trunil S; Srivastava, Shireesh

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
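
    The Monte-Carlo error estimate can be sketched generically (this is the standard scheme, not FluxPyt's exact code; the toy model below is hypothetical): perturb the measured labeling data with noise, refit, and take the spread of the refitted fluxes.

      import numpy as np
      from scipy.optimize import least_squares

      def monte_carlo_sd(labeling, sigma, model, x0, n_rep=200):
          # standard deviation of fitted fluxes over repeated noisy refits
          fits = []
          for _ in range(n_rep):
              perturbed = labeling + sigma * np.random.randn(labeling.size)
              fits.append(least_squares(lambda v: model(v) - perturbed, x0).x)
          return np.std(fits, axis=0)

      # toy model: two "fluxes" mixing into three measured label fractions
      model = lambda v: np.array([v[0], v[1], v[0] + v[1]])
      data = np.array([0.3, 0.5, 0.8])
      print(monte_carlo_sd(data, 0.01, model, x0=np.array([0.1, 0.1])))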

  15. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses

    PubMed Central

    Desai, Trunil S.

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package. PMID:29736347

  16. Harness That S.O.B.: Distributing Remote Sensing Analysis in a Small Office/Business

    NASA Astrophysics Data System (ADS)

    Kramer, J.; Combe, J.; McCord, T. B.

    2009-12-01

    Researchers in a small office/business (SOB) operate with limited funding, equipment, and software availability. To mitigate these issues, we developed a distributed computing framework that: 1) leverages open source software to implement functionality otherwise reliant on proprietary software and 2) harnesses the unused power of (semi-)idle office computers with mixed operating systems (OSes). This abstract outlines some reasons for the effort, its conceptual basis and implementation, and provides brief speedup results. The Multiple-Endmember Linear Spectral Unmixing Model (MELSUM)1 processes remote-sensing (hyper-)spectral images. The algorithm is computationally expensive, sometimes taking a full week or more for a 1 million pixel/100 wavelength image. Analysis of pixels is independent, so a large benefit can be gained from parallel processing techniques. Job concurrency is limited by the number of active processing units. MELSUM was originally written in the Interactive Data Language (IDL). Despite its multi-threading capabilities, an IDL instance executes on a single machine, and so concurrency is limited by the machine's number of central processing units (CPUs). Network distribution can access more CPUs to provide a greater speedup, while also taking advantage of (often) underutilized extant equipment. Appropriately integrating open source software magnifies the impact by avoiding the purchase of additional licenses. Our method of distribution breaks into four conceptual parts: 1) the top- or task-level user interface; 2) a mid-level program that manages hosts and jobs, called the distribution server; 3) a low-level executable for individual pixel calculations; and 4) a control program to synchronize sequential sub-tasks. Each part is a separate OS process, passing information via shell commands and/or temporary files. While the control and low-level executables are short-lived, the top-level program and distribution server run (at least) for the entirety of a task. While any language that supports "spawning" of OS processes can serve as the top-level interface, our solution, d-MELSUM, has been integrated with the IDL code. Doing so extracts the core calculating from IDL, but otherwise preserves IDL features and functionality. The distribution server is an extension of ADE2 mobile robot software, written in Java. Network connections rely on a secure shell (SSH) implementation, whether natively available (e.g., Linux or OS X) or user installed (e.g., OpenSSH available via Cygwin on Windows). Both the low-level and control programs are relatively small C++ programs (~54K, or 1500 lines, total) that were developed in-house and use GNU's g++ compiler. The low-level code also relies on Linear Algebra PACKage (LAPACK) libraries for pixel calculations. Despite performance being contingent to some degree on data size, CPU speed, and network communication rate and latency, results have generally demonstrated a time reduction by a factor proportional to the number of open connections (one per CPU). For example, the task mentioned above requiring a week to process took 18 hours with d-MELSUM, using 10 CPUs on 2 computers. 1 J.-Ph Combe, et al., PSS 56, 2008. 2 J. Kramer and M. Scheutz, IROS2006, 2006.
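
    The dispatch layer can be sketched as follows (a simplified stand-in: the real system uses a Java distribution server and a C++ per-pixel executable; the host names and worker command here are hypothetical):

      import subprocess
      from concurrent.futures import ThreadPoolExecutor

      HOSTS = ["node1", "node2"]    # one entry per remote CPU
      CMD = "unmix_pixel"           # stand-in for the low-level executable

      def run_chunk(args):
          host, chunk_file = args
          # ship one chunk of pixels to a host over SSH and run the worker there
          done = subprocess.run(["ssh", host, CMD, chunk_file],
                                capture_output=True, text=True)
          return done.stdout

      chunks = [(HOSTS[i % len(HOSTS)], "chunk_%d.dat" % i) for i in range(8)]
      with ThreadPoolExecutor(max_workers=len(HOSTS)) as pool:
          results = list(pool.map(run_chunk, chunks))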

  17. Using R to implement spatial analysis in open source environment

    NASA Astrophysics Data System (ADS)

    Shao, Yixi; Chen, Dong; Zhao, Bo

    2007-06-01

    R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics which provides a wide variety of statistical and graphical techniques and is highly extensible. It plays an important role in spatial analysis within the Open Source environment. Implementing spatial analysis in the Open Source environment, which we call Open Source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the Open Source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the Open Source GIS environment to evaluate the spatial correlation of land price and estimate it by Kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other Open Source software cooperate in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an Open Source GIS environment. In the end, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the extensive packages or design normative packages, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.

  18. [The use of open source software in graphic anatomic reconstructions and in biomechanic simulations].

    PubMed

    Ciobanu, O

    2009-01-01

    The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open source software was used to prepare digitized 2D images of tissue sections and to create 3D reconstructions from the segmented structures. Finally, the 3D images were used in open source software to perform biomechanical simulations. This study demonstrates the applicability and feasibility of currently available open source software for 3D reconstruction and biomechanical simulation. The use of open source software may improve the efficiency of investments in imaging technologies and in CAD/CAM technologies for implant and prosthesis fabrication, which otherwise require expensive specialized software.

  19. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699

  20. Rapid development of medical imaging tools with open-source libraries.

    PubMed

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  1. Open-Source RTOS Space Qualification: An RTEMS Case Study

    NASA Technical Reports Server (NTRS)

    Zemerick, Scott

    2017-01-01

    NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors, notably (1) the diverse nature of RTOSs utilized across NASA, (2) the lack of a single NASA space-qualification criterion, of verification and validation (V&V) analyses, and of test beds, and (3) differing RTOS heritages, specifically open-source RTOSs versus closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort of the open source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case study, can be utilized as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for NASA agency-wide usage. NASA's involvement in the space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.

  2. Large area multiarc ion beam source "MAIS"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engelko, V.; Giese, H.; Schalk, S.

    1996-12-31

    A pulsed large area intense ion beam source is described, in which the ion emitting plasma is built up by an array of individual discharge units, homogeneously distributed over the surface of a common discharge electrode. A particularly advantageous feature of the source is that only one common energy supply is necessary for plasma generation and subsequent acceleration of the ions. This allows the source design to be simplified and provides inherent synchronization of plasma production and ion extraction. The homogeneity of the plasma density was found to be superior to that of plasma sources using plasma expanders. Originally conceived for the production of proton beams, the source can easily be modified for the production of beams composed of carbon and metal ions or mixed ion species. Results of investigations of the source performance for the production of a proton beam are presented. The maximum beam current achieved to date is of the order of 100 A, with a particle kinetic energy of 15-30 keV and a pulse length in the range of 10 μs.

  3. Collaborative development of predictive toxicology applications

    PubMed Central

    2010-01-01

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436

  4. Collaborative development of predictive toxicology applications.

    PubMed

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
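
    A hedged sketch of what calling such a REST service looks like (the URIs below are hypothetical; actual OpenTox deployments publish their own service endpoints):

      import requests

      MODEL_URI = "https://opentox.example.org/model/42"
      COMPOUND_URI = "https://opentox.example.org/compound/benzene"

      # OpenTox-style REST: POST a compound URI to a model resource and
      # receive the URI of a dataset (or task) holding the predictions.
      resp = requests.post(MODEL_URI,
                           data={"compound_uri": COMPOUND_URI},
                           headers={"Accept": "text/uri-list"})
      print(resp.text.strip())   # URI of the prediction result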

  5. Learning from hackers: open-source clinical trials.

    PubMed

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-02

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  6. SimulatorToFMU v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nouidui, Thierry; Wetter, Michael

    SimulatorToFMU is a software package written in Python which allows users to export a memoryless Python-driven simulation program or script as a Functional Mock-up Unit (FMU) for model exchange or co-simulation. In CyDER (Cyber Physical Co-simulation Platform for Distributed Energy Resources in Smart Grids), SimulatorToFMU will allow exporting OPAL-RT as an FMU. This will enable OPAL-RT to be linked to CYMDIST and GridDyn FMUs through a standardized open source interface.
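
    A memoryless simulation script in this sense is simply a function whose outputs depend only on its current inputs; a hypothetical example of the kind of program one might export (the signature is illustrative, not SimulatorToFMU's required interface):

      def simulator(time, inputs):
          # outputs depend only on the current inputs, never on stored state
          voltage, current = inputs
          return [voltage * current]   # e.g., instantaneous power

      print(simulator(0.0, [230.0, 1.5]))   # [345.0]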

  7. OxMaR: open source free software for online minimization and randomization for clinical trials.

    PubMed

    O'Callaghan, Christopher A

    2014-01-01

    Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
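
    For readers unfamiliar with the technique, a Pocock-Simon style minimization step can be sketched as follows (a generic illustration, not OxMaR's exact scoring rules):

      import random
      from collections import defaultdict

      ARMS = ("control", "experimental")
      FACTORS = ("sex", "age_band", "site")
      counts = defaultdict(int)   # (arm, factor, level) -> patients so far

      def allocate(patient, p_best=0.8):
          # score each arm by the marginal imbalance it would create,
          # then pick the best arm with high (not certain) probability
          scores = {a: sum(counts[(a, f, patient[f])] for f in FACTORS)
                    for a in ARMS}
          ranked = sorted(ARMS, key=lambda a: scores[a])
          arm = ranked[0] if random.random() < p_best else random.choice(ranked[1:])
          for f in FACTORS:
              counts[(arm, f, patient[f])] += 1
          return arm

      print(allocate({"sex": "F", "age_band": "40-60", "site": "oxford"}))

    The random element keeps allocation unpredictable while still steering the arms toward balance on every listed factor.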

  8. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    PubMed

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR rank higher on these scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
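
    For orientation, the TOPSIS step can be sketched compactly (a generic implementation; the study's actual criteria and weights are not reproduced here):

      import numpy as np

      def topsis(matrix, weights, benefit):
          # rank alternatives (rows) by closeness to the ideal solution;
          # benefit[j] is True when higher is better on criterion j
          m = matrix / np.linalg.norm(matrix, axis=0)   # vector normalization
          m = m * weights                               # AHP weights applied here
          ideal = np.where(benefit, m.max(axis=0), m.min(axis=0))
          anti = np.where(benefit, m.min(axis=0), m.max(axis=0))
          d_pos = np.linalg.norm(m - ideal, axis=1)
          d_neg = np.linalg.norm(m - anti, axis=1)
          return d_neg / (d_pos + d_neg)                # higher = better

      # toy example: three EMR packages scored on cost (lower better) and features
      scores = np.array([[3.0, 8.0], [5.0, 9.0], [4.0, 6.0]])
      cc = topsis(scores, np.array([0.4, 0.6]), np.array([False, True]))
      print(cc.argsort()[::-1])   # ranking, best first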

  9. Simbol-X: a formation flight mission with an unprecedented imaging capability in the 0.5-80 keV energy band

    NASA Astrophysics Data System (ADS)

    Tagliaferri, Gianpiero; Ferrando, Philippe; Le Duigou, Jean-Michel; Pareschi, Giovanni; Laurent, Philippe; Malaguti, Giuseppe; Clédassou, Rodolphe; Piermaria, Mauro; La Marle, Olivier; Fiore, Fabrizio; Giommi, Paolo

    2017-11-01

    The discovery of X-ray emission from cosmic sources in the 1960s opened a powerful new observing window on the Universe. In fact, the exploration of the X-ray sky during the 70s-90s established X-ray astronomy as a fundamental field of astrophysics. Today, the emission from astrophysical sources is by far best known at energies below 10 keV. The main reason for this situation is purely technical, since grazing incidence reflection has so far been limited to the soft X-ray band. Above 10 keV, all observations have been obtained with collimated detectors or coded mask instruments. To make a leap forward in X-ray astronomy above 10 keV, it is necessary to extend the principle of focusing X-ray optics to higher energies, up to 80 keV and beyond. To this end, ASI and CNES are presently studying the implementation of an X-ray mission called Simbol-X. Taking advantage of emerging technology in mirror manufacturing and spacecraft formation flying, Simbol-X will push grazing incidence imaging up to 80 keV and beyond, providing a strong improvement in both sensitivity and angular resolution compared to all instruments that have operated so far above 10 keV. This technological breakthrough will open a new high-energy window in astrophysics and cosmology. Here we address the challenges of developing such a distributed and deformable instrument. We focus on the main performance characteristics of the telescope, such as angular resolution, sensitivity and source localization. We also describe the specific calibration aspects of a payload distributed over two satellites, and therefore in a configuration that is not "frozen".

  10. InterProScan 5: genome-scale protein function classification

    PubMed Central

    Jones, Philip; Binns, David; Chang, Hsin-Yu; Fraser, Matthew; Li, Weizhong; McAnulla, Craig; McWilliam, Hamish; Maslen, John; Mitchell, Alex; Nuka, Gift; Pesseat, Sebastien; Quinn, Antony F.; Sangrador-Vegas, Amaia; Scheremetjew, Maxim; Yong, Siew-Yit; Lopez, Rodrigo; Hunter, Sarah

    2014-01-01

    Motivation: Robust large-scale sequence analysis is a major challenge in modern genomic science, where biologists are frequently trying to characterize many millions of sequences. Here, we describe a new Java-based architecture for the widely used protein function prediction software package InterProScan. Developments include improvements and additions to the outputs of the software and the complete reimplementation of the software framework, resulting in a flexible and stable system that is able to use both multiprocessor machines and conventional clusters to achieve scalable distributed data analysis. InterProScan is freely available for download from the EMBL-EBI FTP site and the open source code is hosted at Google Code. Availability and implementation: InterProScan is distributed via FTP at ftp://ftp.ebi.ac.uk/pub/software/unix/iprscan/5/ and the source code is available from http://code.google.com/p/interproscan/. Contact: http://www.ebi.ac.uk/support or interhelp@ebi.ac.uk or mitchell@ebi.ac.uk PMID:24451626

  11. Influence of the spectral distribution of light on the characteristics of photovoltaic panel. Comparison between simulation and experimental

    NASA Astrophysics Data System (ADS)

    Chadel, Meriem; Bouzaki, Mohammed Moustafa; Chadel, Asma; Petit, Pierre; Sawicki, Jean-Paul; Aillerie, Michel; Benyoucef, Boumediene

    2017-02-01

    We present and analyze experimental results obtained with a laboratory setup based on hardware and smart instrumentation for the complete study of the performance of PV panels, using an artificial radiation source (halogen lamps) for illumination. Combined with an accurate analysis, this global experimental procedure allows the determination of effective performance under standard conditions, thanks to a simulation process originally developed in the Matlab software environment. The uniformity of the irradiated surface was checked by simulation of the light field. We studied the response of standard commercial photovoltaic panels under illumination measured by a spectrometer, with different spectra for two sources: halogen lamps and sunlight. We then pay special attention to the influence of the spectral distribution of light on the characteristics of the photovoltaic panel, which we examined as a function of temperature and for different illuminations, with dedicated measurements and studies of the open-circuit voltage and short-circuit current.

  12. Benchmarking for Bayesian Reinforcement Learning

    PubMed Central

    Ernst, Damien; Couëtoux, Adrien

    2016-01-01

    In the Bayesian Reinforcement Learning (BRL) setting, agents try to maximise the collected rewards while interacting with their environment, using prior knowledge that is available beforehand. Many BRL algorithms have already been proposed, but the benchmarks used to compare them are only relevant for specific cases. The paper addresses this problem and provides a new BRL comparison methodology along with the corresponding open source library. In this methodology, a comparison criterion is defined that measures the performance of algorithms on large sets of Markov Decision Processes (MDPs) drawn from given probability distributions. In order to enable the comparison of non-anytime algorithms, our methodology also includes a detailed analysis of the computation time requirement of each algorithm. Our library is released with all source code and documentation: it includes three test problems, each of which has two different prior distributions, and seven state-of-the-art RL algorithms. Finally, our library is illustrated by comparing all the available algorithms and the results are discussed. PMID:27304891

  13. Benchmarking for Bayesian Reinforcement Learning.

    PubMed

    Castronovo, Michael; Ernst, Damien; Couëtoux, Adrien; Fonteneau, Raphael

    2016-01-01

    In the Bayesian Reinforcement Learning (BRL) setting, agents try to maximise the collected rewards while interacting with their environment, using prior knowledge that is available beforehand. Many BRL algorithms have already been proposed, but the benchmarks used to compare them are only relevant for specific cases. The paper addresses this problem and provides a new BRL comparison methodology along with the corresponding open source library. In this methodology, a comparison criterion is defined that measures the performance of algorithms on large sets of Markov Decision Processes (MDPs) drawn from given probability distributions. In order to enable the comparison of non-anytime algorithms, our methodology also includes a detailed analysis of the computation time requirement of each algorithm. Our library is released with all source code and documentation: it includes three test problems, each of which has two different prior distributions, and seven state-of-the-art RL algorithms. Finally, our library is illustrated by comparing all the available algorithms and the results are discussed.
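
    The essence of the methodology is to score each algorithm on many MDPs sampled from a known prior. A toy sketch of that evaluation loop (the Dirichlet transition prior, Gaussian rewards, and trivial policy are illustrative assumptions, not the library's actual test problems):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mdp(n_states=5, n_actions=2):
    """Draw one MDP from a toy prior: Dirichlet transition rows,
    Gaussian rewards per (state, action) pair."""
    P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))
    R = rng.normal(size=(n_states, n_actions))
    return P, R

def score(policy, n_mdps=100, horizon=50):
    """Average undiscounted return of a policy over MDPs from the prior."""
    total = 0.0
    for _ in range(n_mdps):
        P, R = sample_mdp()
        s, ret = 0, 0.0
        for _ in range(horizon):
            a = policy(s)
            ret += R[s, a]
            s = rng.choice(len(P), p=P[s, a])   # stochastic transition
        total += ret
    return total / n_mdps

print(score(lambda s: 0))   # benchmark a trivial "always action 0" agent
```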

  14. Modelling the phase curve and occultation of WASP-43b with SPIDERMAN

    NASA Astrophysics Data System (ADS)

    Louden, Tom

    2017-06-01

    We present SPIDERMAN, a fast code for calculating exoplanet phase curves and secondary eclipses with arbitrary two-dimensional surface brightness distributions. SPIDERMAN uses an exact geometric algorithm to calculate the area of sub-regions of the planet that are occulted by the star, with no loss in numerical precision. The speed of this calculation makes it possible to run MCMCs that marginalise effectively over the underlying parameters controlling the brightness distribution of exoplanets. The code is fully open source and available on GitHub. We apply the code to the phase curve of WASP-43b using an analytical surface brightness distribution, and find an excellent fit to the data. We are able to place direct constraints on the physics of heat transport in the atmosphere, such as the ratio between advective and radiative timescales at different altitudes.
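
    Conceptually, a phase curve is the disc-integrated flux of the visible hemisphere as the planet orbits. The brute-force sketch below integrates a toy, longitude-only brightness map with an offset hot spot; SPIDERMAN itself replaces this kind of pixel sum with its exact geometric algorithm:

```python
import numpy as np

def phase_curve(offset_deg=30.0, n_phase=200, n_grid=721):
    """Toy thermal phase curve: integrate a longitude-only brightness
    map over the observer-facing hemisphere at each orbital phase."""
    lon = np.linspace(-np.pi, np.pi, n_grid)             # planet longitudes
    bright = 1.0 + np.cos(lon - np.radians(offset_deg))  # shifted hot spot
    phases = np.linspace(0.0, 2.0 * np.pi, n_phase)
    flux = np.empty(n_phase)
    for i, phi in enumerate(phases):
        sub_obs = phi - np.pi              # longitude facing the observer
        mu = np.cos(lon - sub_obs)         # geometric foreshortening factor
        vis = mu > 0.0                     # visible hemisphere only
        flux[i] = np.sum(bright[vis] * mu[vis])
    return phases, flux / flux.max()

ph, f = phase_curve()
print(ph[np.argmax(f)])   # peak shifted from phase pi tracks the hot spot
```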

  15. Derivation of hydrous pyrolysis kinetic parameters from open-system pyrolysis

    NASA Astrophysics Data System (ADS)

    Tseng, Yu-Hsin; Huang, Wuu-Liang

    2010-05-01

    Kinetic information is essential to predict the temperature, timing or depth of hydrocarbon generation within a petroleum system. The most common experiments for deriving kinetic parameters are based on open-system pyrolysis. However, it has been shown that the conditions of open-system pyrolysis deviate from nature in their low, near-ambient pressures and high temperatures. The extrapolation of heating rates in open-system pyrolysis to geological conditions may also be questionable. A recent study by Lewan and Ruble showed that hydrous-pyrolysis conditions simulate natural conditions better, and its applications are supported by two case studies with natural thermal-burial histories. Nevertheless, performing hydrous pyrolysis experiments is tedious and requires large amounts of sample, while open-system pyrolysis is convenient and efficient. Therefore, the present study aims at deriving convincing distributed hydrous-pyrolysis activation energies (Ea) from routine open-system Rock-Eval data alone. Our results show that there is a good correlation between the open-system Rock-Eval parameter Tmax and the activation energy derived from hydrous pyrolysis. The single hydrous-pyrolysis Ea can be predicted from Tmax based on this correlation, while the frequency factor (A0) is estimated from the linear relationship between single Ea and log A0. Because a distributed Ea is more realistic than a single Ea, we convert the predicted single hydrous-pyrolysis Ea into a distributed Ea by shifting the pattern of the open-system Ea distribution until the weighted mean of the distribution equals the single hydrous-pyrolysis Ea. Moreover, the shape of the Ea distribution is very similar to the shape of the Tmax curve. Thus, in the absence of an open-system Ea distribution, the shape of the Tmax curve may be used to obtain the distributed hydrous-pyrolysis Ea. The study offers a simple new method for obtaining distributed hydrous-pyrolysis Ea from routine open-system Rock-Eval data alone, allowing better estimates of hydrocarbon generation.
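
    Given a distributed Ea and a frequency factor, hydrocarbon generation at a geological heating rate follows from a set of parallel first-order Arrhenius reactions. A sketch of that forward calculation (the Ea grid, weights and log A0 are illustrative values, not the fitted results of this study):

```python
import numpy as np

R_GAS = 8.314e-3   # gas constant, kJ/(mol K)

def conversion(ea_kj, weights, log_a0, heating_rate_c_per_my=1.0):
    """Kerogen fractional conversion vs temperature for parallel
    first-order reactions with a distributed Ea (illustrative values)."""
    beta = heating_rate_c_per_my / 3.15e13          # K/s (1 m.y. ~ 3.15e13 s)
    T = np.arange(323.0, 573.0, 1.0)                # 50-300 degC, in kelvin
    x = np.zeros_like(T)
    for ea, w in zip(ea_kj, weights):
        k = 10.0**log_a0 * np.exp(-ea / (R_GAS * T))   # Arrhenius rate, 1/s
        g = np.cumsum(k) * 1.0 / beta                  # running integral k dT / beta
        x += w * (1.0 - np.exp(-g))                    # first-order conversion
    return T - 273.15, x

T_c, x = conversion(ea_kj=[205.0, 213.0, 221.0],
                    weights=[0.25, 0.50, 0.25], log_a0=13.0)
print(T_c[np.searchsorted(x, 0.5)])   # temperature of 50% conversion, degC
```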

  16. A virtual photon energy fluence model for Monte Carlo dose calculation.

    PubMed

    Fippel, Matthias; Haryanto, Freddy; Dohm, Oliver; Nüsslin, Fridtjof; Kriesen, Stephan

    2003-03-01

    The presented virtual energy fluence (VEF) model of the patient-independent part of medical linear accelerator heads consists of two Gaussian-shaped photon sources and one uniform electron source. The planar photon sources are located close to the bremsstrahlung target (primary source) and to the flattening filter (secondary source), respectively. The electron contamination source is located in the plane defining the lower end of the filter. The standard deviations or widths and the relative weights of each source are free parameters. Five other parameters correct for fluence variations, i.e., the horn or central depression effect. If these parameters and the field widths in the X and Y directions are given, the corresponding energy fluence distribution can be calculated analytically and compared to measured dose distributions in air. This provides a method of fitting the free parameters using measurements for various square and rectangular fields and a fixed number of monitor units. The next step in generating the whole set of base data is to calculate monoenergetic central axis depth dose distributions in water, which are used to derive the energy spectrum by deconvolving the measured depth dose curves. This spectrum is also corrected to take off-axis softening into account. The VEF model is implemented, together with geometry modules for the patient-specific part of the treatment head (jaws, multileaf collimator), into the XVMC dose calculation engine. Implementation into other Monte Carlo codes is possible based on the information in this paper. Experiments were performed to verify the model by comparing measured and calculated dose distributions and output factors in water. It is demonstrated that open photon beams of linear accelerators from two different vendors are accurately simulated using the VEF model. The commissioning procedure of the VEF model is clinically feasible because it is based on standard measurements in air and water. It is also useful for IMRT applications, because a full Monte Carlo simulation of the treatment head would be too time-consuming for many small fields.
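
    A useful property of a two-Gaussian source model is that each collimator edge projects as an error-function penumbra whose width scales with the effective source size, so an in-air fluence profile can be written in closed form. A sketch for a square field (the weights and source widths are illustrative, not the published fit):

```python
import numpy as np
from scipy.special import erf

def edge(x, edge_pos, sigma):
    """Penumbra of one collimator edge from a Gaussian source: an erf step."""
    return 0.5 * (1.0 + erf((edge_pos - x) / (np.sqrt(2.0) * sigma)))

def fluence_profile(x, half_field=50.0, w_primary=0.9,
                    sigma_primary=1.0, sigma_secondary=12.0):
    """In-air energy fluence across a square field from two Gaussian sources.

    The narrow primary (target) source gives sharp field edges; the broad
    secondary (filter) source softens the shoulders. Sigmas in mm at
    isocentre; all numbers are illustrative assumptions.
    """
    def one_source(sigma):
        return edge(x, half_field, sigma) * edge(-x, half_field, sigma)
    return (w_primary * one_source(sigma_primary)
            + (1.0 - w_primary) * one_source(sigma_secondary))

x = np.linspace(-80.0, 80.0, 321)
print(fluence_profile(x)[160])   # central-axis value, close to 1.0
```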

  17. SU-C-201-03: Coded Aperture Gamma-Ray Imaging Using Pixelated Semiconductor Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, S; Kaye, W; Jaworski, J

    2015-06-15

    Purpose: Improved localization of gamma-ray emissions from radiotracers is essential to the progress of nuclear medicine. Polaris is a portable, room-temperature operated gamma-ray imaging spectrometer composed of two 3×3 arrays of thick CdZnTe (CZT) detectors, which detect gammas between 30keV and 3MeV with energy resolution of <1% FWHM at 662keV. Compton imaging is used to map out source distributions in 4-pi space; however, it is only effective above 300keV, where Compton scatter is dominant. This work extends imaging to photoelectric energies (<300keV) using coded aperture imaging (CAI), which is essential for localization of Tc-99m (140keV). Methods: CAI, similar to the pinhole camera, relies on an attenuating mask, with open/closed elements, placed between the source and position-sensitive detectors. Partial attenuation of the source results in a “shadow” or count distribution that closely matches a portion of the mask pattern. Ideally, each source direction corresponds to a unique count distribution. Using backprojection reconstruction, the source direction is determined within the field of view. Knowledge of the 3D position of interaction results in improved image quality. Results: Using a single array of detectors, a coded aperture mask, and multiple Co-57 (122keV) point sources, image reconstruction is performed in real time, on an event-by-event basis, resulting in images with an angular resolution of ∼6 degrees. Although material nonuniformities contribute to image degradation, the superposition of images from individual detectors results in improved SNR. CAI was integrated with Compton imaging for a seamless transition between energy regimes. Conclusion: For the first time, CAI has been applied to thick, 3D position-sensitive CZT detectors. Real-time, combined CAI and Compton imaging is performed using two 3×3 detector arrays, resulting in a source distribution in space. This system has been commercialized by H3D, Inc. and is being acquired for various applications worldwide, including proton therapy imaging R&D.
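
    The backprojection step can be illustrated in one dimension: correlate the measured count distribution with the mask pattern shifted for each candidate direction, and the true direction produces the strongest match. A minimal sketch (random mask, Poisson counts; not the Polaris reconstruction code):

```python
import numpy as np

def backproject(counts, mask, directions):
    """Minimal 1-D coded-aperture backprojection sketch.

    counts: measured counts per detector pixel
    mask:   open(1)/closed(0) aperture pattern
    directions: candidate source directions, each given as the shift of the
                mask shadow across the detector; the true shift peaks.
    """
    image = []
    for shift in directions:
        shadow = np.roll(mask, shift)[: len(counts)]   # expected pattern
        image.append(np.dot(counts, 2 * shadow - 1))   # balanced correlation
    return np.array(image)

rng = np.random.default_rng(1)
mask = rng.integers(0, 2, 64)                           # random open/closed elements
true_shift = 7
counts = rng.poisson(100 * np.roll(mask, true_shift))   # noisy mask shadow
print(np.argmax(backproject(counts, mask, range(32))))  # recovers 7
```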

  18. An Intense Slit Discharge Source of Jet-Cooled Molecular Ions and Radicals (T(sub rot) less than 30 K)

    NASA Technical Reports Server (NTRS)

    Anderson, David T.; Davis, Scott; Zwier, Timothy S.; Nesbitt, David J.

    1996-01-01

    A novel pulsed, slit supersonic discharge source is described for generating intense jet-cooled densities of radicals (greater than 10(exp 12)/cu cm) and molecular ions (greater than 10(exp 10)/cu cm) under long absorption path (80 cm), supersonically cooled conditions. The design confines the discharge region upstream of the supersonic expansion orifice to achieve efficient rotational cooling down to 30 K or less. The collisionally collimated velocity distribution in the slit discharge geometry yields sub-Doppler spectral linewidths, which for open-shell radicals reveals spin-rotation splittings and broadening due to nuclear hyperfine structure. Application of the slit source for high-resolution, direct IR laser absorption spectroscopy in discharges is demonstrated on species such as OH, H3O(+) and N2H(+).

  19. Random Variable Read Me File

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Sankararaman, Shankar; Cullo, Aiden

    2017-01-01

    Readme for the Random Variable Toolbox. GitHub is a web-based Git version-control repository hosting service, mostly used for computer code. It offers all of the distributed version control and source code management (SCM) functionality of Git while adding its own features, including access control and collaboration tools such as bug tracking, feature requests, task management, and wikis for every project. GitHub offers both private and free repository plans on the same account, the latter commonly used to host open-source software projects. As of April 2017, GitHub reported almost 20 million users and 57 million repositories, making it the largest host of source code in the world. GitHub has a mascot called Octocat, a cat with five tentacles and a human-like face.

  20. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  1. The validity of open-source data when assessing jail suicides.

    PubMed

    Thomas, Amanda L; Scott, Jacqueline; Mellow, Jeff

    2018-05-09

    The Bureau of Justice Statistics' Deaths in Custody Reporting Program (DCRP) is the primary source for jail suicide research, though the data is restricted from general dissemination. This study is the first to examine whether jail suicide data obtained from publicly available sources can help inform our understanding of this serious public health problem. Of the 304 suicides that were reported through the DCRP in 2009, roughly 56 percent (N = 170) were identified through the open-source search protocol. Each of the sources was assessed based on how much information was collected on the incident and the types of variables available. A descriptive analysis was then conducted on the variables that were present in both data sources. The four variables present in each data source were: (1) demographic characteristics of the victim, (2) the location of occurrence within the facility, (3) the location of occurrence by state, and (4) the size of the facility. Findings demonstrate that the prevalence and correlates of jail suicides are extremely similar in both open-source and official data. Moreover, for almost every variable measured, open-source data captured as much information as official data did, if not more. Further, variables not found in official data were identified in the open-source database, thus allowing researchers to have a more nuanced understanding of the situational characteristics of the event. This research provides support for the argument in favor of including open-source data in jail suicide research, as it illustrates how open-source data can be used to provide additional information not originally found in official data. In sum, this research is vital in terms of possible suicide prevention, which may be directly linked to being able to manipulate environmental factors.

  2. Harvesting implementation for the GI-cat distributed catalog

    NASA Astrophysics Data System (ADS)

    Boldrini, Enrico; Papeschi, Fabrizio; Bigagli, Lorenzo; Mazzetti, Paolo

    2010-05-01

    The GI-cat framework implements a distributed catalog service supporting different international standards and interoperability arrangements in use by the geoscientific community. The distribution functionality, in conjunction with the mediation functionality, allows seamless querying of remote heterogeneous data sources, including OGC Web Services (e.g. OGC CSW, WCS, WFS and WMS), community standards such as UNIDATA THREDDS/OPeNDAP, SeaDataNet CDI (Common Data Index), GBIF (Global Biodiversity Information Facility) services, and OpenSearch engines. In the GI-cat modular architecture, a distributor component carries out the distribution functionality by delegating queries to the mediator components (one for each data source). Each mediator component queries a specific data source and converts back the results by mapping the foreign data model to the GI-cat internal one, based on ISO 19139. In order to cope with deployment scenarios in which local data is expected, a harvesting approach has been tested. The new strategy comes in addition to the consolidated distributed approach, allowing the user to switch between a remote and a local search at will for each federated resource; this extends the GI-cat configuration possibilities. The harvesting strategy is built in GI-cat around a local cache component, implemented as a native XML database based on eXist. The heterogeneous sources are queried for the bulk of their available data; this data is then injected into the cache component after being converted to the GI-cat data model. The query and conversion steps are performed by the mediator components that are part of the GI-cat framework. Afterwards, each new query can be executed against the local data stored in the cache component. Considering the advantages and shortcomings of the harvesting and query distribution approaches, it emerges that user-driven tuning is required to get the best of both; this is often related to the specific user scenarios to be implemented. GI-cat proved to be a flexible framework for addressing user needs. The GI-cat configurator tool was updated to make such tuning possible: each data source can be configured to enable either the harvesting or the query distribution approach; in the former case, an appropriate harvesting interval can be set.
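
    The two strategies can be summarized schematically: per-source mediators either answer queries live (distribution) or are bulk-ingested into a local cache (harvesting). The Python sketch below uses hypothetical class and method names purely to illustrate the control flow; GI-cat's actual Java implementation differs:

```python
class FakeSource:
    """Stands in for one remote service (e.g. a CSW or OpenSearch endpoint)."""
    def __init__(self, records): self.records = records
    def search(self, q): return [r for r in self.records if q in r["title"]]
    def dump_all(self): return list(self.records)

class Mediator:
    """Queries one source and maps its records to the internal model."""
    def __init__(self, source):
        self.source = source
    def to_internal(self, record):               # foreign model -> internal model
        return {"id": record["id"], "title": record.get("title", "")}
    def query(self, q):                          # live, distributed search
        return [self.to_internal(r) for r in self.source.search(q)]
    def harvest(self):                           # bulk ingest for the cache
        return [self.to_internal(r) for r in self.source.dump_all()]

class Catalog:
    def __init__(self, mediators, harvested=()):
        self.mediators = list(mediators)
        self.cache = []                          # stands in for the XML database
        for m in harvested:
            self.cache.extend(m.harvest())       # ingest at configuration time
            self.mediators.remove(m)
    def query(self, q):
        hits = [r for r in self.cache if q in r["title"]]   # local search first
        for m in self.mediators:                 # then the remaining live sources
            hits.extend(m.query(q))
        return hits

a = Mediator(FakeSource([{"id": 1, "title": "ocean temperature"}]))
b = Mediator(FakeSource([{"id": 2, "title": "ocean salinity"}]))
catalog = Catalog([a, b], harvested=[b])   # b cached locally, a stays remote
print(catalog.query("ocean"))
```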

  3. Evaluation of kinetic uncertainty in numerical models of petroleum generation

    USGS Publications Warehouse

    Peters, K.E.; Walters, C.C.; Mankiewicz, P.J.

    2006-01-01

    Oil-prone marine petroleum source rocks contain type I or type II kerogen having Rock-Eval pyrolysis hydrogen indices greater than 600 or 300-600 mg hydrocarbon/g total organic carbon (HI, mg HC/g TOC), respectively. Samples from 29 marine source rocks worldwide that contain mainly type II kerogen (HI = 230-786 mg HC/g TOC) were subjected to open-system programmed pyrolysis to determine the activation energy distributions for petroleum generation. Assuming a burial heating rate of 1°C/m.y. for each measured activation energy distribution, the calculated average temperature for 50% fractional conversion of the kerogen in the samples to petroleum is approximately 136 ± 7°C, but the range spans about 30°C (∼121-151°C). Fifty-two outcrop samples of thermally immature Jurassic Oxford Clay Formation were collected from five locations in the United Kingdom to determine the variations of kinetic response for one source rock unit. The samples contain mainly type I or type II kerogens (HI = 230-774 mg HC/g TOC). At a heating rate of 1°C/m.y., the calculated temperatures for 50% fractional conversion of the Oxford Clay kerogens to petroleum differ by as much as 23°C (127-150°C). The data indicate that kerogen type, as defined by hydrogen index, is not systematically linked to kinetic response, and that default kinetics for the thermal decomposition of type I or type II kerogen can introduce unacceptable errors into numerical simulations. Furthermore, custom kinetics based on one or a few samples may be inadequate to account for variations in organofacies within a source rock. We propose three methods to evaluate the uncertainty contributed by kerogen kinetics to numerical simulations: (1) use the average kinetic distribution for multiple samples of source rock and the standard deviation for each activation energy in that distribution; (2) use source rock kinetics determined at several locations to describe different parts of the study area; and (3) use a weighted-average method that combines kinetics for samples from different locations in the source rock unit by giving the activation energy distribution for each sample a weight proportional to its Rock-Eval pyrolysis S2 yield (hydrocarbons generated by pyrolytic degradation of organic matter). Copyright © 2006. The American Association of Petroleum Geologists. All rights reserved.
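
    Method (3) is straightforward to express in code. A sketch of the S2-weighted combination of per-sample activation energy distributions (the energy grid, distributions and yields are invented for illustration):

```python
import numpy as np

def weighted_kinetics(distributions, s2_yields):
    """Combine per-sample activation-energy distributions defined on a
    common Ea grid, weighting each sample by its Rock-Eval S2 yield."""
    w = np.asarray(s2_yields, dtype=float)
    w /= w.sum()                                       # normalise the weights
    combined = np.average(distributions, axis=0, weights=w)
    return combined / combined.sum()                   # renormalise to unit mass

ea = np.arange(195, 235, 5)                     # common Ea grid, kJ/mol
d1 = np.array([0, .1, .3, .4, .2, 0, 0, 0])     # sample A distribution
d2 = np.array([0, 0, .1, .2, .4, .2, .1, 0])    # sample B distribution
print(weighted_kinetics([d1, d2], s2_yields=[12.0, 4.0]))   # A dominates
```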

  4. Isotopes, Inventories and Seasonality: Unraveling Methane Source Distribution in the Complex Landscapes of the United Kingdom.

    NASA Astrophysics Data System (ADS)

    Lowry, D.; Fisher, R. E.; Zazzeri, G.; Lanoisellé, M.; France, J.; Allen, G.; Nisbet, E. G.

    2017-12-01

    Unlike the big open landscapes of many continents, with large area sources dominated by one particular methane emission type that can be isotopically characterized by flight measurements and sampling, the complex patchwork of urban, fossil and agricultural methane sources across NW Europe requires detailed ground surveys for characterization (Zazzeri et al., 2017). Here we outline the findings from multiple seasonal urban and rural measurement campaigns in the United Kingdom. These surveys aim to: 1) Assess source distribution and baseline in regions of planned fracking, and relate these to on-site continuous baseline climatology. 2) Characterize spatial and seasonal differences in the isotopic signatures of the UNFCCC source categories, and 3) Assess the spatial validity of the 1 x 1 km UK inventory for large continuous emitters, proposed point sources, and seasonal / ephemeral emissions. The UK inventory suggests that 90% of methane emissions are from 3 source categories: ruminants, landfill and gas distribution. Bag sampling and GC-IRMS δ13C analysis shows that landfill gives a constant signature of -57 ± 3 ‰ throughout the year. Fugitive gas emissions are consistent regionally, depending on the North Sea supply regions feeding the network (-41 ± 2 ‰ in N England, -37 ± 2 ‰ in SE England). Ruminant, mostly cattle, emissions are far more complex, as these spend winters in barns and summers in fields, but are essentially a mix of 2 end members, breath at -68 ± 3 ‰ and manure at -51 ± 3 ‰, resulting in broad summer field emission plumes of -64 ‰ and point winter barn emission plumes of -58 ‰. The inventory correctly locates emission hotspots from landfill, larger sewage treatment plants and gas compressor stations, giving a broad overview of emission distribution for regional model validation. Mobile surveys are adding an extra layer of detail to this which, combined with isotopic characterization, has identified the spatial distribution of gas pipe leaks, some persisting since 2013 (Zazzeri et al., 2015), and the seasonality and spatial variability of livestock emissions. Importantly, existing significant gas leaks close to proposed fracking sites have been characterized, so that any emissions to atmosphere with a different isotopic signature will be detected. Zazzeri, G., Atm. Env. 110, 151-162 (2015); Zazzeri, G., Sci. Rep. 7, 4854 (2017).
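
    The cattle end members define a simple two-component isotope mass balance, and the quoted plume signatures follow directly from it. A quick check (the breath/manure mixing fractions are inferred here for illustration, not reported values):

```python
def bulk_delta(f_breath, d_breath=-68.0, d_manure=-51.0):
    """Flux-weighted mean delta13C (permil) of a breath/manure mixture,
    using the end-member values quoted in the abstract."""
    return f_breath * d_breath + (1.0 - f_breath) * d_manure

print(bulk_delta(0.76))   # ~ -64 permil: summer field plumes, mostly breath
print(bulk_delta(0.41))   # ~ -58 permil: winter barn plumes, more manure
```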

  5. Volume and surface area size distribution, water mass and model fitting of GCE/CASE/WATOX marine aerosols

    NASA Astrophysics Data System (ADS)

    Kim, Y.; Sievering, H.; Boatman, J.

    1990-06-01

    As a part of the Global Change Expedition/Coordinated Air-Sea Experiment/Western Atlantic Ocean Experiment (GCE/CASE/WATOX), size distributions of marine aerosols were measured at two altitudes of about 2750 and 150 m above sea level (asl) over the size range 0.1-32 μm. Lognormal fitting was applied to the corrected aerosol size spectra to determine the volume and surface area size distributions of the CASE-WATOX marine aerosols. Each aerosol size distribution was fitted with three lognormal distributions representing fine-, large-, and giant-particle modes. The water volume fraction and dry particle size of each aerosol size distribution were also calculated using empirical formulas for particle size as a function of relative humidity and particle type. Because of the increased influence from anthropogenic sources in the continental United States, higher aerosol volume concentrations were observed in the fine-particle mode nearshore off the east coast (2.11 and 3.63 μm3 cm-3 for the free troposphere (FT) and marine boundary layer (MBL), respectively), compared with the open-sea Bermuda area values (0.13 and 0.74 μm3 cm-3 for FT and MBL). The large-particle mode exhibits the least variation in volume distributions between the east coast and the open-sea Bermuda area, having a volume geometric median diameter (VGMD) between 1.4 and 1.6 μm and a geometric standard deviation between 1.57 and 1.68. For the giant-particle mode, larger VGMDs and volume concentrations were observed for marine aerosols nearshore off the east coast than in the open-sea Bermuda area because of higher relative humidity and higher surface wind speed conditions. Wet VGMDs and aerosol water volume concentrations at the 15 m asl ship level were determined by extrapolating from those obtained by analysis of the CASE-WATOX aircraft aerosol data. The abundance of aerosol water in the MBL serves as an important pathway for heterogeneous conversion of SO2 in sea salt aerosol particles.
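
    Each lognormal mode contributes a term of the form dV/dlog10(D) = V0 exp(-(log10(D/VGMD))^2 / (2 log10^2 σg)) / (sqrt(2π) log10 σg), and the fitted spectrum is the sum of the three modes. A sketch of such a trimodal volume distribution (the large-particle mode uses parameters close to those quoted above; the fine and giant modes are illustrative):

```python
import numpy as np

def lognormal_mode(d, v_total, vgmd, sigma_g):
    """dV/dlog10(D) of one lognormal volume mode.

    d, vgmd in um; v_total in um^3 cm^-3; sigma_g is dimensionless."""
    s = np.log10(sigma_g)
    return (v_total / (np.sqrt(2.0 * np.pi) * s)
            * np.exp(-np.log10(d / vgmd) ** 2 / (2.0 * s**2)))

d = np.logspace(np.log10(0.1), np.log10(32.0), 200)   # measured size range, um
dV = (lognormal_mode(d, 0.7, 0.3, 1.8)     # fine mode (illustrative)
      + lognormal_mode(d, 2.0, 1.5, 1.6)   # large mode (VGMD, sigma_g as quoted)
      + lognormal_mode(d, 4.0, 8.0, 2.1))  # giant mode (illustrative)
print(d[np.argmax(dV)])                    # diameter of the dominant peak, um
```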

  6. A review of dicarboxylic acids and related compounds in atmospheric aerosols: Molecular distributions, sources and transformation

    NASA Astrophysics Data System (ADS)

    Kawamura, Kimitaka; Bikkina, Srinivas

    2016-03-01

    This review aims to update our understanding of the molecular distributions of water-soluble dicarboxylic acids and related compounds in atmospheric aerosols, with a focus on their geographical variability, size distribution, sources and formation pathways. In general, molecular distributions of diacids in aerosols from continental sites and over open ocean waters are often characterized by the predominance of oxalic acid (C2) followed by malonic acid (C3) and/or succinic acid (C4), while those sampled over the polar regions often follow the order C4 ≥ C2 and C3. The most abundant and ubiquitous diacid is oxalic acid, which is principally formed via atmospheric oxidation of its higher homologues of long-chain diacids and other pollution-derived organic precursors (e.g., olefins and aromatic hydrocarbons). However, its occurrence in marine aerosols is mainly due to transport from continental outflows (e.g., the East Asian outflow during winter/spring to the North Pacific) and/or governed by photochemical/aqueous-phase oxidation of biogenic unsaturated fatty acids (e.g., oleic acid) and isoprene emitted from productive open ocean waters. The long-range atmospheric transport of pollutants from mid latitudes to the Arctic in dark winter facilitates the accumulation of reactants prior to their intense photochemical oxidation during the springtime polar sunrise. Furthermore, the relative abundances of C2 in total diacid mass showed temporal trends similar to those of downward solar irradiation and ambient temperatures, suggesting the significance of atmospheric photochemical processing. Compound-specific isotopic analyses showed that oxalic acid has the highest δ13C among diacids, whereas azelaic acid showed the lowest value, corroborating the significance of atmospheric aging of oxalic acid. Other diacids gave intermediate values between these two, suggesting that the aging of oxalic acid is associated with 13C enrichment.

  7. [Example of product development by industry and research solidarity].

    PubMed

    Seki, Masayoshi

    2014-01-01

    When industrial firms develop a product, it is valuable to use research results from research institutions and to reflect ideas from users in the developed product. Taking a jointly developed software product as an example, we describe the development technique adopted and its results, and consider industry-research collaboration from the company's perspective. Each software development method has merits and demerits, and it is necessary to choose the optimal technique for the system being developed. We jointly developed dose distribution browsing software, adopting the prototype model as the development method. In order to display dose distribution information, it is necessary to load four objects (CT-Image, Structure Set, RT-Plan, and RT-Dose) and display them in a composite manner. The prototype model adopted for this joint development proved particularly well suited to developing the dose distribution browsing software. In a prototype model, since the detailed design is created from the program source code after the program is completed, it shortens the documentation period and keeps design and implementation consistent. This software was eventually released to the public as open source. Based on this prototype, the release version of the dose distribution browsing software was developed. Developing this type of novel software normally takes two to three years, but the joint development shortened the development period to one year. Shortening the development period kept the company's development cost to a minimum, which will be reflected in the product price. Specialists who make requests on the product from the user's point of view are important, and an increase in such specialists as professionals for product development will raise the expectation of developing products that meet users' demands.

  8. An Open Source Extensible Smart Energy Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rankin, Linda

    Aggregated distributed energy resources are the subject of much interest in the energy industry and are expected to play an important role in meeting our future energy needs by changing how we use, distribute and generate electricity. This energy future includes an increased amount of energy from renewable resources, load management techniques to improve resiliency and reliability, and distributed energy storage and generation capabilities that can be managed to meet the needs of the grid as well as individual customers. These energy assets are commonly referred to as Distributed Energy Resources (DER). DERs rely on a means to communicate information between an energy provider and multitudes of devices. Today, DER control systems are typically vendor-specific, using custom hardware and software solutions. As a result, customers are locked into communication transport protocols, applications, tools, and data formats. Today's systems are often difficult to extend to meet new application requirements, resulting in stranded assets when business requirements or energy management models evolve. By partnering with industry advisors and researchers, a DER research implementation platform was developed, called the Smart Energy Framework (SEF). The hypothesis of this research was that an open source Internet of Things (IoT) framework could play a role in creating a commodity-based eco-system for DER assets that would reduce costs and provide interoperable products. SEF is based on the AllJoyn IoT open source framework. The demonstration system incorporated DER assets, specifically batteries and smart water heaters. To verify the behavior of the distributed system, models of water heaters and batteries were also developed. An IoT interface for communicating between the assets and a control server was defined. This interface supports a series of "events" and telemetry reporting, similar to those defined by current smart grid communication standards. The results of this effort demonstrated the feasibility and application potential of using IoT frameworks for the creation of commodity-based DER systems. All of the identified commodity-based system requirements were met by the AllJoyn framework. By having commodity solutions, small vendors can enter the market and the cost of implementation for all parties is reduced. Utilities and aggregators can choose from multiple interoperable products, reducing the risk of stranded assets. Based on this research, it is recommended that interfaces based on existing smart grid communication protocol standards be created for these emerging IoT frameworks. These interfaces should be standardized as part of the IoT framework, allowing for interoperability testing and certification. Similarly, IoT frameworks are introducing application-level security. This type of security is needed for protecting applications and platforms and will be important moving forward. It is recommended that, along with DER-based data model interfaces, platform and application security requirements also be prescribed when IoT devices support DER applications.

  9. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  10. Open Source 2010: Reflections on 2007

    ERIC Educational Resources Information Center

    Wheeler, Brad

    2007-01-01

    Colleges and universities and commercial firms have demonstrated great progress in realizing the vision proffered for "Open Source 2007," and 2010 will mark even greater progress. Although much work remains in refining open source for higher education applications, the signals are now clear: the collaborative development of software can provide…

  11. Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments

    ERIC Educational Resources Information Center

    Wang, Shuo; Wang, Jing; Gao, Yanjing

    2017-01-01

    An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…

  12. Creating Open Source Conversation

    ERIC Educational Resources Information Center

    Sheehan, Kate

    2009-01-01

    Darien Library, where the author serves as head of knowledge and learning services, launched a new website on September 1, 2008. The website is built with Drupal, an open source content management system (CMS). In this article, the author describes how she and her colleagues overhauled the library's website to provide an open source content…

  13. Integrating an Automatic Judge into an Open Source LMS

    ERIC Educational Resources Information Center

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. Although it was originally designed for programming competitions, Mooshak has also…

  14. 76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...

  15. The open-source movement: an introduction for forestry professionals

    Treesearch

    Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove

    2005-01-01

    In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....

  16. Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.

    ERIC Educational Resources Information Center

    Newby, Gregory B.; Greenberg, Jane; Jones, Paul

    2003-01-01

    Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)

  17. Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat.

    PubMed

    Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart

    2015-04-21

    Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has been gaining attention in recent years, after proving clinically more effective than conventional DBS in controlling pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as the system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as the controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between the input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation.
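
    The control logic is simple enough to sketch: estimate theta-band power from the latest LFP window and gate stimulation on a threshold. The Python below illustrates that loop; it is not the authors' Arduino firmware, and the sampling rate, band and threshold are assumptions:

```python
import numpy as np

def theta_power(lfp, fs=1000.0, band=(4.0, 12.0)):
    """Fraction of LFP spectral power in the theta band (FFT estimate)."""
    spec = np.abs(np.fft.rfft(lfp)) ** 2
    freqs = np.fft.rfftfreq(len(lfp), 1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spec[in_band].sum() / spec.sum()

def control_step(lfp_window, threshold=0.3):
    """One loop iteration: stimulate the mRt only while theta power
    (a locomotion proxy) exceeds the threshold."""
    return theta_power(lfp_window) > threshold

rng = np.random.default_rng(3)
t = np.arange(1000) / 1000.0
window = np.sin(2 * np.pi * 8.0 * t) + 0.5 * rng.normal(size=t.size)
print(control_step(window))   # strong 8 Hz theta -> stimulate (True)
```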

  18. Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat

    PubMed Central

    Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart

    2015-01-01

    Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has been gaining attention in recent years, after proving clinically more effective than conventional DBS in controlling pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as the system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as the controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between the input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation. PMID:25897892

  19. Open Source and ROI: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…

  20. A new software for deformation source optimization, the Bayesian Earthquake Analysis Tool (BEAT)

    NASA Astrophysics Data System (ADS)

    Vasyura-Bathke, H.; Dutta, R.; Jonsson, S.; Mai, P. M.

    2017-12-01

    Modern studies of crustal deformation and the related source estimation, including magmatic and tectonic sources, increasingly use non-linear optimization strategies to estimate geometric and/or kinematic source parameters, often considering geodetic and seismic data jointly. Bayesian inference is increasingly being used for estimating posterior distributions of deformation source model parameters, given measured/estimated/assumed data and model uncertainties. For instance, some studies consider uncertainties of a layered medium and propagate these into source parameter uncertainties, while others use informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed to efficiently explore high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational burden of these methods is high, and estimation codes are rarely made available along with the published results. Even if the codes are accessible, it is usually challenging to assemble them into a single optimization framework, as they are typically written in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results has become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in deformation source estimation, we undertook the effort of developing BEAT, a Python package that comprises all the above-mentioned features in a single programming environment. The package builds on the pyrocko seismological toolbox (www.pyrocko.org) and uses the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat), and we encourage and solicit contributions to the project. Here, we present our strategy for developing BEAT and show application examples, especially the effect of including the model prediction uncertainty of the velocity model in subsequent source optimizations: full moment tensor, Mogi source, and a moderate strike-slip earthquake.
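
    As a flavor of this kind of Bayesian source estimation, the sketch below fits a Mogi point source to synthetic vertical displacements with pymc3, the same sampler library BEAT builds on. It is a minimal illustration under assumed priors and noise, not BEAT's own model code:

```python
import numpy as np
import pymc3 as pm

nu = 0.25                                   # Poisson's ratio
r = np.linspace(500.0, 10000.0, 40)         # station distances from source (m)

# Synthetic "observed" uplift from a true Mogi source (dV=5e6 m^3, d=3 km).
uz_true = (1 - nu) * 5e6 / np.pi * 3000.0 / (r**2 + 3000.0**2) ** 1.5
obs = uz_true + np.random.default_rng(0).normal(0.0, 2e-3, r.size)

with pm.Model():
    dv = pm.Uniform("dv", 1e5, 5e7)               # volume change prior (m^3)
    depth = pm.Uniform("depth", 500.0, 10000.0)   # source depth prior (m)
    mu = (1 - nu) * dv / np.pi * depth / (r**2 + depth**2) ** 1.5
    pm.Normal("uz", mu=mu, sigma=2e-3, observed=obs)   # Gaussian likelihood
    trace = pm.sample(1000, tune=1000, chains=2, cores=1)

print(trace["depth"].mean())                # posterior mean depth, near 3000 m
```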

  1. Quantarctica: A Unique, Open, Standalone GIS Package for Antarctic Research and Education

    NASA Astrophysics Data System (ADS)

    Roth, G.; Matsuoka, K.; Skoglund, A.; Melvaer, Y.; Tronstad, S.

    2016-12-01

    The Norwegian Polar Institute has developed Quantarctica, an open GIS package for use by the international Antarctic community. Quantarctica includes a wide range of cartographic basemap layers, geophysical and glaciological datasets, and satellite imagery in standardized file formats with a consistent Antarctic map projection and customized layer and labeling styles for quick, effective cartography. Quantarctica's strengths as an open science platform lie in 1) The complete, ready-to-use data package which includes full-resolution, original-quality vector and raster data, 2) A policy for freely-redistributable and modifiable data including all metadata and citations, and 3) QGIS, a free, full-featured, modular, offline-capable open-source GIS suite with a rapid and active development and support community. The Quantarctica team is actively seeking new contributions of peer-reviewed, freely distributable pan-Antarctic geospatial datasets for the next version release in 2017. As part of this ongoing development, we are investigating the best approaches for quickly and seamlessly distributing new and updated data to users, storing datasets in efficient file formats while maintaining full quality, and coexisting with numerous online data portals in a way that most actively benefits the Antarctic community. A recent survey of Quantarctica users showed broad geographical adoption among Antarctic Treaty countries, including those outside the large US and UK Antarctic programs. Maps and figures produced by Quantarctica have also appeared in open-access journals and outside of the formal scientific community on popular science and GIS blogs. Our experience with the Quantarctica project has shown the tremendous value of education and outreach, not only in promoting open software, data formats, and practices, but in empowering Antarctic science groups to more effectively use GIS and geospatial data. Open practices are making a huge impact in Antarctic GIS, where individual countries have historically maintained their own restricted Antarctic geodatabases and where a majority of the next generation of scientists are entering the field with experience in using geospatial thinking for planning, visualization, and problem solving.

  2. Estimating extinction using unsupervised machine learning

    NASA Astrophysics Data System (ADS)

    Meingast, Stefan; Lombardi, Marco; Alves, João

    2017-05-01

    Dust extinction is the most robust tracer of the gas distribution in the interstellar medium, but measuring extinction is limited by the systematic uncertainties involved in estimating the intrinsic colors of background stars. In this paper we present a new technique, Pnicer, that estimates intrinsic colors and extinction for individual stars using unsupervised machine learning algorithms. This new method aims to be free from any priors with respect to the column density and intrinsic color distribution. It is applicable to any combination of parameters and works in arbitrary numbers of dimensions. Furthermore, it is not restricted to color space. Extinction toward single sources is determined by fitting Gaussian mixture models along the extinction vector to (extinction-free) control field observations. In this way it becomes possible to describe the extinction for observed sources with probability densities, rather than a single value. Pnicer effectively eliminates known biases found in similar methods and outperforms them in cases of deep observational data where the number of background galaxies is significant, or when a large number of parameters is used to break degeneracies in the intrinsic color distributions. This new method remains computationally competitive, making it possible to correctly de-redden millions of sources within a matter of seconds. With the ever-increasing number of large-scale high-sensitivity imaging surveys, Pnicer offers a fast and reliable way to efficiently calculate extinction for arbitrary parameter combinations without prior information on source characteristics. The Pnicer software package also offers access to the well-established Nicer technique in a simple unified interface and is capable of building extinction maps including the Nicest correction for cloud substructure. Pnicer is offered to the community as an open-source software solution and is entirely written in Python.
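
    The core idea can be sketched with an off-the-shelf Gaussian mixture: fit the control-field colors, then slide each science-field source back along the extinction vector and read off the mixture likelihood as a function of extinction. This illustration uses scikit-learn rather than the Pnicer package itself, and the color distributions and extinction vector are invented:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Control field: intrinsic (J-H, H-K) colors, extinction-free by construction.
control = rng.multivariate_normal([0.5, 0.15],
                                  [[0.010, 0.004], [0.004, 0.004]], 2000)
gmm = GaussianMixture(n_components=3).fit(control)   # intrinsic color model

# Unit extinction vector in this color-color space (illustrative slope).
ext_vec = np.array([1.0, 0.6]) / np.hypot(1.0, 0.6)

def extinction_pdf(source_color, a_grid):
    """Probability density of extinction for one science-field source:
    de-redden along the extinction vector, evaluate the control-field
    mixture, and normalise the result over the extinction grid."""
    dereddened = source_color - np.outer(a_grid, ext_vec)
    logp = gmm.score_samples(dereddened)
    p = np.exp(logp - logp.max())
    return p / np.trapz(p, a_grid)

a = np.linspace(0.0, 3.0, 301)
reddened = np.array([0.5, 0.15]) + 1.2 * ext_vec     # source behind A ~ 1.2
print(a[np.argmax(extinction_pdf(reddened, a))])     # peaks near 1.2
```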

  3. Open source drug discovery--a new paradigm of collaborative research in tuberculosis drug development.

    PubMed

    Bhardwaj, Anshu; Scaria, Vinod; Raghava, Gajendra Pal Singh; Lynn, Andrew Michael; Chandra, Nagasuma; Banerjee, Sulagna; Raghunandanan, Muthukurussi V; Pandey, Vikas; Taneja, Bhupesh; Yadav, Jyoti; Dash, Debasis; Bhattacharya, Jaijit; Misra, Amit; Kumar, Anil; Ramachandran, Srinivasan; Thomas, Zakir; Brahmachari, Samir K

    2011-09-01

    It is being realized that the traditional closed-door and market-driven approaches to drug discovery may not be the best suited model for diseases of the developing world such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for patients suffering from these diseases, it is necessary to formulate an alternative paradigm for the drug discovery process. The current model, constrained by limitations on collaboration and on the sharing of resources under confidentiality, hampers the opportunities for bringing in expertise from diverse fields. These limitations hinder the possibilities of lowering the cost of drug discovery. The Open Source Drug Discovery project initiated by the Council of Scientific and Industrial Research, India has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open sharing, multi-faceted approaches and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives in the continued pursuit of new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trials in a non-exclusive manner by the participation of multiple companies, with majority funding from Open Source Drug Discovery. This will ensure the availability of drugs through a lower-cost, community-driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully, what Linux and the World Wide Web have done for information technology, Open Source Drug Discovery will do for drug discovery. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Itzï (version 17.1): an open-source, distributed GIS model for dynamic flood simulation

    NASA Astrophysics Data System (ADS)

    Courty, Laurent Guillaume; Pedrozo-Acuña, Adrián; Bates, Paul David

    2017-05-01

    Worldwide, floods are acknowledged as one of the most destructive hazards. In human-dominated environments, their negative impacts are ascribed not only to the increase in frequency and intensity of floods but also to a strong feedback between the hydrological cycle and anthropogenic development. In order to advance a more comprehensive understanding of this complex interaction, this paper presents the development of a new open-source tool named Itzï that enables the 2-D numerical modelling of rainfall-runoff processes and surface flows integrated with the open-source geographic information system (GIS) software known as GRASS. It therefore takes advantage of the ability of GIS environments to handle datasets with variations in both temporal and spatial resolutions. Furthermore, the presented numerical tool can handle datasets from different sources with varied spatial resolutions, facilitating the preparation and management of input and forcing data and reducing the preprocessing time usually required by other models. Itzï uses a simplified form of the shallow water equations, the damped partial inertia equation, for the resolution of surface flows, and the Green-Ampt model for infiltration. The source code is now publicly available online, along with complete documentation. The numerical model is verified against three different test cases: firstly, a comparison with an analytic solution of the shallow water equations is introduced; secondly, a hypothetical flooding event in an urban area is implemented, where results are compared to those from an established model using a similar approach; and lastly, the reproduction of a real inundation event that occurred in the city of Kingston upon Hull, UK, in June 2007, is presented. The numerical approach proved able to reproduce the analytic and synthetic test cases. Moreover, simulation results for the real flood event showed the model's suitability for identifying areas affected by flooding, which were verified against those recorded after the event by local authorities.
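
    Of the two process models named above, the infiltration component is simple enough to sketch. The snippet below implements the textbook Green-Ampt relation f = K_s(1 + ψΔθ/F) under assumed soil parameters; it is a generic illustration of the scheme the abstract names, not code from Itzï.

    ```python
    # Minimal Green-Ampt sketch; K_s, psi, and d_theta are assumed values.
    def green_ampt_step(F, rain_rate, dt, K_s=5e-6, psi=0.1, d_theta=0.3):
        """Advance cumulative infiltration F [m] by one step of length dt [s].

        K_s     saturated hydraulic conductivity [m/s]
        psi     wetting-front suction head [m]
        d_theta soil moisture deficit [-]
        Returns (new F, actual infiltration rate [m/s]).
        """
        # Infiltration capacity decreases as the wetted depth F grows.
        capacity = K_s * (1.0 + psi * d_theta / max(F, 1e-9))
        rate = min(capacity, rain_rate)   # cannot infiltrate more than supply
        return F + rate * dt, rate

    F = 0.0
    for step in range(60):                # one hour of 10 mm/h rain, 60 s steps
        F, f = green_ampt_step(F, rain_rate=10e-3 / 3600.0, dt=60.0)
    print(f"infiltrated depth after 1 h: {F * 1000:.2f} mm")
    ```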

  5. Mapping, Monitoring, and Modeling Geomorphic Processes to Identify Sources of Anthropogenic Sediment Pollution in West Maui, Hawai'i

    NASA Astrophysics Data System (ADS)

    Cerovski-Darriau, C.; Stock, J. D.; Winans, W. R.

    2016-12-01

    Episodic storm runoff in West Maui (Hawai'i) brings plumes of terrestrially-sourced fine sediment to the nearshore ocean environment, degrading coral reef ecosystems. The sediment pollution sources were largely unknown, though suspected to be due to modern human disturbance of the landscape, and initially assumed to be from visibly obvious exposed soil on agricultural fields and unimproved roads. To determine the sediment sources and estimate a sediment budget for the West Maui watersheds, we mapped the geomorphic processes in the field and from DEMs and orthoimagery, monitored erosion rates in the field, and modeled the sediment flux using the mapped processes and corresponding rates. We found the primary source of fine sands, silts, and clays to be previously unidentified fill terraces along the stream beds. These terraces, formed during legacy agricultural activity, form the banks along 40-70% of the streams where the channels intersect human-modified landscapes. Monitoring over the last year shows that a few storms erode the fill terraces by 10-20 mm annually, contributing up to hundreds of tonnes of sediment per catchment. Compared to the average long-term, geologic erosion rate of 0.03 mm/yr, these fill terraces alone increase the suspended sediment flux to the coral reefs by 50-90%. Stakeholders can use our resulting geomorphic process map and sediment budget to inform the location and type of mitigation effort needed to limit terrestrial sediment pollution. We compare our mapping, monitoring, and modeling (M3) approach to NOAA's OpenNSPECT model. OpenNSPECT uses empirical hydrologic and soil erosion models paired with land cover data to compare the spatially distributed sediment yield from different land-use scenarios. We determine the relative effectiveness of calculating a baseline watershed sediment yield with each approach, and the utility of calibrating OpenNSPECT with M3 results to better forecast future sediment yields under land-use or climate change scenarios.
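
    As a rough plausibility check on those figures, the arithmetic linking bank retreat to catchment sediment yield can be written out directly; the terrace height, stream length, and bulk density below are assumed illustrative values, not measurements from the study.

    ```python
    # Back-of-the-envelope check: bank retreat of 10-20 mm/yr along mapped
    # fill terraces converted to tonnes per catchment. All inputs assumed.
    def terrace_yield_tonnes(retreat_m=0.015,   # 10-20 mm/yr midpoint
                             height_m=2.0,      # assumed exposed bank height
                             length_m=5_000.0,  # assumed terrace-lined length
                             density=1.3e3):    # assumed bulk density, kg/m^3
        volume = retreat_m * height_m * length_m   # m^3 eroded per year
        return volume * density / 1000.0           # tonnes per year

    print(f"~{terrace_yield_tonnes():.0f} t/yr")  # order of 100s of tonnes
    ```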

  6. State-of-the-practice and lessons learned on implementing open data and open source policies.

    DOT National Transportation Integrated Search

    2012-05-01

    This report describes the current government, academic, and private sector practices associated with open data and open source application development. These practices are identified, and the potential uses with the ITS Program's Data Capture and M...

  7. Environmental Information Management For Data Discovery and Access System

    NASA Astrophysics Data System (ADS)

    Giriprakash, P.

    2011-01-01

    Mercury is a federated metadata harvesting, search and retrieval tool based on both open source software and software developed at Oak Ridge National Laboratory. It was originally developed for NASA, and the Mercury development consortium now includes funding from NASA, USGS, and DOE. A major new version of Mercury was developed during 2007 and released in early 2008. This new version provides orders-of-magnitude improvements in search speed, support for additional metadata formats, integration with Google Maps for spatial queries, support for RSS delivery of search results, and ready customization to meet the needs of the multiple projects which use Mercury. For the end users, Mercury provides a single portal to very quickly search for data and information contained in disparate data management systems. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces then allow the users to perform simple, fielded, spatial, and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data.
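
    The harvest-then-index pattern described here is straightforward to sketch. The toy below pulls JSON metadata records from a list of hypothetical provider endpoints into one central index and runs a fielded search against it locally; Mercury's actual harvesters, metadata formats, and index are far more capable.

    ```python
    # Toy federated-harvest sketch; the endpoint URLs and record fields are
    # hypothetical illustrations, not Mercury's protocol.
    import json
    from urllib.request import urlopen

    PROVIDERS = [
        "https://example.org/site-a/metadata.json",
        "https://example.org/site-b/metadata.json",
    ]

    def harvest(providers):
        """Pull metadata records from each provider into one central list."""
        index = []
        for url in providers:
            with urlopen(url) as resp:
                records = json.load(resp)   # assume a JSON list of records
            for rec in records:
                rec["_source"] = url        # data stays at the provider;
                index.append(rec)           # only metadata is centralized
        return index

    def fielded_search(index, field, term):
        """Simple fielded search over the harvested central index."""
        return [r for r in index if term.lower() in str(r.get(field, "")).lower()]

    # index = harvest(PROVIDERS)
    # hits = fielded_search(index, "title", "soil moisture")
    ```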

  8. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  9. All-source Information Management and Integration for Improved Collective Intelligence Production

    DTIC Science & Technology

    2011-06-01

    Intelligence (ELINT) • Open Source Intelligence (OSINT) • Technical Intelligence (TECHINT). These intelligence disciplines produce... intelligence, measurement and signature intelligence, signals intelligence, and open-source data, in the production of intelligence. All-source intelligence... All-Source Information Integration and Management) R&D Project 3: All-Source Intelligence

  10. iSpy: a powerful and lightweight event display

    NASA Astrophysics Data System (ADS)

    Alverson, G.; Eulisse, G.; McCauley, T.; Taylor, L.

    2012-12-01

    iSpy is a general-purpose event data and detector visualization program that was developed as an event display for the CMS experiment at the LHC and has also been used by the general public and by teachers and students in education and outreach. Central to the iSpy design philosophy is ease of installation, use, and extensibility. The application itself uses the open-access packages Qt4 and Open Inventor and is distributed either as a fully-bound executable or a standard installer package: one can simply download and double-click to begin. Mac OS X, Linux, and Windows are supported. iSpy renders the standard 2D, 3D, and tabular views, and the architecture allows for a generic approach to the production of new views and projections. iSpy reads and displays data in the ig format: event information is written in compressed JSON-format files designed for distribution over a network. This format is easily extensible and makes the iSpy client indifferent to the original input data source. The ig format is the one used for release of approved CMS data to the public.
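
    A container of compressed per-event JSON, as described, can be sketched in a few lines. The archive layout and event keys below are assumptions for illustration only, not the actual ig specification.

    ```python
    # Sketch of an ig-style container: per-event JSON stored compressed in
    # a single archive. File layout and keys are assumed, not CMS's spec.
    import json
    import zipfile

    events = [
        {"run": 1, "event": 42, "tracks": [{"pt": 12.3, "eta": 0.5, "phi": 1.1}]},
        {"run": 1, "event": 43, "tracks": []},
    ]

    # Write each event as a compressed JSON member of one archive.
    with zipfile.ZipFile("demo.ig", "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for ev in events:
            name = f"Events/Run_{ev['run']}/Event_{ev['event']}"  # assumed layout
            zf.writestr(name, json.dumps(ev))

    # A reader needs no knowledge of the original data source, only the format.
    with zipfile.ZipFile("demo.ig") as zf:
        for name in zf.namelist():
            print(name, json.loads(zf.read(name))["event"])
    ```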

  11. ADFNE: Open source software for discrete fracture network engineering, two and three dimensional applications

    NASA Astrophysics Data System (ADS)

    Fadakar Alghalandis, Younes

    2017-05-01

    A rapidly growing topic, discrete fracture network engineering (DFNE) has already attracted many talents from diverse disciplines in academia and industry around the world to challenge difficult problems related to mining, geothermal, civil, oil and gas, water and many other projects. Although a few commercial software packages provide some of the functionality fundamental to DFNE, their cost, closed-source (black box) distribution, and hence limited programmability and tractability encouraged us to respond to this rising demand with a new solution. This paper introduces a comprehensive open source software package for stochastic modeling of fracture networks in two and three dimensions in a discrete formulation. Functionalities included are geometric modeling (e.g., complex polygonal fracture faces, and the use of directional statistics), simulations, characterizations (e.g., intersection, clustering and connectivity analyses) and applications (e.g., fluid flow). The package is written entirely in the Matlab scripting language. Significant effort has been made to bring maximum flexibility to the functions in order to solve problems in both two and three dimensions in an easy and unified way that suits beginner, advanced and experienced users alike.
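
    The kind of workflow ADFNE automates can be caricatured in two dimensions: generate a stochastic fracture set and count pairwise intersections as a first connectivity measure. The sketch below uses NumPy rather than ADFNE's Matlab API, and the distributions and parameters are illustrative assumptions.

    ```python
    # 2-D toy DFN: Poisson-like centers, isotropic orientations, Pareto-tailed
    # lengths, then a pairwise intersection count. All parameters assumed.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    centers = rng.uniform(0, 10, size=(n, 2))              # 10x10 domain
    angles = rng.uniform(0, np.pi, size=n)                 # isotropic orientations
    lengths = 0.2 * (1 - rng.uniform(size=n)) ** (-1 / 1.5)  # power-law lengths

    half = np.column_stack([np.cos(angles), np.sin(angles)]) * (lengths / 2)[:, None]
    p1, p2 = centers - half, centers + half                # segment endpoints

    def segments_intersect(a1, a2, b1, b2):
        """Standard 2-D orientation test for proper segment intersection."""
        def cross(o, p, q):
            return (p[0] - o[0]) * (q[1] - o[1]) - (p[1] - o[1]) * (q[0] - o[0])
        return (cross(a1, a2, b1) * cross(a1, a2, b2) < 0 and
                cross(b1, b2, a1) * cross(b1, b2, a2) < 0)

    hits = sum(segments_intersect(p1[i], p2[i], p1[j], p2[j])
               for i in range(n) for j in range(i + 1, n))
    print(f"{hits} intersections among {n} fractures")
    ```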

  12. Tessera: Open source software for accelerated data science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sego, Landon H.; Hafen, Ryan P.; Director, Hannah M.

    2014-06-30

    Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the “front-lines” of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches—and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack extensibility and the flexibility required to address the nuances of the demanding and dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.

  13. The cretaceous source rocks in the Zagros Foothills of Iran: An example of a large size intracratonic basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bordenave, M.L.; Huc, A.Y.

    1993-02-01

    The Zagros orogenic belt of Iran is one of the world's most prolific petroleum-producing areas. However, most of the oil production originates from a relatively small area, the 60,000 km² Dezful Embayment, which contains approximately 12% of proven global oil reserves. The distribution of the oil and gas fields results from the areal extent of six identified source rock layers, their thermal history, and the availability of reservoirs, cap rocks, and traps. In this paper, the emphasis is on three layers of Cretaceous source rocks: the Garau facies, deposited during the Neocomian to Albian interval over Lurestan and Northeast Khuzestan and extending over the extreme northeast part of Fars; the Kazhdumi source rock, deposited over the Dezful Embayment; and the Senonian Gurpi Formation, which has marginal source rock characteristics in limited areas of Khuzestan and Northern Fars. The depositional environment of these source rock layers corresponds to semipermanent depressions within an overall shallow-water intracratonic basin communicating with the South Tethys Ocean. These depressions became anoxic when climatic, oceanographic, and geological conditions were adequate, i.e., a humid climate, high-stand waters, influxes of fine-grained clastics, and the existence of sills separating the depression from the open sea. Distribution maps of these source rock layers, resulting from extensive field work and well control, are also given. The maturation history of the source rocks is reconstructed from a set of isopachs. The main contributor to the oil reserves was found to be the Kazhdumi source rock, which is associated with excellent calcareous reservoirs.

  14. Chemical Imaging Analysis of Environmental Particles Using the Focused Ion Beam/Scanning Electron Microscopy Technique. Microanalysis Insights into Atmospheric Chemistry of Fly Ash

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Haihan; Grassian, Vicki H.; Saraf, Laxmikant V.

    2012-11-08

    Airborne fly ash from coal combustion may represent a source of bioavailable iron (Fe) in the open ocean. However, few studies have focused on Fe speciation and distribution in coal fly ash. In this study, chemical imaging of fly ash was performed using a dual-beam FIB/SEM (focused ion beam/scanning electron microscope) system to better understand how simulated atmospheric processing modifies the morphology, chemical composition, and element distribution of individual particles. A novel approach was applied for cross-sectioning fly ash specimens with a FIB in order to explore element distribution within the interior of individual particles. Our results indicate that simulated atmospheric processing causes disintegration of aluminosilicate glass, a dominant material in fly ash particles. Aluminosilicate-phase Fe in the inner core of fly ash particles is more easily mobilized than oxide-phase Fe present as surface aggregates on fly ash spheres. Fe release behavior depends strongly on Fe speciation in aerosol particles. The approach described here for preparing cross-sectioned specimens opens new opportunities for particle microanalysis, particularly with respect to refractory inorganic materials like fly ash and mineral dust.

  15. Real-time sensor validation and fusion for distributed autonomous sensors

    NASA Astrophysics Data System (ADS)

    Yuan, Xiaojing; Li, Xiangshang; Buckles, Bill P.

    2004-04-01

    Multi-sensor data fusion has found widespread application in the industrial and research sectors. The purpose of real-time multi-sensor data fusion is to dynamically estimate an improved system model from a set of different data sources, i.e., sensors. This paper presents a systematic and unified real-time sensor validation and fusion framework (RTSVFF) based on distributed autonomous sensors. The RTSVFF is an open architecture consisting of four layers: the transaction layer, the process fusion layer, the control layer, and the planning layer. This paradigm facilitates distribution of intelligence to the sensor level and sharing of information among sensors, controllers, and other devices in the system. The openness of the architecture also provides a platform for testing different sensor validation and fusion algorithms, and thus facilitates the selection of near-optimal algorithms for specific sensor fusion applications. In the version of the model presented in this paper, confidence-weighted averaging is employed to estimate the dynamic system state: the state is computed using an adaptive estimator and a dynamic validation curve for numeric data fusion, and a robust diagnostic map for decision-level qualitative fusion. The framework is then applied to the automatic monitoring of a gas-turbine engine, including a performance comparison of the proposed real-time sensor fusion algorithms against a traditional numerical weighted average.
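
    Confidence-weighted averaging, the numeric fusion rule named above, reduces to a few lines. In this sketch each sensor's confidence comes from a toy validation curve keyed to the estimator's prediction; the paper's adaptive estimator is more sophisticated, and all values here are illustrative assumptions.

    ```python
    # Minimal confidence-weighted averaging sketch; the validation curve,
    # its scale, and the readings are assumed for illustration.
    import numpy as np

    def fuse(readings, confidences):
        """Confidence-weighted average of simultaneous sensor readings."""
        w = np.asarray(confidences, dtype=float)
        return float(np.dot(w, readings) / w.sum())

    def confidence(reading, prediction, scale=5.0):
        """Toy validation curve: trust decays with distance from prediction."""
        return np.exp(-abs(reading - prediction) / scale)

    prediction = 100.0                      # state predicted by the estimator
    readings = [101.2, 99.7, 137.0]         # third sensor is faulty
    conf = [confidence(r, prediction) for r in readings]
    print(f"fused value: {fuse(readings, conf):.2f}")  # outlier down-weighted
    ```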

  16. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open source solution for grid computing on the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home, whose objective is to simulate a cellular automata random network with at least one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour, and to compare them with the electroencephalographic signals measured in real brains.
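
    The model class described, a random network of excitatory and inhibitory cellular-automaton neurons with probabilistic integrate-and-fire updates, can be sketched as follows; the network size, wiring, and probabilities are illustrative assumptions, not the project's parameters.

    ```python
    # Toy integrate-and-fire cellular automaton on a random network.
    # All sizes and probabilities are assumed illustrative values.
    import numpy as np

    rng = np.random.default_rng(2)
    N, K = 1000, 20                               # neurons, inputs per neuron
    inputs = rng.integers(0, N, size=(N, K))      # random wiring
    sign = np.where(rng.random(N) < 0.8, 1, -1)   # 80% excitatory, 20% inhibitory
    state = (rng.random(N) < 0.05).astype(int)    # initial active fraction

    def step(state, threshold=1, p_fire=0.9):
        """Fire if net signed input meets the threshold, with probability p_fire."""
        drive = (state * sign)[inputs].sum(axis=1)   # summed signed inputs
        fire = (drive >= threshold) & (rng.random(N) < p_fire)
        return fire.astype(int)

    for t in range(100):
        state = step(state)
    print(f"active fraction after 100 steps: {state.mean():.3f}")
    ```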

  17. Quantarctica: A Unique, Open, Standalone GIS Package for Antarctic Research and Education

    NASA Astrophysics Data System (ADS)

    Roth, George; Matsuoka, Kenichi; Skoglund, Anders; Melvær, Yngve; Tronstad, Stein

    2017-04-01

    The Norwegian Polar Institute has developed Quantarctica (http://quantarctica.npolar.no), an open GIS package for use by the international Antarctic community. Quantarctica includes a wide range of cartographic basemap layers, geophysical and glaciological datasets, and satellite imagery in standardized open file formats with a consistent Antarctic map projection and customized layer and labeling styles for quick, effective cartography. Quantarctica's strengths as an open science platform lie in 1) The complete, ready-to-use data package which includes full-resolution, original-quality vector and raster data, 2) A policy for freely-redistributable and modifiable data including all metadata and citations, and 3) QGIS, a free, full-featured, modular, offline-capable open-source GIS suite with a rapid and active development and support community. The Quantarctica team is actively incorporating more up-to-date, peer-reviewed, freely distributable pan-Antarctic geospatial datasets for the next version release in 2017. As part of this ongoing development, we are investigating the best approaches for quickly and seamlessly distributing new and updated data to users, storing datasets in efficient, open file formats while maintaining full data integrity, and coexisting with numerous online data portals in a way that most actively benefits the Antarctic community. A recent survey of Quantarctica users showed broad geographical adoption among Antarctic Treaty countries, including those outside the large US and UK Antarctic programs. Maps and figures produced by Quantarctica have also appeared in open-access journals and outside of the formal scientific community on popular science and GIS blogs. Our experience with the Quantarctica project has shown the tremendous value of education and outreach, not only in promoting open software, data formats, and practices, but in empowering Antarctic science groups to more effectively use GIS and geospatial data. Open practices are making a huge impact in Antarctic GIS, where individual countries have historically maintained their own restricted Antarctic geodatabases and where the next generation of scientists are entering the field with experience in using geospatial thinking for planning, visualization, and problem solving.

  18. Open source EMR software: profiling, insights and hands-on analysis.

    PubMed

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    The use of open source software in health informatics is increasingly advocated by authors in the literature. Although there is no clear evidence of the superiority of the current open source applications in the healthcare field, the number of available open source applications online is growing and they are gaining greater prominence. This repertoire of open source options is of great value for any future planner interested in adopting an electronic medical/health record system, whether selecting an existing application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects. The objective of this study is to provide more comprehensive guidance from an implementer perspective toward the available alternatives of open source healthcare software, particularly in the field of electronic medical/health records. The design of this study is twofold. In the first part, we profile the published literature on a sample of existing, active open source software in the healthcare area. The purpose of this part is to provide a summary of the available guides and studies relative to the sampled systems, and to identify any gaps in the published literature with respect to our research questions. In the second part, we investigate those alternative systems relative to a set of metrics, by actually installing the software and reporting a hands-on experience of the installation process, usability, as well as other factors. The literature covers many aspects of open source software implementation and utilization in healthcare practice. Roughly, those aspects can be distilled into a basic taxonomy, making the literature landscape easier to survey. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit the potential implementers of electronic health/medical records systems. The number of examined systems and the measures by which to compare them vary across studies, but rewarding insights are nevertheless starting to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field still lag far behind the widely acknowledged open source products in other domains, for example in the operating systems market. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. Getting Open Source Software into Schools: Strategies and Challenges

    ERIC Educational Resources Information Center

    Hepburn, Gary; Buley, Jan

    2006-01-01

    In this article Gary Hepburn and Jan Buley outline different approaches to implementing open source software (OSS) in schools; they also address the challenges that open source advocates should anticipate as they try to convince educational leaders to adopt OSS. With regard to OSS implementation, they note that schools have a flexible range of…

  20. Open Source Library Management Systems: A Multidimensional Evaluation

    ERIC Educational Resources Information Center

    Balnaves, Edmund

    2008-01-01

    Open source library management systems have improved steadily in the last five years. They now present a credible option for small to medium libraries and library networks. An approach to their evaluation is proposed that takes account of three additional dimensions that only open source can offer: the developer and support community, the source…
