New Open-Source Version of FLORIS Released | News | NREL
New Open-Source Version of FLORIS Released New Open-Source Version of FLORIS Released January 26 , 2018 National Renewable Energy Laboratory (NREL) researchers recently released an updated open-source simplified and documented. Because of the living, open-source nature of the newly updated utility, NREL
The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software
Ackerman, Michael J.; Yoo, Terry S.
2003-01-01
From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278
Building CHAOS: An Operating System for Livermore Linux Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garlick, J E; Dunlap, C M
2003-02-21
The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some aremore » commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.« less
SolTrace | Concentrating Solar Power | NREL
NREL packaged distribution or from source code at the SolTrace open source project website. NREL Publications Support FAQs SolTrace open source project The code uses Monte-Carlo ray-tracing methodology. The -tracing capabilities. With the release of the SolTrace open source project, the software has adopted
Developing an Open Source Option for NASA Software
NASA Technical Reports Server (NTRS)
Moran, Patrick J.; Parks, John W. (Technical Monitor)
2003-01-01
We present arguments in favor of developing an Open Source option for NASA software; in particular we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality withm the agency.
Masking release by combined spatial and masker-fluctuation effects in the open sound field.
Middlebrooks, John C
2017-12-01
In a complex auditory scene, signals of interest can be distinguished from masking sounds by differences in source location [spatial release from masking (SRM)] and by differences between masker-alone and masker-plus-signal envelopes. This study investigated interactions between those factors in release of masking of 700-Hz tones in an open sound field. Signal and masker sources were colocated in front of the listener, or the signal source was shifted 90° to the side. In Experiment 1, the masker contained a 25-Hz-wide on-signal band plus flanking bands having envelopes that were either mutually uncorrelated or were comodulated. Comodulation masking release (CMR) was largely independent of signal location at a higher masker sound level, but at a lower level CMR was reduced for the lateral signal location. In Experiment 2, a brief signal was positioned at the envelope maximum (peak) or minimum (dip) of a 50-Hz-wide on-signal masker. Masking was released in dip more than in peak conditions only for the 90° signal. Overall, open-field SRM was greater in magnitude than binaural masking release reported in comparable closed-field studies, and envelope-related release was somewhat weaker. Mutual enhancement of masking release by spatial and envelope-related effects tended to increase with increasing masker level.
16 CFR 1211.14 - Instruction manual.
Code of Federal Regulations, 2011 CFR
2011-01-01
... opener. 4. Where possible, install door opener 7 feet or more above the floor. For products requiring an emergency release, mount the emergency release 6 feet above the floor. 5. Do not connect opener to source of... height of 5 feet so small children cannot reach it, and (c) away from all moving parts of the door. 7...
Open Source Initiative Powers Real-Time Data Streams
NASA Technical Reports Server (NTRS)
2014-01-01
Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.
The Core Flight System (cFS) Community: Providing Low Cost Solutions for Small Spacecraft
NASA Technical Reports Server (NTRS)
McComas, David; Wilmot, Jonathan; Cudmore, Alan
2016-01-01
In February 2015 the NASA Goddard Space Flight Center (GSFC) completed the open source release of the entire Core Flight Software (cFS) suite. After the open source release a multi-NASA center Configuration Control Board (CCB) was established that has managed multiple cFS product releases. The cFS was developed and is being maintained in compliance with the NASA Class B software development process requirements and the open source release includes all Class B artifacts. The cFS is currently running on three operational science spacecraft and is being used on multiple spacecraft and instrument development efforts. While the cFS itself is a viable flight software (FSW) solution, we have discovered that the cFS community is a continuous source of innovation and growth that provides products and tools that serve the entire FSW lifecycle and future mission needs. This paper summarizes the current state of the cFS community, the key FSW technologies being pursued, the development/verification tools and opportunities for the small satellite community to become engaged. The cFS is a proven high quality and cost-effective solution for small satellites with constrained budgets.
Open-Source Intelligence in the Czech Military: Knowledge System and Process Design
2002-06-01
in Open-Source Intelligence OSINT, as one of the intelligence disciplines, bears some of the general problems of intelligence " business " OSINT...ADAPTING KNOWLEDGE MANAGEMENT THEORY TO THE CZECH MILITARY INTELLIGENCE Knowledge work is the core business of the military intelligence . As...NAVAL POSTGRADUATE SCHOOL Monterey, California THESIS Approved for public release; distribution is unlimited OPEN-SOURCE INTELLIGENCE IN THE
RMG An Open Source Electronic Structure Code for Multi-Petaflops Calculations
NASA Astrophysics Data System (ADS)
Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy
RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, are used to accelerate convergence. RMG is a cross platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPU's per compute node, have improved performance and scalability, enhanced accuracy and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows and MacIntosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.
Open source tools and toolkits for bioinformatics: significance, and where are we?
Stajich, Jason E; Lapp, Hilmar
2006-09-01
This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.
PyPedal, an open source software package for pedigree analysis
USDA-ARS?s Scientific Manuscript database
The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its used in a number of settings for large and small populations. After substantia...
Weather forecasting with open source software
NASA Astrophysics Data System (ADS)
Rautenhaus, Marc; Dörnbrack, Andreas
2013-04-01
To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
mdFoam+: Advanced molecular dynamics in OpenFOAM
NASA Astrophysics Data System (ADS)
Longshaw, S. M.; Borg, M. K.; Ramisetti, S. B.; Zhang, J.; Lockerby, D. A.; Emerson, D. R.; Reese, J. M.
2018-03-01
This paper introduces mdFoam+, which is an MPI parallelised molecular dynamics (MD) solver implemented entirely within the OpenFOAM software framework. It is open-source and released under the same GNU General Public License (GPL) as OpenFOAM. The source code is released as a publicly open software repository that includes detailed documentation and tutorial cases. Since mdFoam+ is designed entirely within the OpenFOAM C++ object-oriented framework, it inherits a number of key features. The code is designed for extensibility and flexibility, so it is aimed first and foremost as an MD research tool, in which new models and test cases can be developed and tested rapidly. Implementing mdFoam+ in OpenFOAM also enables easier development of hybrid methods that couple MD with continuum-based solvers. Setting up MD cases follows the standard OpenFOAM format, as mdFoam+ also relies upon the OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of an MD simulation is not typical of most OpenFOAM applications. Results show that mdFoam+ compares well to another well-known MD code (e.g. LAMMPS) in terms of benchmark problems, although it also has additional functionality that does not exist in other open-source MD codes.
Crawling The Web for Libre: Selecting, Integrating, Extending and Releasing Open Source Software
NASA Astrophysics Data System (ADS)
Truslove, I.; Duerr, R. E.; Wilcox, H.; Savoie, M.; Lopez, L.; Brandt, M.
2012-12-01
Libre is a project developed by the National Snow and Ice Data Center (NSIDC). Libre is devoted to liberating science data from its traditional constraints of publication, location, and findability. Libre embraces and builds on the notion of making knowledge freely available, and both Creative Commons licensed content and Open Source Software are crucial building blocks for, as well as required deliverable outcomes of the project. One important aspect of the Libre project is to discover cryospheric data published on the internet without prior knowledge of the location or even existence of that data. Inspired by well-known search engines and their underlying web crawling technologies, Libre has explored tools and technologies required to build a search engine tailored to allow users to easily discover geospatial data related to the polar regions. After careful consideration, the Libre team decided to base its web crawling work on the Apache Nutch project (http://nutch.apache.org). Nutch is "an open source web-search software project" written in Java, with good documentation, a significant user base, and an active development community. Nutch was installed and configured to search for the types of data of interest, and the team created plugins to customize the default Nutch behavior to better find and categorize these data feeds. This presentation recounts the Libre team's experiences selecting, using, and extending Nutch, and working with the Nutch user and developer community. We will outline the technical and organizational challenges faced in order to release the project's software as Open Source, and detail the steps actually taken. We distill these experiences into a set of heuristics and recommendations for using, contributing to, and releasing Open Source Software.
Tycho 2: A Proxy Application for Kinetic Transport Sweeps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garrett, Charles Kristopher; Warsa, James S.
2016-09-14
Tycho 2 is a proxy application that implements discrete ordinates (SN) kinetic transport sweeps on unstructured, 3D, tetrahedral meshes. It has been designed to be small and require minimal dependencies to make collaboration and experimentation as easy as possible. Tycho 2 has been released as open source software. The software is currently in a beta release with plans for a stable release (version 1.0) before the end of the year. The code is parallelized via MPI across spatial cells and OpenMP across angles. Currently, several parallelization algorithms are implemented.
Open burning for waste disposal is, in many countries, the dominant source of polychlorinated dibenzodioxins/dibenzofurans and polychlorinated biphenyls (PCDD/PCDF/PCB) release to the environment. To generate emission factors for open burning, experimental pile burns of ca 100 k...
Open source clustering software.
de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S
2004-06-12
We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
Freeing Worldview's development process: Open source everything!
NASA Astrophysics Data System (ADS)
Gunnoe, T.
2016-12-01
Freeing your code and your project are important steps for creating an inviting environment for collaboration, with the added side effect of keeping a good relationship with your users. NASA Worldview's codebase was released with the open source NOSA (NASA Open Source Agreement) license in 2014, but this is only the first step. We also have to free our ideas, empower our users by involving them in the development process, and open channels that lead to the creation of a community project. There are many highly successful examples of Free and Open Source Software (FOSS) projects of which we can take note: the Linux kernel, Debian, GNOME, etc. These projects owe much of their success to having a passionate mix of developers/users with a great community and a common goal in mind. This presentation will describe the scope of this openness and how Worldview plans to move forward with a more community-inclusive approach.
NASA Astrophysics Data System (ADS)
Ferro, Andrea R.; Klepeis, Neil E.; Ott, Wayne R.; Nazaroff, William W.; Hildemann, Lynn M.; Switzer, Paul
Residential interior door positions influence the pollutant concentrations that result from short-term indoor sources, such as cigarettes, candles, and incense. To elucidate this influence, we reviewed past studies and conducted new experiments in three residences: a single-story 714 m 3 ranch-style house, a 510 m 3 two-story split-level house, and a 200 m 3 two-story house. During the experiments, we released sulfur hexafluoride or carbon monoxide tracer gas over short periods (≤30 min) and measured concentrations in the source room and at least one other (receptor) room for various interior door opening positions. We found that closing a door between rooms effectively prevented transport of air pollutants, reducing the average concentration in the receptor room relative to the source room by 57-100% over exposure periods of 1-8 h. When intervening doors were partially or fully open, the reduction in average concentrations ranged from 3% to 99%, varying as a function of door opening width and the distance between source and receptor rooms.
Open Source Clinical NLP - More than Any Single System.
Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.
Impact of methane flow through deformable lake sediments on atmospheric release
NASA Astrophysics Data System (ADS)
Scandella, B.; Juanes, R.
2010-12-01
Methane is a potent greenhouse gas that is generated geothermally and biologically in lake and ocean sediments. Free gas bubbles may escape oxidative traps and contribute more to the atmospheric source than dissolved methane, but the details of the methane release depend on the interactions between the multiple fluid phases and the deformable porous medium. We present a model and supporting laboratory experiments of methane release through “breathing” dynamic flow conduits that open in response to drops in the hydrostatic load on lake sediments, which has been validated against a high-resolution record of free gas flux and hydrostatic pressure in Upper Mystic Lake, MA. In contrast to previous linear elastic fracture mechanics analysis of gassy sediments, the evolution of gas transport in a deformable compliant sediment is presented within the framework of multiphase poroplasticity. Experiments address how strongly the mode and rate of gas flow, captured by our model, impacts the size of bubbles released into the water column. A bubble's size in turn determines how efficiently it transports methane to the atmosphere, and integrating this effect will be critical to improving estimates of the atmospheric methane source from lakes. Cross-sectional schematic of lake sediments showing two venting sites: one open at left and one closed at right. The vertical release of gas bubbles (red) at the open venting site creates a local pressure drop, which drives both bubble formation from the methane-rich pore water (higher concentrations shaded darker red) and lateral advection of dissolved methane (purple arrows). Even as bubbles in the open site escape, those at the closed site remain trapped.
Data Mining Meets HCI: Making Sense of Large Graphs
2012-07-01
graph algo- rithms, won the Open Source Software World Challenge, Silver Award. We have released Pegasus as free , open-source software, downloaded by...METIS [77], spectral clustering [108], and the parameter- free “Cross-associations” (CA) [26]. Belief Propagation can also be used for clus- tering, as...number of tools have been developed to support “ landscape ” views of information. These include WebBook and Web- Forager [23], which use a book metaphor
Open Source Clinical NLP – More than Any Single System
Masanz, James; Pakhomov, Serguei V.; Xu, Hua; Wu, Stephen T.; Chute, Christopher G.; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP’s mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice. PMID:25954581
Forward Field Computation with OpenMEEG
Gramfort, Alexandre; Papadopoulo, Théodore; Olivi, Emmanuel; Clerc, Maureen
2011-01-01
To recover the sources giving rise to electro- and magnetoencephalography in individual measurements, realistic physiological modeling is required, and accurate numerical solutions must be computed. We present OpenMEEG, which solves the electromagnetic forward problem in the quasistatic regime, for head models with piecewise constant conductivity. The core of OpenMEEG consists of the symmetric Boundary Element Method, which is based on an extended Green Representation theorem. OpenMEEG is able to provide lead fields for four different electromagnetic forward problems: Electroencephalography (EEG), Magnetoencephalography (MEG), Electrical Impedance Tomography (EIT), and intracranial electric potentials (IPs). OpenMEEG is open source and multiplatform. It can be used from Python and Matlab in conjunction with toolboxes that solve the inverse problem; its integration within FieldTrip is operational since release 2.0. PMID:21437231
CosmoQuest Transient Tracker: Opensource Photometry & Astrometry software
NASA Astrophysics Data System (ADS)
Myers, Joseph L.; Lehan, Cory; Gay, Pamela; Richardson, Matthew; CosmoQuest Team
2018-01-01
CosmoQuest is moving from online citizen science, to observational astronomy with the creation of Transient Trackers. This open source software is designed to identify asteroids and other transient/variable objects in image sets. Transient Tracker’s features in final form will include: astrometric and photometric solutions, identification of moving/transient objects, identification of variable objects, and lightcurve analysis. In this poster we present our initial, v0.1 release and seek community input.This software builds on the existing NIH funded ImageJ libraries. Creation of this suite of opensource image manipulation routines is lead by Wayne Rasband and is released primarily under the MIT license. In this release, we are building on these libraries to add source identification for point / point-like sources, and to do astrometry. Our materials released under the Apache 2.0 license on github (http://github.com/CosmoQuestTeam) and documentation can be found at http://cosmoquest.org/TransientTracker.
Special population planner 4 : an open source release.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuiper, J.; Metz, W.; Tanzman, E.
2008-01-01
Emergencies like Hurricane Katrina and the recent California wildfires underscore the critical need to meet the complex challenge of planning for individuals with special needs and for institutionalized special populations. People with special needs and special populations often have difficulty responding to emergencies or taking protective actions, and emergency responders may be unaware of their existence and situations during a crisis. Special Population Planner (SPP) is an ArcGIS-based emergency planning system released as an open source product. SPP provides for easy production of maps, reports, and analyses to develop and revise emergency response plans. It includes tools to manage amore » voluntary registry of data for people with special needs, integrated links to plans and documents, tools for response planning and analysis, preformatted reports and maps, and data on locations of special populations, facility and resource characteristics, and contacts. The system can be readily adapted for new settings without programming and is broadly applicable. Full documentation and a demonstration database are included in the release.« less
Development of an open-source web-based intervention for Brazilian smokers - Viva sem Tabaco.
Gomide, H P; Bernardino, H S; Richter, K; Martins, L F; Ronzani, T M
2016-08-02
Web-based interventions for smoking cessation available in Portuguese do not adhere to evidence-based treatment guidelines. Besides, all existing web-based interventions are built on proprietary platforms that developing countries often cannot afford. We aimed to describe the development of "Viva sem Tabaco", an open-source web-based intervention. The development of the intervention included the selection of content from evidence-based guidelines for smoking cessation, the design of the first layout, conduction of 2 focus groups to identify potential features, refinement of the layout based on focus groups and correction of content based on feedback provided by specialists on smoking cessation. At the end, we released the source-code and intervention on the Internet and translated it into Spanish and English. The intervention developed fills gaps in the information available in Portuguese and the lack of open-source interventions for smoking cessation. The open-source licensing format and its translation system may help researchers from different countries deploying evidence-based interventions for smoking cessation.
Open-Source Sequence Clustering Methods Improve the State Of the Art.
Kopylova, Evguenia; Navas-Molina, Jose A; Mercier, Céline; Xu, Zhenjiang Zech; Mahé, Frédéric; He, Yan; Zhou, Hong-Wei; Rognes, Torbjørn; Caporaso, J Gregory; Knight, Rob
2016-01-01
Sequence clustering is a common early step in amplicon-based microbial community analysis, when raw sequencing reads are clustered into operational taxonomic units (OTUs) to reduce the run time of subsequent analysis steps. Here, we evaluated the performance of recently released state-of-the-art open-source clustering software products, namely, OTUCLUST, Swarm, SUMACLUST, and SortMeRNA, against current principal options (UCLUST and USEARCH) in QIIME, hierarchical clustering methods in mothur, and USEARCH's most recent clustering algorithm, UPARSE. All the latest open-source tools showed promising results, reporting up to 60% fewer spurious OTUs than UCLUST, indicating that the underlying clustering algorithm can vastly reduce the number of these derived OTUs. Furthermore, we observed that stringent quality filtering, such as is done in UPARSE, can cause a significant underestimation of species abundance and diversity, leading to incorrect biological results. Swarm, SUMACLUST, and SortMeRNA have been included in the QIIME 1.9.0 release. IMPORTANCE Massive collections of next-generation sequencing data call for fast, accurate, and easily accessible bioinformatics algorithms to perform sequence clustering. A comprehensive benchmark is presented, including open-source tools and the popular USEARCH suite. Simulated, mock, and environmental communities were used to analyze sensitivity, selectivity, species diversity (alpha and beta), and taxonomic composition. The results demonstrate that recent clustering algorithms can significantly improve accuracy and preserve estimated diversity without the application of aggressive filtering. Moreover, these tools are all open source, apply multiple levels of multithreading, and scale to the demands of modern next-generation sequencing data, which is essential for the analysis of massive multidisciplinary studies such as the Earth Microbiome Project (EMP) (J. A. Gilbert, J. K. Jansson, and R. Knight, BMC Biol 12:69, 2014, http://dx.doi.org/10.1186/s12915-014-0069-1).
Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.
Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel
2015-01-01
There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).
The Mercury Monitoring Workshop was developed because mercury contamination, both nationally and internationally, has long been recognized as a growing problem for both humans and ecosystems. Mercury is released to the environment from a variety of human (anthropogenic) sources i...
Clawpack: Building an open source ecosystem for solving hyperbolic PDEs
Iverson, Richard M.; Mandli, K.T.; Ahmadia, Aron J.; Berger, M.J.; Calhoun, Donna; George, David L.; Hadjimichael, Y.; Ketcheson, David I.; Lemoine, Grady L.; LeVeque, Randall J.
2016-01-01
Clawpack is a software package designed to solve nonlinear hyperbolic partial differential equations using high-resolution finite volume methods based on Riemann solvers and limiters. The package includes a number of variants aimed at different applications and user communities. Clawpack has been actively developed as an open source project for over 20 years. The latest major release, Clawpack 5, introduces a number of new features and changes to the code base and a new development model based on GitHub and Git submodules. This article provides a summary of the most significant changes, the rationale behind some of these changes, and a description of our current development model. Clawpack: building an open source ecosystem for solving hyperbolic PDEs.
NASA Astrophysics Data System (ADS)
Zelt, C. A.
2017-12-01
Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site, ideally maintained by someone in a funded position. Perhaps the biggest challenge is the reality that researches who use software, as opposed to develop software, are more attractive university hires because they are more likely to be "big picture" scientists that publish in the highest profile journals, although sometimes the two go together.
Importance of vesicle release stochasticity in neuro-spike communication.
Ramezani, Hamideh; Akan, Ozgur B
2017-07-01
Aim of this paper is proposing a stochastic model for vesicle release process, a part of neuro-spike communication. Hence, we study biological events occurring in this process and use microphysiological simulations to observe functionality of these events. Since the most important source of variability in vesicle release probability is opening of voltage dependent calcium channels (VDCCs) followed by influx of calcium ions through these channels, we propose a stochastic model for this event, while using a deterministic model for other variability sources. To capture the stochasticity of calcium influx to pre-synaptic neuron in our model, we study its statistics and find that it can be modeled by a distribution defined based on Normal and Logistic distributions.
Command & Control in Virtual Environments: Designing a Virtual Environment for Experimentation
2010-06-01
proceed with the research: Second Life/ OpenSim A popular leader in the desktop virtual worlds revolution, for many Second Life has become...prototype environments and adapt them quickly within the world. OpenSim is an open-source community built around upon the Second Life platform...functionality natively present in Second Life and the Opensim platform. With the recent release of Second Life Viewer 2.0, which contains a complete
pyOpenMS: a Python-based interface to the OpenMS mass-spectrometry algorithm library.
Röst, Hannes L; Schmitt, Uwe; Aebersold, Ruedi; Malmström, Lars
2014-01-01
pyOpenMS is an open-source, Python-based interface to the C++ OpenMS library, providing facile access to a feature-rich, open-source algorithm library for MS-based proteomics analysis. It contains Python bindings that allow raw access to the data structures and algorithms implemented in OpenMS, specifically those for file access (mzXML, mzML, TraML, mzIdentML among others), basic signal processing (smoothing, filtering, de-isotoping, and peak-picking) and complex data analysis (including label-free, SILAC, iTRAQ, and SWATH analysis tools). pyOpenMS thus allows fast prototyping and efficient workflow development in a fully interactive manner (using the interactive Python interpreter) and is also ideally suited for researchers not proficient in C++. In addition, our code to wrap a complex C++ library is completely open-source, allowing other projects to create similar bindings with ease. The pyOpenMS framework is freely available at https://pypi.python.org/pypi/pyopenms while the autowrap tool to create Cython code automatically is available at https://pypi.python.org/pypi/autowrap (both released under the 3-clause BSD licence). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
scikit-image: image processing in Python.
van der Walt, Stéfan; Schönberger, Johannes L; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony
2014-01-01
scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org.
Katzman, G L; Morris, D; Lauman, J; Cochella, C; Goede, P; Harnsberger, H R
2001-06-01
To foster a community supported evaluation processes for open-source digital teaching file (DTF) development and maintenance. The mechanisms used to support this process will include standard web browsers, web servers, forum software, and custom additions to the forum software to potentially enable a mediated voting protocol. The web server will also serve as a focal point for beta and release software distribution, which is the desired end-goal of this process. We foresee that www.mdtf.org will provide for widespread distribution of open source DTF software that will include function and interface design decisions from community participation on the website forums.
AIR EMISSIONS FROM SCRAP TIRE COMBUSTION
The report discusses air emissions from two types of scrap tire combustion: uncontrolled and controlled. Uncontrolled sources are open tire fires, which produce many unhealthful products of incomplete combustion and release them directly into the atmosphere. Controlled combustion...
The Future of ECHO: Evaluating Open Source Possibilities
NASA Astrophysics Data System (ADS)
Pilone, D.; Gilman, J.; Baynes, K.; Mitchell, A. E.
2012-12-01
NASA's Earth Observing System ClearingHOuse (ECHO) is a format agnostic metadata repository supporting over 3000 collections and 100M science granules. ECHO exposes FTP and RESTful Data Ingest APIs in addition to both SOAP and RESTful search and order capabilities. Built on top of ECHO is a human facing search and order web application named Reverb. ECHO processes hundreds of orders, tens of thousands of searches, and 1-2M ingest actions each week. As ECHO's holdings, metadata format support, and visibility have increased, the ECHO team has received requests by non-NASA entities for copies of ECHO that can be run locally against their data holdings. ESDIS and the ECHO Team have begun investigations into various deployment and Open Sourcing models that can balance the real constraints faced by the ECHO project with the benefits of providing ECHO capabilities to a broader set of users and providers. This talk will discuss several release and Open Source models being investigated by the ECHO team along with the impacts those models are expected to have on the project. We discuss: - Addressing complex deployment or setup issues for potential users - Models of vetting code contributions - Balancing external (public) user requests versus our primary partners - Preparing project code for public release, including navigating licensing issues related to leveraged libraries - Dealing with non-free project dependencies such as commercial databases - Dealing with sensitive aspects of project code such as database passwords, authentication approaches, security through obscurity, etc. - Ongoing support for the released code including increased testing demands, bug fixes, security fixes, and new features.
The openEHR Java reference implementation project.
Chen, Rong; Klein, Gunnar
2007-01-01
The openEHR foundation has developed an innovative design for interoperable and future-proof Electronic Health Record (EHR) systems based on a dual model approach with a stable reference information model complemented by archetypes for specific clinical purposes.A team from Sweden has implemented all the stable specifications in the Java programming language and donated the source code to the openEHR foundation. It was adopted as the openEHR Java Reference Implementation in March 2005 and released under open source licenses. This encourages early EHR implementation projects around the world and a number of groups have already started to use this code. The early Java implementation experience has also led to the publication of the openEHR Java Implementation Technology Specification. A number of design changes to the specifications and important minor corrections have been directly initiated by the implementation project over the last two years. The Java Implementation has been important for the validation and improvement of the openEHR design specifications and provides building blocks for future EHR systems.
scikit-image: image processing in Python
Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony
2014-01-01
scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921
EMISIONES AL AIRE DE LA COMBUSTION DE LLANTAS USADAS (SPANISH VERSION)
The report discusses air emissions from two types of scrap tire combustion: uncontrolled and controlled. Uncontrolled sources are open tire fires, which produce many unhealthful products of incomplete combustion and release them directly into the atmosphere. Controlled combustion...
Open-Source Wax RepRap 3-D Printer for Rapid Prototyping Paper-Based Microfluidics.
Pearce, J M; Anzalone, N C; Heldt, C L
2016-08-01
The open-source release of self-replicating rapid prototypers (RepRaps) has created a rich opportunity for low-cost distributed digital fabrication of complex 3-D objects such as scientific equipment. For example, 3-D printable reactionware devices offer the opportunity to combine open hardware microfluidic handling with lab-on-a-chip reactionware to radically reduce costs and increase the number and complexity of microfluidic applications. To further drive down the cost while improving the performance of lab-on-a-chip paper-based microfluidic prototyping, this study reports on the development of a RepRap upgrade capable of converting a Prusa Mendel RepRap into a wax 3-D printer for paper-based microfluidic applications. An open-source hardware approach is used to demonstrate a 3-D printable upgrade for the 3-D printer, which combines a heated syringe pump with the RepRap/Arduino 3-D control. The bill of materials, designs, basic assembly, and use instructions are provided, along with a completely free and open-source software tool chain. The open-source hardware device described here accelerates the potential of the nascent field of electrochemical detection combined with paper-based microfluidics by dropping the marginal cost of prototyping to nearly zero while accelerating the turnover between paper-based microfluidic designs. © 2016 Society for Laboratory Automation and Screening.
Hoffman, John M; Noo, Frédéric; Young, Stefano; Hsieh, Scott S; McNitt-Gray, Michael
2018-06-01
To facilitate investigations into the impacts of acquisition and reconstruction parameters on quantitative imaging, radiomics and CAD using CT imaging, we previously released an open source implementation of a conventional weighted filtered backprojection reconstruction called FreeCT_wFBP. Our purpose was to extend that work by providing an open-source implementation of a model-based iterative reconstruction method using coordinate descent optimization, called FreeCT_ICD. Model-based iterative reconstruction offers the potential for substantial radiation dose reduction, but can impose substantial computational processing and storage requirements. FreeCT_ICD is an open source implementation of a model-based iterative reconstruction method that provides a reasonable tradeoff between these requirements. This was accomplished by adapting a previously proposed method that allows the system matrix to be stored with a reasonable memory requirement. The method amounts to describing the attenuation coefficient using rotating slices that follow the helical geometry. In the initially-proposed version, the rotating slices are themselves described using blobs. We have replaced this description by a unique model that relies on tri-linear interpolation together with the principles of Joseph's method. This model offers an improvement in memory requirement while still allowing highly accurate reconstruction for conventional CT geometries. The system matrix is stored column-wise and combined with an iterative coordinate descent (ICD) optimization. The result is FreeCT_ICD, which is a reconstruction program developed on the Linux platform using C++ libraries and the open source GNU GPL v2.0 license. The software is capable of reconstructing raw projection data of helical CT scans. In this work, the software has been described and evaluated by reconstructing datasets exported from a clinical scanner which consisted of an ACR accreditation phantom dataset and a clinical pediatric thoracic scan. For the ACR phantom, image quality was comparable to clinical reconstructions as well as reconstructions using open-source FreeCT_wFBP software. The pediatric thoracic scan also yielded acceptable results. In addition, we did not observe any deleterious impact in image quality associated with the utilization of rotating slices. These evaluations also demonstrated reasonable tradeoffs in storage requirements and computational demands. FreeCT_ICD is an open-source implementation of a model-based iterative reconstruction method that extends the capabilities of previously released open source reconstruction software and provides the ability to perform vendor-independent reconstructions of clinically acquired raw projection data. This implementation represents a reasonable tradeoff between storage and computational requirements and has demonstrated acceptable image quality in both simulated and clinical image datasets. This article is protected by copyright. All rights reserved. This article is protected by copyright. All rights reserved.
NASA's Earth Imagery Service as Open Source Software
NASA Astrophysics Data System (ADS)
De Cesare, C.; Alarcon, C.; Huang, T.; Roberts, J. T.; Rodriguez, J.; Cechini, M. F.; Boller, R. A.; Baynes, K.
2016-12-01
The NASA Global Imagery Browse Service (GIBS) is a software system that provides access to an archive of historical and near-real-time Earth imagery from NASA-supported satellite instruments. The imagery itself is open data, and is accessible via standards such as the Open Geospatial Consortium (OGC)'s Web Map Tile Service (WMTS) protocol. GIBS includes three core software projects: The Imagery Exchange (TIE), OnEarth, and the Meta Raster Format (MRF) project. These projects are developed using a variety of open source software, including: Apache HTTPD, GDAL, Mapserver, Grails, Zookeeper, Eclipse, Maven, git, and Apache Commons. TIE has recently been released for open source, and is now available on GitHub. OnEarth, MRF, and their sub-projects have been on GitHub since 2014, and the MRF project in particular receives many external contributions from the community. Our software has been successful beyond the scope of GIBS: the PO.DAAC State of the Ocean and COVERAGE visualization projects reuse components from OnEarth. The MRF source code has recently been incorporated into GDAL, which is a core library in many widely-used GIS software such as QGIS and GeoServer. This presentation will describe the challenges faced in incorporating open software and open data into GIBS, and also showcase GIBS as a platform on which scientists and the general public can build their own applications.
The role of open-source software in innovation and standardization in radiology.
Erickson, Bradley J; Langer, Steve; Nagy, Paul
2005-11-01
The use of open-source software (OSS), in which developers release the source code to applications they have developed, is popular in the software industry. This is done to allow others to modify and improve software (which may or may not be shared back to the community) and to allow others to learn from the software. Radiology was an early participant in this model, supporting OSS that implemented the ACR-National Electrical Manufacturers Association (now Digital Imaging and Communications in Medicine) standard for medical image communications. In radiology and in other fields, OSS has promoted innovation and the adoption of standards. Popular OSS is of high quality because access to source code allows many people to identify and resolve errors. Open-source software is analogous to the peer-review scientific process: one must be able to see and reproduce results to understand and promote what is shared. The authors emphasize that support for OSS need not threaten vendors; most vendors embrace and benefit from standards. Open-source development does not replace vendors but more clearly defines their roles, typically focusing on areas in which proprietary differentiators benefit customers and on professional services such as implementation planning and service. Continued support for OSS is essential for the success of our field.
Bioclipse: an open source workbench for chemo- and bioinformatics.
Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S
2007-02-22
There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is totally open for both open source plugins as well as commercial ones. Bioclipse is freely available at http://www.bioclipse.net.
Gichoya, Judy W; Kohli, Marc; Ivange, Larry; Schmidt, Teri S; Purkayastha, Saptarshi
2018-05-10
Open-source development can provide a platform for innovation by seeking feedback from community members as well as providing tools and infrastructure to test new standards. Vendors of proprietary systems may delay adoption of new standards until there are sufficient incentives such as legal mandates or financial incentives to encourage/mandate adoption. Moreover, open-source systems in healthcare have been widely adopted in low- and middle-income countries and can be used to bridge gaps that exist in global health radiology. Since 2011, the authors, along with a community of open-source contributors, have worked on developing an open-source radiology information system (RIS) across two communities-OpenMRS and LibreHealth. The main purpose of the RIS is to implement core radiology workflows, on which others can build and test new radiology standards. This work has resulted in three major releases of the system, with current architectural changes driven by changing technology, development of new standards in health and imaging informatics, and changing user needs. At their core, both these communities are focused on building general-purpose EHR systems, but based on user contributions from the fringes, we have been able to create an innovative system that has been used by hospitals and clinics in four different countries. We provide an overview of the history of the LibreHealth RIS, the architecture of the system, overview of standards integration, describe challenges of developing an open-source product, and future directions. Our goal is to attract more participation and involvement to further develop the LibreHealth RIS into an Enterprise Imaging System that can be used in other clinical imaging including pathology and dermatology.
Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L
2018-02-01
Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. simvascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.
Fisher Matrix Preloaded — FISHER4CAST
NASA Astrophysics Data System (ADS)
Bassett, Bruce A.; Fantaye, Yabebal; Hlozek, Renée; Kotze, Jacques
The Fisher Matrix is the backbone of modern cosmological forecasting. We describe the Fisher4Cast software: A general-purpose, easy-to-use, Fisher Matrix framework. It is open source, rigorously designed and tested and includes a Graphical User Interface (GUI) with automated LATEX file creation capability and point-and-click Fisher ellipse generation. Fisher4Cast was designed for ease of extension and, although written in Matlab, is easily portable to open-source alternatives such as Octave and Scilab. Here we use Fisher4Cast to present new 3D and 4D visualizations of the forecasting landscape and to investigate the effects of growth and curvature on future cosmological surveys. Early releases have been available at since mid-2008. The current release of the code is Version 2.2 which is described here. For ease of reference a Quick Start guide and the code used to produce the figures in this paper are included, in the hope that it will be useful to the cosmology and wider scientific communities.
THE BERKELEY DATA ANALYSIS SYSTEM (BDAS): AN OPEN SOURCE PLATFORM FOR BIG DATA ANALYTICS
2017-09-01
Evan Sparks, Oliver Zahn, Michael J. Franklin, David A. Patterson, Saul Perlmutter. Scientific Computing Meets Big Data Technology: An Astronomy ...Processing Astronomy Imagery Using Big Data Technology. IEEE Transaction on Big Data, 2016. Approved for Public Release; Distribution Unlimited. 22 [93
Source replenishment device for vacuum deposition
Hill, Ronald A.
1988-01-01
A material source replenishment device for use with a vacuum deposition apparatus. The source replenishment device comprises an intermittent motion producing gear arrangement disposed within the vacuum deposition chamber. An elongated rod having one end operably connected to the gearing arrangement is provided with a multiarmed head at the opposite end disposed adjacent the heating element of the vacuum deposition apparatus. An inverted U-shaped source material element is releasably attached to the outer end of each arm member whereby said multiarmed head is moved to locate a first of said material elements above said heating element, whereupon said multiarmed head is lowered to engage said material element with the heating element and further lowered to release said material element on the heating element. After vaporization of said material element, second and subsequent material elements may be provided to the heating element without the need for opening the vacuum deposition apparatus to the atmosphere.
Source replenishment device for vacuum deposition
Hill, R.A.
1986-05-15
A material source replenishment device for use with a vacuum deposition apparatus is described. The source replenishment device comprises an intermittent motion producing gear arrangement disposed within the vacuum deposition chamber. An elongated rod having one end operably connected to the gearing arrangement is provided with a multiarmed head at the opposite end disposed adjacent the heating element of the vacuum deposition apparatus. An inverted U-shaped source material element is releasably attached to the outer end of each arm member whereby said multiarmed head is moved to locate a first of said material elements above said heating element, whereupon said multiarmed head is lowered to engage said material element with the heating element and further lowered to release said material element on the heating element. After vaporization of said material element, second and subsequent material elements may be provided to the heating element without the need for opening the vacuum deposition apparatus to the atmosphere.
dsmcFoam+: An OpenFOAM based direct simulation Monte Carlo solver
NASA Astrophysics Data System (ADS)
White, C.; Borg, M. K.; Scanlon, T. J.; Longshaw, S. M.; John, B.; Emerson, D. R.; Reese, J. M.
2018-03-01
dsmcFoam+ is a direct simulation Monte Carlo (DSMC) solver for rarefied gas dynamics, implemented within the OpenFOAM software framework, and parallelised with MPI. It is open-source and released under the GNU General Public License in a publicly available software repository that includes detailed documentation and tutorial DSMC gas flow cases. This release of the code includes many features not found in standard dsmcFoam, such as molecular vibrational and electronic energy modes, chemical reactions, and subsonic pressure boundary conditions. Since dsmcFoam+ is designed entirely within OpenFOAM's C++ object-oriented framework, it benefits from a number of key features: the code emphasises extensibility and flexibility so it is aimed first and foremost as a research tool for DSMC, allowing new models and test cases to be developed and tested rapidly. All DSMC cases are as straightforward as setting up any standard OpenFOAM case, as dsmcFoam+ relies upon the standard OpenFOAM dictionary based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of a DSMC simulation is not typical of most OpenFOAM applications. We show that dsmcFoam+ compares well to other well-known DSMC codes and to analytical solutions in terms of benchmark results.
2017-06-01
for GIFT Cloud, the web -based application version of the Generalized Intelligent Framework for Tutoring (GIFT). GIFT is a modular, open-source...external applications. GIFT is available to users with a GIFT Account at no cost. GIFT Cloud is an implementation of GIFT. This web -based application...section. Approved for public release; distribution is unlimited. 3 3. Requirements for GIFT Cloud GIFT Cloud is accessed via a web browser
Open Data and Open Science for better Research in the Geo and Space Domain
NASA Astrophysics Data System (ADS)
Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.
2015-12-01
Main open data principles had been worked out in the run-up and finally adopted in the Open Data Charta at the G8 summit in Lough Erne, Northern Ireland in June 2013. Important principles are also valid for science data, such as Open Data by Default, Quality and Quantity, Useable by All, Releasing Data for Improved Governance, Releasing Data for Innovation. There is also an explicit relationship to such areas of high values as earth observation, education and geospatial data. The European union implementation plan of the Open Data Charta identifies among other things objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and where appropriate reconciliation across different data sources, implementing software solutionsallowing easy management, publication or visualization of datasets and simplifying clearance of intellectual property rights.Open Science is not just a list of already for a longer time known principles but stands for a lot of initiatives and projects around a better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and collection of data, availability and reuse of scientific data, public accessibility to scientific communication and using of social media to facility scientific collaboration. Some projects are concentrating on open sharing of free and open source software and even further hardware in kind of processing capabilities. In addition question about the mashup of data and publication and an open peer review process are addressed.Following the principles of open data and open science the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects will be presented here. The semantic Web based approach for the mashup is focusing on the design and implementation of a common but still distributed data catalog based on semantical interoperability including the transparent access to data in relational data bases. References: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/207772/Open_Data_Charter.pdfhttp://www.openscience.org/blog/wp-content/uploads/2013/06/OpenSciencePoster.pdf
Open Source Drug Discovery in Practice: A Case Study
Årdal, Christine; Røttingen, John-Arne
2012-01-01
Background Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionally affect the poor. The concept borrows two principle aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to generate a better understanding of how a much-discussed theoretical concept works in practice and may be implemented. Methodology/Principal Findings A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases. CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews of participating members (n = 14), a survey of potential members (n = 61), an analysis of the websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions from mostly Indian researchers and students. Its processes are formal with each task being reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Conclusions/Significance Technically TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project. However, both have enabled high quality research at low cost. The critical success factors appear to be clearly defined entry points, transparency and funding to cover core material costs. PMID:23029588
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neuscamman, Stephanie J.
This section describes ways in which an urban environment can affect the distribution of airborne radiological material. In an urban area, winds at street level are significantly more variable and complex than the prevailing winds above the buildings. Elevated winds may be uniform and representative of the general flow over the surrounding area, but buildings influence the local flow such that the winds below the building heights vary significantly in location and time (Hanna et al 2006). For a release of material near an individual building, the complex effect of the building on the airflow may locally enhance the airmore » concentration of released material in some regions near the building and reduce it in others compared to a release in open terrain. However, the overall effect of an individual building is to induce a rapid enlargement and dilution of an incident plume from an isolated source upwind of the building (Hosker 1984). A plume spreading through an urban environment of multiple buildings will experience enhanced mixing and greater spreading of the contaminant plume in both the vertical and horizontal directions, compared to the same release in open terrain.« less
Cost-Minimization Analysis of Open and Endoscopic Carpal Tunnel Release.
Zhang, Steven; Vora, Molly; Harris, Alex H S; Baker, Laurence; Curtin, Catherine; Kamal, Robin N
2016-12-07
Carpal tunnel release is the most common upper-limb surgical procedure performed annually in the U.S. There are 2 surgical methods of carpal tunnel release: open or endoscopic. Currently, there is no clear clinical or economic evidence supporting the use of one procedure over the other. We completed a cost-minimization analysis of open and endoscopic carpal tunnel release, testing the null hypothesis that there is no difference between the procedures in terms of cost. We conducted a retrospective review using a private-payer and Medicare Advantage database composed of 16 million patient records from 2007 to 2014. The cohort consisted of records with an ICD-9 (International Classification of Diseases, Ninth Revision) diagnosis of carpal tunnel syndrome and a CPT (Current Procedural Terminology) code for carpal tunnel release. Payer fees were used to define cost. We also assessed other associated costs of care, including those of electrodiagnostic studies and occupational therapy. Bivariate comparisons were performed using the chi-square test and the Student t test. Data showed that 86% of the patients underwent open carpal tunnel release. Reimbursement fees for endoscopic release were significantly higher than for open release. Facility fees were responsible for most of the difference between the procedures in reimbursement: facility fees averaged $1,884 for endoscopic release compared with $1,080 for open release (p < 0.0001). Endoscopic release also demonstrated significantly higher physician fees than open release (an average of $555 compared with $428; p < 0.0001). Occupational therapy fees associated with endoscopic release were less than those associated with open release (an average of $237 per session compared with $272; p = 0.07). The total average annual reimbursement per patient for endoscopic release (facility, surgeon, and occupational therapy fees) was significantly higher than for open release ($2,602 compared with $1,751; p < 0.0001). Our data showed that the total average fees per patient for endoscopic release were significantly higher than those for open release, although there currently is no strong evidence supporting better clinical outcomes of either technique. Value-based health-care models that favor delivering high-quality care and improving patient health, while also minimizing costs, may favor open carpal tunnel release.
The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality
NASA Technical Reports Server (NTRS)
Conway, Darrel J.; Hughes, Steven P.
2010-01-01
The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).
Is There a Second Life for Virtual Worlds?
ERIC Educational Resources Information Center
Ramaswami, Rama
2011-01-01
Just a few years ago, virtual worlds were credited with the power to transform the universe. Used since the late 1990s in military and medical applications, virtual worlds first gained mainstream media attention when Linden Lab released Second Life in 2003. While other worlds, including open source environments, have launched since then (examples…
It's Time to Consider Open Source Software
ERIC Educational Resources Information Center
Pfaffman, Jay
2007-01-01
In 1985 Richard Stallman, a computer programmer, released "The GNU Manifesto" in which he proclaimed a golden rule: One must share computer programs. Software vendors required him to agree to license agreements that forbade sharing programs with others, but he refused to "break solidarity" with other computer users whom he assumed also wanted to…
Nitrates in drinking water: relation with intensive livestock production.
Giammarino, M; Quatto, P
2015-01-01
An excess of nitrates causes environmental pollution in receiving water bodies and health risk for human, if contaminated water is source of drinking water. The directive 91/676/ CEE [1] aims to reduce the nitrogen pressure in Europe from agriculture sources and identifies the livestock population as one of the predominant sources of surplus of nutrients that could be released in water and air. Directive is concerned about cattle, sheep, pigs and poultry and their territorial loads, but it does not deal with fish farms. Fish farms effluents may contain pollutants affecting ecosystem water quality. On the basis of multivariate statistical analysis, this paper aims to establish what types of farming affect the presence of nitrates in drinking water in the province of Cuneo, Piedmont, Italy. In this regard, we have used data from official sources on nitrates in drinking water and data Arvet database, concerning the presence of intensive farming in the considered area. For model selection we have employed automatic variable selection algorithm. We have identified fish farms as a major source of nitrogen released into the environment, while pollution from sheep and poultry has appeared negligible. We would like to emphasize the need to include in the "Nitrate Vulnerable Zones" (as defined in Directive 91/676/CEE [1]), all areas where there are intensive farming of fish with open-system type of water use. Besides, aquaculture open-system should be equipped with adequate downstream system of filtering for removing nitrates in the wastewater.
QUATTO, P.
2015-01-01
Summary Introduction. An excess of nitrates causes environmental pollution in receiving water bodies and health risk for human, if contaminated water is source of drinking water. The directive 91/676/ CEE [1] aims to reduce the nitrogen pressure in Europe from agriculture sources and identifies the livestock population as one of the predominant sources of surplus of nutrients that could be released in water and air. Directive is concerned about cattle, sheep, pigs and poultry and their territorial loads, but it does not deal with fish farms. Fish farms effluents may contain pollutants affecting ecosystem water quality. Methods. On the basis of multivariate statistical analysis, this paper aims to establish what types of farming affect the presence of nitrates in drinking water in the province of Cuneo, Piedmont, Italy. In this regard, we have used data from official sources on nitrates in drinking water and data Arvet database, concerning the presence of intensive farming in the considered area. For model selection we have employed automatic variable selection algorithm. Results and discussion. We have identified fish farms as a major source of nitrogen released into the environment, while pollution from sheep and poultry has appeared negligible. We would like to emphasize the need to include in the "Nitrate Vulnerable Zones" (as defined in Directive 91/676/CEE [1]), all areas where there are intensive farming of fish with open-system type of water use. Besides, aquaculture open-system should be equipped with adequate downstream system of filtering for removing nitrates in the wastewater. PMID:26900335
Micro Computer Feedback Report for the Strategic Leader Development Inventory; Source Code
1994-03-01
SEL5 ;exit if error CALL SELZCT SCRZEN ;display select screen JC SEL4 ;no files in directory .------- display the files NOV BX, [BarPos] ;starting...SEL2 ;if not goto next test imp SEL4 ; Ecit SEL2: CUP AL,ODh ;in it a pick ? 3Z SEL3 ;if YES exit loop ------- see if an active control key was...file CALL READCOMFIG eread file into memory JC SEL5 ;exit to main menu CALL OPEN DATA FILE ;is data arailable? SEL4 : CALL RELEASE_ _MDR ;release mom
The use of open data from social media for the creation of 3D georeferenced modeling
NASA Astrophysics Data System (ADS)
Themistocleous, Kyriacos
2016-08-01
There is a great deal of open source video on the internet that is posted by users on social media sites. With the release of low-cost unmanned aerial vehicles, many hobbyists are uploading videos from different locations, especially in remote areas. Using open source data that is available on the internet, this study utilized structure to motion (SfM) as a range imaging technique to estimate 3 dimensional landscape features from 2 dimensional image sequences subtracted from video, applied image distortion correction and geo-referencing. This type of documentation may be necessary for cultural heritage sites that are inaccessible or documentation is difficult, where we can access video from Unmanned Aerial Vehicles (UAV). These 3D models can be viewed using Google Earth, create orthoimage, drawings and create digital terrain modeling for cultural heritage and archaeological purposes in remote or inaccessible areas.
Ro, Kyoung S; Johnson, Melvin H; Varma, Ravi M; Hashmonay, Ram A; Hunt, Patrick
2009-08-01
Improved characterization of distributed emission sources of greenhouse gases such as methane from concentrated animal feeding operations require more accurate methods. One promising method is recently used by the USEPA. It employs a vertical radial plume mapping (VRPM) algorithm using optical remote sensing techniques. We evaluated this method to estimate emission rates from simulated distributed methane sources. A scanning open-path tunable diode laser was used to collect path-integrated concentrations (PICs) along different optical paths on a vertical plane downwind of controlled methane releases. Each cycle consists of 3 ground-level PICs and 2 above ground PICs. Three- to 10-cycle moving averages were used to reconstruct mass equivalent concentration plum maps on the vertical plane. The VRPM algorithm estimated emission rates of methane along with meteorological and PIC data collected concomitantly under different atmospheric stability conditions. The derived emission rates compared well with actual released rates irrespective of atmospheric stability conditions. The maximum error was 22 percent when 3-cycle moving average PICs were used; however, it decreased to 11% when 10-cycle moving average PICs were used. Our validation results suggest that this new VRPM method may be used for improved estimations of greenhouse gas emission from a variety of agricultural sources.
Masanz, James J; Ogren, Philip V; Zheng, Jiaping; Sohn, Sunghwan; Kipper-Schuler, Karin C; Chute, Christopher G
2010-01-01
We aim to build and evaluate an open-source natural language processing system for information extraction from electronic medical record clinical free-text. We describe and evaluate our system, the clinical Text Analysis and Knowledge Extraction System (cTAKES), released open-source at http://www.ohnlp.org. The cTAKES builds on existing open-source technologies—the Unstructured Information Management Architecture framework and OpenNLP natural language processing toolkit. Its components, specifically trained for the clinical domain, create rich linguistic and semantic annotations. Performance of individual components: sentence boundary detector accuracy=0.949; tokenizer accuracy=0.949; part-of-speech tagger accuracy=0.936; shallow parser F-score=0.924; named entity recognizer and system-level evaluation F-score=0.715 for exact and 0.824 for overlapping spans, and accuracy for concept mapping, negation, and status attributes for exact and overlapping spans of 0.957, 0.943, 0.859, and 0.580, 0.939, and 0.839, respectively. Overall performance is discussed against five applications. The cTAKES annotations are the foundation for methods and modules for higher-level semantic processing of clinical free-text. PMID:20819853
gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data
NASA Astrophysics Data System (ADS)
Hummel, Jacob A.
2016-11-01
We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
jTraML: an open source Java API for TraML, the PSI standard for sharing SRM transitions.
Helsens, Kenny; Brusniak, Mi-Youn; Deutsch, Eric; Moritz, Robert L; Martens, Lennart
2011-11-04
We here present jTraML, a Java API for the Proteomics Standards Initiative TraML data standard. The library provides fully functional classes for all elements specified in the TraML XSD document, as well as convenient methods to construct controlled vocabulary-based instances required to define SRM transitions. The use of jTraML is demonstrated via a two-way conversion tool between TraML documents and vendor specific files, facilitating the adoption process of this new community standard. The library is released as open source under the permissive Apache2 license and can be downloaded from http://jtraml.googlecode.com . TraML files can also be converted online at http://iomics.ugent.be/jtraml .
pyNS: an open-source framework for 0D haemodynamic modelling.
Manini, Simone; Antiga, Luca; Botti, Lorenzo; Remuzzi, Andrea
2015-06-01
A number of computational approaches have been proposed for the simulation of haemodynamics and vascular wall dynamics in complex vascular networks. Among them, 0D pulse wave propagation methods allow to efficiently model flow and pressure distributions and wall displacements throughout vascular networks at low computational costs. Although several techniques are documented in literature, the availability of open-source computational tools is still limited. We here present python Network Solver, a modular solver framework for 0D problems released under a BSD license as part of the archToolkit ( http://archtk.github.com ). As an application, we describe patient-specific models of the systemic circulation and detailed upper extremity for use in the prediction of maturation after surgical creation of vascular access for haemodialysis.
Detection of emission sources using passive-remote Fourier transform infrared spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demirgian, J.C.; Macha, S.M.; Darby, S.M.
1995-12-31
The detection and identification of toxic chemicals released in the environment is important for public safety. Passive-remote Fourier transform infrared (FTIR) spectrometers can be used to detect these releases. Their primary advantages are their small size and ease of setup and use. Open-path FTIR spectrometers are used to detect concentrations of pollutants from a fixed frame of reference. These instruments detect plumes, but they are too large and difficult to aim to be used to track a plume to its source. Passive remote FTIR spectrometers contain an interferometer, optics, and a detector. They can be used on tripods and inmore » some cases can be hand-held. A telescope can be added to most units. The authors will discuss the capability of passive-remote FTIR spectrometers to detect the origin of plumes. Low concentration plumes were released using a custom-constructed vaporizer. These plumes were detected with different spectrometers from different distances. Passive-remote spectrometers were able to detect small 10 cm on a side chemical releases at concentration-pathlengths at the low parts per million-meter (ppm-m) level.« less
Detection of emission sources using passive-remote Fourier transform infrared spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demirgian, J.C.; Macha, S.M.; Darby, S.M.
1995-04-01
The detection and identification of toxic chemicals released in the environment is important for public safety. Passive-remote Fourier transform infrared (FTIR) spectrometers can be used to detect these releases. Their primary advantages are their small size and ease of setup and use. Open-path FTIR spectrometers are used to detect concentrations of pollutants from a fixed frame of reference. These instruments detect plumes, but they are too large and difficult to aim to be used to track a plume to its source. Passive remote FTIR spectrometers contain an interferometer, optics, and a detector. They can be used on tripods and inmore » some cases can be hand-held. A telescope can be added to most units. We will discuss the capability of passive-remote FTIR spectrometers to detect the origin of plumes. Low concentration plumes were released using a custom-constructed vaporizer. These plumes were detected with different spectrometers from different distances. Passive-remote spectrometers were able to detect small 10 cm on a side chemical releases at concentration-pathlengths at the low parts per million-meter (ppm-m) level.« less
An Open Source Agenda for Research Linking Text and Image Content Features.
ERIC Educational Resources Information Center
Goodrum, Abby A.; Rorvig, Mark E.; Jeong, Ki-Tai; Suresh, Chitturi
2001-01-01
Proposes methods to utilize image primitives to support term assignment for image classification. Proposes to release code for image analysis in a common tool set for other researchers to use. Of particular focus is the expansion of work by researchers in image indexing to include image content-based feature extraction capabilities in their work.…
46 CFR 108.171 - Class I, Division 1 locations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... locations. The following are Class I, Division 1 locations: (a) An enclosed space that contains any part of the mud circulating system that has an opening into the space and is between the well and final... possible source of gas release. (c) An enclosed space that is on the drill floor, and is not separated by a...
46 CFR 108.171 - Class I, Division 1 locations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... locations. The following are Class I, Division 1 locations: (a) An enclosed space that contains any part of the mud circulating system that has an opening into the space and is between the well and final... possible source of gas release. (c) An enclosed space that is on the drill floor, and is not separated by a...
46 CFR 108.171 - Class I, Division 1 locations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... locations. The following are Class I, Division 1 locations: (a) An enclosed space that contains any part of the mud circulating system that has an opening into the space and is between the well and final... possible source of gas release. (c) An enclosed space that is on the drill floor, and is not separated by a...
46 CFR 108.171 - Class I, Division 1 locations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... locations. The following are Class I, Division 1 locations: (a) An enclosed space that contains any part of the mud circulating system that has an opening into the space and is between the well and final... possible source of gas release. (c) An enclosed space that is on the drill floor, and is not separated by a...
46 CFR 108.171 - Class I, Division 1 locations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... locations. The following are Class I, Division 1 locations: (a) An enclosed space that contains any part of the mud circulating system that has an opening into the space and is between the well and final... possible source of gas release. (c) An enclosed space that is on the drill floor, and is not separated by a...
The RAVE/VERTIGO vertex reconstruction toolkit and framework
NASA Astrophysics Data System (ADS)
Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.
2008-07-01
A detector-independent toolkit for vertex reconstruction (RAVE1) is being developed, along with a standalone framework (VERTIGO2) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.
Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J
2004-09-24
Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.
Douglas, P; Tyrrel, S F; Kinnersley, R P; Whelan, M; Longhurst, P J; Walsh, K; Pollard, S J T; Drew, G H
2016-12-15
Bioaerosols are released in elevated quantities from composting facilities and are associated with negative health effects, although dose-response relationships are not well understood, and require improved exposure classification. Dispersion modelling has great potential to improve exposure classification, but has not yet been extensively used or validated in this context. We present a sensitivity analysis of the ADMS dispersion model specific to input parameter ranges relevant to bioaerosol emissions from open windrow composting. This analysis provides an aid for model calibration by prioritising parameter adjustment and targeting independent parameter estimation. Results showed that predicted exposure was most sensitive to the wet and dry deposition modules and the majority of parameters relating to emission source characteristics, including pollutant emission velocity, source geometry and source height. This research improves understanding of the accuracy of model input data required to provide more reliable exposure predictions. Copyright © 2016. Published by Elsevier Ltd.
Seedorf, Jens; Schmidt, Ralf-Gunther
2017-08-01
Research that investigates bioaerosol emissions from animal transport vehicles (ATVs) and their importance in the spread of harmful airborne agents while the ATVs travel on roads is limited. To investigate the dynamical behaviour of theoretically released particles from a moving ATV, the open-source computational fluid dynamics (CFD) software OpenFOAM was used to calculate the external and internal air flow fields with passive and forced ventilated openings of a common ATV moving at a speed of 80 km/h. In addition to a computed flow rate of approximately 40,000 m 3 /h crossing the interior of the ATV, the visualization of the trajectories has demonstrated distinct patterns of the spatial distribution of potentially released bioaerosols in the vicinity of the ATV. Although the front openings show the highest air flow to the outside, the recirculations of air masses between the interior of the ATV and the atmosphere also occur, which complicate the emission and the dispersion characterizations. To specify the future emission rates of ATVs, a database of bioaerosol concentrations within the ATV is necessary in conjunction with high-performance computing resources to simulate the potential dispersion of bioaerosols in the environment.
Open-source image registration for MRI-TRUS fusion-guided prostate interventions.
Fedorov, Andriy; Khallaghi, Siavash; Sánchez, C Antonio; Lasso, Andras; Fels, Sidney; Tuncali, Kemal; Sugar, Emily Neubauer; Kapur, Tina; Zhang, Chenxi; Wells, William; Nguyen, Paul L; Abolmaesumi, Purang; Tempany, Clare
2015-06-01
We propose two software tools for non-rigid registration of MRI and transrectal ultrasound (TRUS) images of the prostate. Our ultimate goal is to develop an open-source solution to support MRI-TRUS fusion image guidance of prostate interventions, such as targeted biopsy for prostate cancer detection and focal therapy. It is widely hypothesized that image registration is an essential component in such systems. The two non-rigid registration methods are: (1) a deformable registration of the prostate segmentation distance maps with B-spline regularization and (2) a finite element-based deformable registration of the segmentation surfaces in the presence of partial data. We evaluate the methods retrospectively using clinical patient image data collected during standard clinical procedures. Computation time and Target Registration Error (TRE) calculated at the expert-identified anatomical landmarks were used as quantitative measures for the evaluation. The presented image registration tools were capable of completing deformable registration computation within 5 min. Average TRE was approximately 3 mm for both methods, which is comparable with the slice thickness in our MRI data. Both tools are available under nonrestrictive open-source license. We release open-source tools that may be used for registration during MRI-TRUS-guided prostate interventions. Our tools implement novel registration approaches and produce acceptable registration results. We believe these tools will lower the barriers in development and deployment of interventional research solutions and facilitate comparison with similar tools.
The TENCompetence Infrastructure: A Learning Network Implementation
NASA Astrophysics Data System (ADS)
Vogten, Hubert; Martens, Harrie; Lemmers, Ruud
The TENCompetence project developed a first release of a Learning Network infrastructure to support individuals, groups and organisations in professional competence development. This infrastructure Learning Network infrastructure was released as open source to the community thereby allowing users and organisations to use and contribute to this development as they see fit. The infrastructure consists of client applications providing the user experience and server components that provide the services to these clients. These services implement the domain model (Koper 2006) by provisioning the entities of the domain model (see also Sect. 18.4) and henceforth will be referenced as domain entity services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seefeldt, Ben; Sondak, David; Hensinger, David M.
Drekar is an application code that solves partial differential equations for fluids that can be optionally coupled to electromagnetics. Drekar solves low-mach compressible and incompressible computational fluid dynamics (CFD), compressible and incompressible resistive magnetohydrodynamics (MHD), and multiple species plasmas interacting with electromagnetic fields. Drekar discretization technology includes continuous and discontinuous finite element formulations, stabilized finite element formulations, mixed integration finite element bases (nodal, edge, face, volume) and an initial arbitrary Lagrangian Eulerian (ALE) capability. Drekar contains the implementation of the discretized physics and leverages the open source Trilinos project for both parallel solver capabilities and general finite element discretization tools.more » The code will be released open source under a BSD license. The code is used for fundamental research for simulation of fluids and plasmas on high performance computing environments.« less
Kauai Test Facility hazards assessment document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swihart, A
1995-05-01
The Department of Energy Order 55003A requires facility-specific hazards assessment be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the Kauai Test Facility, Barking Sands, Kauai, Hawaii. The Kauai Test Facility`s chemical and radiological inventories were screened according to potential airborne impact to onsite and offsite individuals. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance to themore » Early Severe Health Effects threshold is 4.2 kilometers. The highest emergency classification is a General Emergency at the {open_quotes}Main Complex{close_quotes} and a Site Area Emergency at the Kokole Point Launch Site. The Emergency Planning Zone for the {open_quotes}Main Complex{close_quotes} is 5 kilometers. The Emergency Planning Zone for the Kokole Point Launch Site is the Pacific Missile Range Facility`s site boundary.« less
Suppaphol, Sorasak; Worathanarat, Patarawan; Kawinwongkovit, Viroj; Pittayawutwinit, Preecha
2012-04-01
To compare the operative outcome of carpal tunnel release between limited open carpal tunnel release using direct vision and tunneling technique (group A) with standard open carpal tunnel release (group B). Twenty-eight patients were enrolled in the present study. A single blind randomized control trial study was conducted to compare the postoperative results between group A and B. The study parameters were Levine's symptom severity and functional score, grip and pinch strength, and average two-point discrimination. The postoperative results between two groups were comparable with no statistical significance. Only grip strength at three months follow up was significantly greater in group A than in group B. The limited open carpal tunnel release in the present study is effective comparable to the standard open carpal tunnel release. The others advantage of this technique are better cosmesis and improvement in grip strength at the three months postoperative period.
Assimilation of Long-Range Lightning Data over the Pacific
2011-09-30
convective rainfall analyses over the Pacific, and (iii) to improve marine prediction of cyclogenesis of both tropical and extratropical cyclones through...data over the North Pacific Ocean, refine the relationships between lightning and storm hydrometeor characteristics, and assimilate lightning...unresolved storm -scale areas of deep convection over the data-sparse open oceans. Diabatic heating sources, especially latent heat release in deep
J.J. McDonnell; K. McGuire; P. Aggarwal; K.J. Beven; D. Biondi; G. Destouni; S. Dunn; A. James; J. Kirchner; P. Kraft; S. Lyon; P. Maloszewski; B. Newman; L. Pfister; A. Rinaldo; A. Rodhe; T. Sayama; J. Seibert; K. Solomon; C. Soulsby; M. Stewart; D. Tetzlaff; C. Tobin; P. Troch; M. Weiler; A. Western; A. Wörman; S. Wrede
2010-01-01
The time water spends travelling subsurface through a catchment to the stream network (i.e. the catchment water transit time) fundamentally describes the storage, flow pathway heterogeneity and sources of water in a catchment. The distribution of transit times reflects how catchments retain and release water and solutes that in turn set biogeochemical conditions and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Ruisheng; Chen, Yao; Wang, Bing
The cold-dense plasma is occasionally detected in the solar wind with in situ data, but the source of the cold-dense plasma remains illusive. Interchange reconnections (IRs) between closed fields and nearby open fields are known to contribute to the formation of solar winds. We present a confined filament eruption associated with a puff-like coronal mass ejection (CME) on 2014 December 24. The filament underwent successive activations and finally erupted, due to continuous magnetic flux cancelations and emergences. The confined erupting filament showed a clear untwist motion, and most of the filament material fell back. During the eruption, some tiny blobsmore » escaped from the confined filament body, along newly formed open field lines rooted around the south end of the filament, and some bright plasma flowed from the north end of the filament to remote sites at nearby open fields. The newly formed open field lines shifted southward with multiple branches. The puff-like CME also showed multiple bright fronts and a clear southward shift. All the results indicate an intermittent IR existed between closed fields of the confined erupting filament and nearby open fields, which released a portion of filament material (blobs) to form the puff-like CME. We suggest that the IR provides a possible source of cold-dense plasma in the solar wind.« less
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2013-12-01
A wave of open source big data analytic infrastructure is currently shaping government, private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software e.g., the Apache Hadoop project and its ecosystem of related efforts including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez and Yarn, to name a few; the Berkeley AMPLab stack which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies, offerings from commercial companies building products around these tools e.g., Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities including low latency support/in-memory, versus record oriented file I/O, high availability, support for the Map Reduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products - they are all released under an open source license e.g., Apache2, MIT, BSD, GPL/LGPL, etc.; all thrive in various ecosystems, such as Apache, or Berkeley AMPLab; all are developed collaboratively, and all technologies provide plug in architecture models and methodologies for allowing others to contribute, and participate via various community models. This talk will cover the open source aspects and governance aspects of the aforementioned Big Data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will be by example, using several national deployments and Big Data initiatives stemming from the Administration including DARPA's XDATA program; NASA's CMAC program; NSF's EarthCube and geosciences BigData projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment and understanding.
Application of Open Source Software by the Lunar Mapping and Modeling Project
NASA Astrophysics Data System (ADS)
Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.
2011-12-01
The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen was dependent on the targeted consumer of a given interface. We will discuss our varying experience using open source products; namely Apache OODT, Oracle Berkley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was in part based on our use Federal Geographic Data Committee (FGDC) Metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on top of XML. Apache Solr, an open source search engine, was used to drive our search interface and as way to store references to metadata and data exposed via REST endpoints. As was the case with Apache OODT there was team experience with this component that helped drive this choice. Lastly, OpenSSO, an open source single sign on service, was used to secure and provide access constraints to our REST based services. For this product there was little past experience but given our service based approach seemed to be a natural fit. Given our exposure to open source we will discuss the tradeoffs and benefits received by the choices made. Moreover, we will dive into the context of how the software packages were used and the impact of their design and extensibility had on the construction of the infrastructure. Finally, we will compare our encounter across open source solutions and attributes that can vary the impression one will get. This comprehensive account of our endeavor should aid others in their assessment and use of open source.
Science Objectives of the FOXSI Small Explorer Mission Concept
NASA Astrophysics Data System (ADS)
Shih, Albert Y.; Christe, Steven; Alaoui, Meriem; Allred, Joel C.; Antiochos, Spiro K.; Battaglia, Marina; Buitrago-Casas, Juan Camilo; Caspi, Amir; Dennis, Brian R.; Drake, James; Fleishman, Gregory D.; Gary, Dale E.; Glesener, Lindsay; Grefenstette, Brian; Hannah, Iain; Holman, Gordon D.; Hudson, Hugh S.; Inglis, Andrew R.; Ireland, Jack; Ishikawa, Shin-Nosuke; Jeffrey, Natasha; Klimchuk, James A.; Kontar, Eduard; Krucker, Sam; Longcope, Dana; Musset, Sophie; Nita, Gelu M.; Ramsey, Brian; Ryan, Daniel; Saint-Hilaire, Pascal; Schwartz, Richard A.; Vilmer, Nicole; White, Stephen M.; Wilson-Hodge, Colleen
2016-05-01
Impulsive particle acceleration and plasma heating at the Sun, from the largest solar eruptive events to the smallest flares, are related to fundamental processes throughout the Universe. While there have been significant advances in our understanding of impulsive energy release since the advent of RHESSI observations, there is a clear need for new X-ray observations that can capture the full range of emission in flares (e.g., faint coronal sources near bright chromospheric sources), follow the intricate evolution of energy release and changes in morphology, and search for the signatures of impulsive energy release in even the quiescent Sun. The FOXSI Small Explorer (SMEX) mission concept combines state-of-the-art grazing-incidence focusing optics with pixelated solid-state detectors to provide direct imaging of hard X-rays for the first time on a solar observatory. We present the science objectives of FOXSI and how its capabilities will address and resolve open questions regarding impulsive energy release at the Sun. These questions include: What are the time scales of the processes that accelerate electrons? How do flare-accelerated electrons escape into the heliosphere? What is the energy input of accelerated electrons into the chromosphere, and how is super-heated coronal plasma produced?
RAVE—a Detector-independent vertex reconstruction toolkit
NASA Astrophysics Data System (ADS)
Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian
2007-10-01
A detector-independent toolkit for vertex reconstruction (RAVE ) is being developed, along with a standalone framework (VERTIGO ) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".
Increasing Flight Software Reuse with OpenSatKit
NASA Technical Reports Server (NTRS)
McComas, David C.
2018-01-01
In January 2015 the NASA Goddard Space Flight Center (GSFC) released the Core Flight System (cFS) as open source under the NASA Open Source Agreement (NOSA) license. The cFS is based on flight software (FSW) developed for 12 spacecraft spanning nearly two decades of effort and it can provide about a third of the FSW functionality for a low-earth orbiting scientific spacecraft. The cFS is a FSW framework that is portable, configurable, and extendable using a product line deployment model. However, the components are maintained separately so the user must configure, integrate, and deploy them as a cohesive functional system. This can be very challenging especially for organizations such as universities building cubesats that have minimal experience developing FSW. Supporting universities was one of the primary motivators for releasing the cFS under NOSA. This paper describes the OpenSatKit that was developed to address the cFS deployment challenges and to serve as a cFS training platform for new users. It provides a fully functional out-of-the box software system that includes NASA's cFS, Ball Aerospace's command and control system COSMOS, and a NASA dynamic simulator called 42. The kit is freely available since all of the components have been released as open source. The kit runs on a Linux platform, includes 8 cFS applications, several kit-specific applications, and built in demos illustrating how to use key application features. It also includes the software necessary to port the cFS to a Raspberry Pi and instructions for configuring COSMOS to communicate with the target. All of the demos and test scripts can be rerun unchanged with the cFS running on the Raspberry Pi. The cFS uses a 3-tiered layered architecture including a platform abstraction layer, a Core Flight Executive (cFE) middle layer, and an application layer. Similar to smart phones, the cFS application layer is the key architectural feature for users to extend the FSW functionality to meet their mission-specific requirements. The platform abstraction layer and the cFE layers go a step further than smart phones by providing a platform-agnostic Application Programmer Interface (API) that allows applications to run unchanged on different platforms. OpenSatKit can serve two significant architectural roles that will further help the adoption of the cFS and help create a community of users that can share assets. First, the kit is being enhanced to automate the integration of applications with the goal of creating a virtual cFS "App Store".. Second, a platform certification test suite can be developed that would allow users to verify the port of the cFS to a new platform. This paper will describe the current state of these efforts and future plans.
NASA Astrophysics Data System (ADS)
Mergili, Martin; Fischer, Jan-Thomas; Krenn, Julia; Pudasaini, Shiva P.
2017-02-01
r.avaflow represents an innovative open-source computational tool for routing rapid mass flows, avalanches, or process chains from a defined release area down an arbitrary topography to a deposition area. In contrast to most existing computational tools, r.avaflow (i) employs a two-phase, interacting solid and fluid mixture model (Pudasaini, 2012); (ii) is suitable for modelling more or less complex process chains and interactions; (iii) explicitly considers both entrainment and stopping with deposition, i.e. the change of the basal topography; (iv) allows for the definition of multiple release masses, and/or hydrographs; and (v) serves with built-in functionalities for validation, parameter optimization, and sensitivity analysis. r.avaflow is freely available as a raster module of the GRASS GIS software, employing the programming languages Python and C along with the statistical software R. We exemplify the functionalities of r.avaflow by means of two sets of computational experiments: (1) generic process chains consisting in bulk mass and hydrograph release into a reservoir with entrainment of the dam and impact downstream; (2) the prehistoric Acheron rock avalanche, New Zealand. The simulation results are generally plausible for (1) and, after the optimization of two key parameters, reasonably in line with the corresponding observations for (2). However, we identify some potential to enhance the analytic and numerical concepts. Further, thorough parameter studies will be necessary in order to make r.avaflow fit for reliable forward simulations of possible future mass flow events.
VIVO Open Source Software: Connecting Facilities to Promote Discovery and Further Research.
NASA Astrophysics Data System (ADS)
Gross, M. B.; Rowan, L. R.; Mayernik, M. S.; Daniels, M. D.; Stott, D.; Allison, J.; Maull, K. E.; Krafft, D. B.; Khan, H.
2016-12-01
EarthCollab (http://earthcube.org/group/earthcollab), a National Science Foundation (NSF) EarthCube Building Block project, has adapted an open source semantic web application, VIVO, for use within the earth science domain. EarthCollab is a partnership between UNAVCO, an NSF facility supporting research through geodetic services, the Earth Observing Laboratory (EOL) at the National Center for Atmospheric Research (NCAR), and Cornell University, where VIVO was created to highlight the scholarly output of researchers at universities. Two public sites have been released: Connect UNAVCO (connect.unavco.org) and Arctic Data Connects (vivo.eol.ucar.edu). The core VIVO software and ontology have been extended to work better with concepts necessary for capturing work within UNAVCO's and EOL's province such as principal investigators for continuous GPS/GNSS stations at UNAVCO and keywords describing cruise datasets at EOL. The sites increase discoverability of large and diverse data archives by linking data with people, research, and field projects. Disambiguation is a major challenge when using VIVO and open data when "anyone can say anything about anything." Concepts and controlled vocabularies help to build consistent and easily searchable connections within VIVO. We use aspects of subject heading services such as FAST and LOC, as well as AGU and GSA fields of research and subject areas to reveal connections, especially with VIVO instances at other institutions. VIVO works effectively with persistent IDs and the projects strive to utilize publication and data DOIs, ORCIDs for people, and ISNI and GRID for organizations. ORCID, an open source project, is very useful for disambiguation and unlike other identifier systems for people developed by publishers, makes public data available via an API. VIVO utilizes Solr and Freemarker, which are open source search engine and templating technologies, respectively. Additionally, a handful of popular open source libraries and applications are being used in the project such as D3.js, jQuery, Leaflet, and Elasticsearch. Our implementation of these open source projects within VIVO is available for adaptation by other institutions using VIVO via GitHub (git.io/vG9AJ).
Fall 2014 SEI Research Review Edge-Enabled Tactical Systems (EETS)
2014-10-29
Effective communicate and reasoning despite connectivity issues • More generally, how to make programming distributed algorithms with extensible...distributed collaboration in VREP simulations for 5-12 quadcopters and ground robots • Open-source middleware and algorithms released to community...Integration into CMU Drone-RK quadcopter and Platypus autonomous boat platforms • Presentations at DARPA (CODE), AFRL C4I Workshop, and AFRL Eglin
Large CO2 and CH4 release from a flooded formerly drained fen
NASA Astrophysics Data System (ADS)
Sachs, T.; Franz, D.; Koebsch, F.; Larmanou, E.; Augustin, J.
2016-12-01
Drained peatlands are usually strong carbon dioxide (CO2) sources. In Germany, up to 4.5 % of the national CO2 emissions are estimated to be released from agriculturally used peatlands and for some peatland-rich northern states, such as Mecklenburg-Western Pomerania, this share increases to about 20%. Reducing this CO2 source and restoring the peatlands' natural carbon sink is one objective of large-scale nature protection and restoration measures, in which 37.000 ha of drained and degraded peatlands in Mecklenburg-Western Pomerania are slated for rewetting. It is well known, however, that in the initial phase of rewetting, a reduction of the CO2 source strength is usually accompanied by an increase in CH4 emissions. Thus, whether and when the intended effects of rewetting with regard to greenhouse gases are achieved, depends on the balance of CO2 and CH4 fluxes and on the duration of the initial CH4 emission phase. In 2013, a new Fluxnet site went online at a flooded formerly drained river valley fen site near Zarnekow, NE Germany (DE-Zrk), to investigate the combined CO2 and CH4 dynamics at such a heavily degraded and rewetted peatland. The site is dominated by open water with submerged and floating vegetation and surrounding Typha latifolia.Nine year after rewetting, we found large CH4 emissions of 53 g CH4 m-2 a-1 from the open water area, which are 4-fold higher than from the surrounding vegetation zone (13 g CH4 m-2 a-1). Surprisingly, both the open water and the vegetated area were net CO2 sources of 158 and 750 g CO2 m-2 a-1, respectively. Unusual meteorological conditions with a warm and dry summer and a mild winter might have facilitated high respiration rates, particularly from temporally non-inundated organic mud in the vegetation zone.
CAMPAIGN: an open-source library of GPU-accelerated data clustering algorithms.
Kohlhoff, Kai J; Sosnick, Marc H; Hsu, William T; Pande, Vijay S; Altman, Russ B
2011-08-15
Data clustering techniques are an essential component of a good data analysis toolbox. Many current bioinformatics applications are inherently compute-intense and work with very large datasets. Sequential algorithms are inadequate for providing the necessary performance. For this reason, we have created Clustering Algorithms for Massively Parallel Architectures, Including GPU Nodes (CAMPAIGN), a central resource for data clustering algorithms and tools that are implemented specifically for execution on massively parallel processing architectures. CAMPAIGN is a library of data clustering algorithms and tools, written in 'C for CUDA' for Nvidia GPUs. The library provides up to two orders of magnitude speed-up over respective CPU-based clustering algorithms and is intended as an open-source resource. New modules from the community will be accepted into the library and the layout of it is such that it can easily be extended to promising future platforms such as OpenCL. Releases of the CAMPAIGN library are freely available for download under the LGPL from https://simtk.org/home/campaign. Source code can also be obtained through anonymous subversion access as described on https://simtk.org/scm/?group_id=453. kjk33@cantab.net.
The Herschel-ATLAS data release 1 - I. Maps, catalogues and number counts
NASA Astrophysics Data System (ADS)
Valiante, E.; Smith, M. W. L.; Eales, S.; Maddox, S. J.; Ibar, E.; Hopwood, R.; Dunne, L.; Cigan, P. J.; Dye, S.; Pascale, E.; Rigby, E. E.; Bourne, N.; Furlanetto, C.; Ivison, R. J.
2016-11-01
We present the first major data release of the largest single key-project in area carried out in open time with the Herschel Space Observatory. The Herschel Astrophysical Terahertz Large Area Survey (H-ATLAS) is a survey of 600 deg2 in five photometric bands - 100, 160, 250, 350 and 500 μm - with the Photoconductor Array Camera and Spectrometer and Spectral and Photometric Imaging Receiver (SPIRE) cameras. In this paper and the companion Paper II, we present the survey of three fields on the celestial equator, covering a total area of 161.6 deg2 and previously observed in the Galaxy and Mass Assembly (GAMA) spectroscopic survey. This paper describes the Herschel images and catalogues of the sources detected on the SPIRE 250 μm images. The 1σ noise for source detection, including both confusion and instrumental noise, is 7.4, 9.4 and 10.2 mJy at 250, 350 and 500 μm. Our catalogue includes 120 230 sources in total, with 113 995, 46 209 and 11 011 sources detected at >4σ at 250, 350 and 500 μm. The catalogue contains detections at >3σ at 100 and 160 μm for 4650 and 5685 sources, and the typical noise at these wavelengths is 44 and 49 mJy. We include estimates of the completeness of the survey and of the effects of flux bias and also describe a novel method for determining the true source counts. The H-ATLAS source counts are very similar to the source counts from the deeper HerMES survey at 250 and 350 μm, with a small difference at 500 μm. Appendix A provides a quick start in using the released data sets, including instructions and cautions on how to use them.
Marsili, Simone; Signorini, Giorgio Federico; Chelli, Riccardo; Marchi, Massimo; Procacci, Piero
2010-04-15
We present the new release of the ORAC engine (Procacci et al., Comput Chem 1997, 18, 1834), a FORTRAN suite to simulate complex biosystems at the atomistic level. The previous release of the ORAC code included multiple time steps integration, smooth particle mesh Ewald method, constant pressure and constant temperature simulations. The present release has been supplemented with the most advanced techniques for enhanced sampling in atomistic systems including replica exchange with solute tempering, metadynamics and steered molecular dynamics. All these computational technologies have been implemented for parallel architectures using the standard MPI communication protocol. ORAC is an open-source program distributed free of charge under the GNU general public license (GPL) at http://www.chim.unifi.it/orac. 2009 Wiley Periodicals, Inc.
Experience of public procurement of Open Compute servers
NASA Astrophysics Data System (ADS)
Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony
2015-12-01
The Open Compute Project. OCP (http://www.opencompute.org/). was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware. with the goal to develop servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large scale installation. One objective is to evaluate if the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement. which started in September 2014 and involved the Request for information (RFI) to qualify bidders and Request for Tender (RFT).
Hart, Reece K; Rico, Rudolph; Hare, Emily; Garcia, John; Westbrook, Jody; Fusaro, Vincent A
2015-01-15
Biological sequence variants are commonly represented in scientific literature, clinical reports and databases of variation using the mutation nomenclature guidelines endorsed by the Human Genome Variation Society (HGVS). Despite the widespread use of the standard, no freely available and comprehensive programming libraries are available. Here we report an open-source and easy-to-use Python library that facilitates the parsing, manipulation, formatting and validation of variants according to the HGVS specification. The current implementation focuses on the subset of the HGVS recommendations that precisely describe sequence-level variation relevant to the application of high-throughput sequencing to clinical diagnostics. The package is released under the Apache 2.0 open-source license. Source code, documentation and issue tracking are available at http://bitbucket.org/hgvs/hgvs/. Python packages are available at PyPI (https://pypi.python.org/pypi/hgvs). Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
Hart, Reece K.; Rico, Rudolph; Hare, Emily; Garcia, John; Westbrook, Jody; Fusaro, Vincent A.
2015-01-01
Summary: Biological sequence variants are commonly represented in scientific literature, clinical reports and databases of variation using the mutation nomenclature guidelines endorsed by the Human Genome Variation Society (HGVS). Despite the widespread use of the standard, no freely available and comprehensive programming libraries are available. Here we report an open-source and easy-to-use Python library that facilitates the parsing, manipulation, formatting and validation of variants according to the HGVS specification. The current implementation focuses on the subset of the HGVS recommendations that precisely describe sequence-level variation relevant to the application of high-throughput sequencing to clinical diagnostics. Availability and implementation: The package is released under the Apache 2.0 open-source license. Source code, documentation and issue tracking are available at http://bitbucket.org/hgvs/hgvs/. Python packages are available at PyPI (https://pypi.python.org/pypi/hgvs). Contact: reecehart@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25273102
Continuous integration for concurrent MOOSE framework and application development on GitHub
Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...
2015-11-20
For the past several years, Idaho National Laboratory’s MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repos- itories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project’s development roadmap. Severalmore » specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.« less
Continuous integration for concurrent MOOSE framework and application development on GitHub
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.
For the past several years, Idaho National Laboratory’s MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repos- itories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project’s development roadmap. Severalmore » specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.« less
Li, Ting-Yu; Zhou, Jun-Feng; Wu, Chen-Chou; Bao, Lian-Jun; Shi, Lei; Zeng, Eddy Y
2018-04-17
Primitive processing of e-waste potentially releases abundant organic contaminants to the environment, but the magnitudes and mechanisms remain to be adequately addressed. We conducted thermal treatment and open burning of typical e-wastes, that is, plastics and printed circuit boards. Emission factors of the sum of 39 polybrominated diphenyl ethers (∑ 39 PBDE) were 817-1.60 × 10 5 ng g -1 in thermal treatment and nondetected-9.14 × 10 4 ng g -1 , in open burning. Airborne particles (87%) were the main carriers of PBDEs, followed by residual ashes (13%) and gaseous constituents (0.3%), in thermal treatment, while they were 30%, 43% and 27% in open burning. The output-input mass ratios of ∑ 39 PBDE were 0.12-3.76 in thermal treatment and 0-0.16 in open burning. All PBDEs were largely affiliated with fine particles, with geometric mean diameters at 0.61-0.83 μm in thermal degradation and 0.57-1.16 μm in open burning from plastic casings, and 0.44-0.56 and nondetected- 0.55 μm, from printed circuit boards. Evaporation and reabsorption may be the main emission mechanisms for lightly brominated BDEs, but heavily brominated BDEs tend to affiliate with particles from heating or combustion. The different size distributions of particulate PBDEs in emission sources and adjacent air implicated a noteworthy redisposition process during atmospheric dispersal.
Davies, Holly; Delistraty, Damon
2016-02-01
Polychlorinated biphenyls (PCBs) are ubiquitously distributed in the environment and produce multiple adverse effects in humans and wildlife. As a result, the purpose of our study was to characterize PCB sources in anthropogenic materials and releases to the environment in Washington State (USA) in order to formulate recommendations to reduce PCB exposures. Methods included review of relevant publications (e.g., open literature, industry studies and reports, federal and state government databases), scaling of PCB sources from national or county estimates to state estimates, and communication with industry associations and private and public utilities. Recognizing high associated uncertainty due to incomplete data, we strived to provide central tendency estimates for PCB sources. In terms of mass (high to low), PCB sources include lamp ballasts, caulk, small capacitors, large capacitors, and transformers. For perspective, these sources (200,000-500,000 kg) overwhelm PCBs estimated to reside in the Puget Sound ecosystem (1500 kg). Annual releases of PCBs to the environment (high to low) are attributed to lamp ballasts (400-1500 kg), inadvertent generation by industrial processes (900 kg), caulk (160 kg), small capacitors (3-150 kg), large capacitors (10-80 kg), pigments and dyes (0.02-31 kg), and transformers (<2 kg). Recommendations to characterize the extent of PCB distribution and decrease exposures include assessment of PCBs in buildings (e.g., schools) and replacement of these materials, development of Best Management Practices (BMPs) to contain PCBs, reduction of inadvertent generation of PCBs in consumer products, expansion of environmental monitoring and public education, and research to identify specific PCB congener profiles in human tissues.
NASA Astrophysics Data System (ADS)
Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan
2015-04-01
Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or use as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
Fear and Trembling in Connecticut: (Or "How I Learned to Stop Worrying and Love Open Source")
ERIC Educational Resources Information Center
Terlaga, Amy
2010-01-01
In March 2007, SirsiDynix notified its Horizon and classic Dynix customers that it would not be releasing Horizon 8.0 in favor of developing its Unicorn software. As vice president/president-elect of SirsiDynix's Horizon/Dynix user group, the author was one of the first ones to be notified of this abrupt change in company strategy. The news sent…
World Wind Tools Reveal Environmental Change
NASA Technical Reports Server (NTRS)
2012-01-01
Originally developed under NASA's Learning Technologies program as a tool to engage and inspire students, World Wind software was released under the NASA Open Source Agreement license. Honolulu, Hawaii based Intelesense Technologies is one of the companies currently making use of the technology for environmental, public health, and other monitoring applications for nonprofit organizations and Government agencies. The company saved about $1 million in development costs by using the NASA software.
Sykes, Melissa L.; Jones, Amy J.; Shelper, Todd B.; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E.
2017-01-01
ABSTRACT Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection is comprised of compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. PMID:28674055
Duffy, Sandra; Sykes, Melissa L; Jones, Amy J; Shelper, Todd B; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E; Avery, Vicky M
2017-09-01
Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection is comprised of compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. Copyright © 2017 Duffy et al.
Plenario: An Open Data Discovery and Exploration Platform for Urban Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Catlett, Charlie; Malik, Tanu; Goldstein, Brett J.
2014-12-01
The past decade has seen the widespread release of open data concerning city services, conditions, and activities by government bodies and public institutions of all sizes. Hundreds of open data portals now host thousands of datasets of many different types. These new data sources represent enormous po- tential for improved understanding of urban dynamics and processes—and, ultimately, for more livable, efficient, and prosperous communities. However, those who seek to realize this potential quickly discover that discovering and applying those data relevant to any particular question can be extraordinarily dif- ficult, due to decentralized storage, heterogeneous formats, and poor documentation. Inmore » this context, we introduce Plenario, a platform designed to automating time-consuming tasks associated with the discovery, exploration, and application of open city data—and, in so doing, reduce barriers to data use for researchers, policymakers, service providers, journalists, and members of the general public. Key innovations include a geospatial data warehouse that allows data from many sources to be registered into a common spatial and temporal frame; simple and intuitive interfaces that permit rapid discovery and exploration of data subsets pertaining to a particular area and time, regardless of type and source; easy export of such data subsets for further analysis; a user-configurable data ingest framework for automated importing and periodic updating of new datasets into the data warehouse; cloud hosting for elastic scaling and rapid creation of new Plenario instances; and an open source implementation to enable community contributions. We describe here the architecture and implementation of the Plenario platform, discuss lessons learned from its use by several communities, and outline plans for future work.« less
NASA Astrophysics Data System (ADS)
Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.
2016-01-01
The paper is aimed to present the !CHAOS open source project aimed to develop a prototype of a national private Cloud Computing infrastructure, devoted to accelerator control systems and large experiments of High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of aaabstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.
Low-Cost energy contraption design using playground seesaw
NASA Astrophysics Data System (ADS)
Banlawe, I. A. P.; Acosta, N. J. E. L.
2017-05-01
The study was conducted at Western Philippines University, San Juan, Aborlan, Palawan. The study used the mechanical motion of playground seesaw as a means to produce electrical energy. The study aimed to design a low-cost prototype energy contraption using playground seesaw using locally available and recycled materials, to measure the voltage, current and power outputs produced at different situations and estimate the cost of the prototype. Using principle of pneumatics, two hand air pumps were employed on the two end sides of the playground seesaw and the mechanical motion of the seesaw up and down produces air that is used to rotate a DC motor to produce electrical energy. This electricity can be utilized for powering basic or low-power appliances. There were two trials of testing, each trial tests the different pressure level of the air tank and tests the opening of on-off valve (Full open and half open) when the compressed air was released. Results showed that all pressure level at full open produced significantly higher voltage, than the half open. However, the mean values of the current and power produced in all pressure level at full and half open have negligible variation. These results signify that the energy contraption using playground seesaw is an alternative viable source of electrical energy in the playgrounds, parks and other places and can be used as an auxiliary or back-up source for electricity.
NASA Astrophysics Data System (ADS)
Smith, J. A.; Peter, D. B.; Tromp, J.; Komatitsch, D.; Lefebvre, M. P.
2015-12-01
We present both SPECFEM3D_Cartesian and SPECFEM3D_GLOBE open-source codes, representing high-performance numerical wave solvers simulating seismic wave propagation for local-, regional-, and global-scale application. These codes are suitable for both forward propagation in complex media and tomographic imaging. Both solvers compute highly accurate seismic wave fields using the continuous Galerkin spectral-element method on unstructured meshes. Lateral variations in compressional- and shear-wave speeds, density, as well as 3D attenuation Q models, topography and fluid-solid coupling are all readily included in both codes. For global simulations, effects due to rotation, ellipticity, the oceans, 3D crustal models, and self-gravitation are additionally included. Both packages provide forward and adjoint functionality suitable for adjoint tomography on high-performance computing architectures. We highlight the most recent release of the global version which includes improved performance, simultaneous MPI runs, OpenCL and CUDA support via an automatic source-to-source transformation library (BOAST), parallel I/O readers and writers for databases using ADIOS and seismograms using the recently developed Adaptable Seismic Data Format (ASDF) with built-in provenance. This makes our spectral-element solvers current state-of-the-art, open-source community codes for high-performance seismic wave propagation on arbitrarily complex 3D models. Together with these solvers, we provide full-waveform inversion tools to image the Earth's interior at unprecedented resolution.
Increasing Flight Software Reuse with OpenSatKit
NASA Technical Reports Server (NTRS)
McComas, David
2018-01-01
In January 2015 the NASA Goddard Space Flight Center (GSFC) released the Core Flight System (cFS) as open source under the NASA Open Source Agreement (NOSA) license. The cFS is based on flight software (FSW) developed for 12 spacecraft spanning nearly two decades of effort and it can provide about a third of the FSW functionality for a low-earth orbiting scientific spacecraft. The cFS is a FSW framework that is portable, configurable, and extendable using a product line deployment model. However, the components are maintained separately so the user must configure, integrate, and deploy them as a cohesive functional system. This can be very challenging especially for organizations such as universities building cubesats that have minimal experience developing FSW. Supporting universities was one of the primary motivators for releasing the cFS under NOSA. This paper describes the OpenSatKit that was developed to address the cFS deployment challenges and to serve as a cFS training platform for new users. It provides a fully functional out-of-the box software system that includes NASA's cFS, Ball Aerospaceâ€"TM"s command and control system COSMOS, and a NASA dynamic simulator called 42. The kit is freely available since all of the components have been released as open source. The kit runs on a Linux platform, includes 8 cFS applications, several kit-specific applications, and built in demos illustrating how to use key application features. It also includes the software necessary to port the cFS to a Raspberry Pi and instructions for configuring COSMOS to communicate with the target. All of the demos and test scripts can be rerun unchanged with the cFS running on the Raspberry Pi. The cFS uses a 3-tiered layered architecture including a platform abstraction layer, a Core Flight Executive (cFE) middle layer, and an application layer. Similar to smart phones, the cFS application layer is the key architectural feature for userâ€"TM"s to extend the FSW functionality to meet their mission-specific requirements. The platform abstraction layer and the cFE layers go a step further than smart phones by providing a platform-agnostic Application Programmer Interface (API) that allows applications to run unchanged on different platforms. OpenSatKit can serve two significant architectural roles that will further help the adoption of the cFS and help create a community of users that can share assets. First, the kit is being enhanced to automate the integration of applications with the goal of creating a virtual cFS 'App Store'. Second, a platform certification test suite can be developed that would allow users to verify the port of the cFS to a new platform. This paper will describe the current state of these efforts and future plans.
Autocorrel I: A Neural Network Based Network Event Correlation Approach
2005-05-01
which concern any component of the network. 2.1.1 Existing Intrusion Detection Systems EMERALD [8] is a distributed, scalable, hierarchal, customizable...writing this paper, the updaters of this system had not released their correlation unit to the public. EMERALD ex- plicitly divides statistical analysis... EMERALD , NetSTAT is scalable and composi- ble. QuidSCOR [12] is an open-source IDS, though it requires a subscription from its publisher, Qualys Inc
Summary and evaluation of the Strategic Defense Initiative Space Power Architecture Study
NASA Technical Reports Server (NTRS)
Edenburn, M. (Editor); Smith, J. M. (Editor)
1989-01-01
The Space Power Architecture Study (SPAS) identified and evaluated power subsystem options for multimegawatt electric (MMWE) space based weapons and surveillance platforms for the Strategic Defense Initiative (SDI) applications. Steady state requirements of less than 1 MMWE are adequately covered by the SP-100 nuclear space power program and hence were not addressed in the SPAS. Four steady state power systems less than 1 MMWE were investigated with little difference between them on a mass basis. The majority of the burst power systems utilized H(2) from the weapons and were either closed (no effluent), open (effluent release) or steady state with storage (no effluent). Closed systems used nuclear or combustion heat source with thermionic, Rankine, turboalternator, fuel cell and battery conversion devices. Open systems included nuclear or combustion heat sources using turboalternator, magnetohydrodynamic, fuel cell or battery power conversion devices. The steady state systems with storage used the SP-100 or Star-M reactors as energy sources and flywheels, fuel cells or batteries to store energy for burst applications. As with other studies the open systems are by far the lightest, most compact and simplist (most reliable) systems. However, unlike other studies the SPAS studied potential platform operational problems caused by effluents or vibration.
Open Babel: An open chemical toolbox
2011-01-01
Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
The elementary events of Ca2+ release elicited by membrane depolarization in mammalian muscle.
Csernoch, L; Zhou, J; Stern, M D; Brum, G; Ríos, E
2004-05-15
Cytosolic [Ca(2+)] transients elicited by voltage clamp depolarization were examined by confocal line scanning of rat skeletal muscle fibres. Ca(2+) sparks were observed in the fibres' membrane-permeabilized ends, but not in responses to voltage in the membrane-intact area. Elementary events of the depolarization-evoked response could be separated either at low voltages (near -50 mV) or at -20 mV in partially inactivated cells. These were of lower amplitude, narrower and of much longer duration than sparks, similar to 'lone embers' observed in the permeabilized segments. Their average amplitude was 0.19 and spatial half-width 1.3 microm. Other parameters depended on voltage. At -50 mV average duration was 111 ms and latency 185 ms. At -20 mV duration was 203 ms and latency 24 ms. Ca(2+) release current, calculated on an average of events, was nearly steady at 0.5-0.6 pA. Accordingly, simulations of the fluorescence event elicited by a subresolution source of 0.5 pA open for 100 ms had morphology similar to the experimental average. Because 0.5 pA is approximately the current measured for single RyR channels in physiological conditions, the elementary fluorescence events in rat muscle probably reflect opening of a single RyR channel. A reconstruction of cell-averaged release flux at -20 mV based on the observed distribution of latencies and calculated elementary release had qualitatively correct but slower kinetics than the release flux in prior whole-cell measurements. The qualitative agreement indicates that global Ca(2+) release flux results from summation of these discrete events. The quantitative discrepancies suggest that the partial inactivation strategy may lead to events of greater duration than those occurring physiologically in fully polarized cells.
Toirac, Alexander; Giugale, Juan M; Fowler, John R
2017-05-01
Endoscopic cubital tunnel release has been proposed as an alternative to open in situ release. However, it is difficult to analyze outcomes after endoscopic release, as only a few small case series exist. The electronic databases of PubMed (1960-June 2014) were systematically screened for studies related to endoscopic cubital tunnel release or open in situ cubital tunnel release. Baseline characteristics, clinical scores, and complication rates were abstracted. The binary outcome was defined as rate of excellent/good response versus fair/poor. Complications were recorded into 3 categories: wound problems, persistent ulnar nerve symptoms, and other. We included 8 articles that reported the clinical outcomes after surgical intervention including a total of 494 patients (344 endoscopic, 150 open in situ). The pooled rate of excellent/good was 92.0% (88.8%-95.2%) for endoscopic and 82.7% (76.15%-89.2%) for open. We identified 18 articles that detailed complications including a total of 1108 patients (691 endoscopic, 417 open). The 4 articles that listed complication rates for both endoscopic and open techniques were analyzed and showed a pooled odds ratio of 0.280 (95% confidence interval, 0.125-0.625), indicating that endoscopic patients have reduced odds of complications. The results of this systematic review suggest that there is a difference in clinical outcomes between the open in situ and endoscopic cubital tunnel release, with the endoscopic technique being superior in regard to both complication rates along with patient satisfaction.
Leveraging Open Standards and Technologies to Enhance Community Access to Earth Science Lidar Data
NASA Astrophysics Data System (ADS)
Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.
2011-12-01
Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. As a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), open source libLAS and LASlib libraries provide OpenTopography data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL). OpenTopography has also developed our own algorithm for high-performance gridding of lidar point cloud data, Points2Grid, and have released the code as an open source project. An emerging conversation that the lidar community and OpenTopography are actively engaged in is the need for open, community supported standards and metadata for both full waveform and terrestrial (waveform and discrete return) lidar data. Further, given the immature nature of many lidar data archives and limited online access to public domain data, there is an opportunity to develop interoperable data catalogs based on an open standard such as the OGC CSW specification to facilitate discovery and access to Earth science oriented lidar data.
ProFound: Source Extraction and Application to Modern Survey Data
NASA Astrophysics Data System (ADS)
Robotham, A. S. G.; Davies, L. J. M.; Driver, S. P.; Koushan, S.; Taranu, D. S.; Casura, S.; Liske, J.
2018-05-01
We introduce PROFOUND, a source finding and image analysis package. PROFOUND provides methods to detect sources in noisy images, generate segmentation maps identifying the pixels belonging to each source, and measure statistics like flux, size, and ellipticity. These inputs are key requirements of PROFIT, our recently released galaxy profiling package, where the design aim is that these two software packages will be used in unison to semi-automatically profile large samples of galaxies. The key novel feature introduced in PROFOUND is that all photometry is executed on dilated segmentation maps that fully contain the identifiable flux, rather than using more traditional circular or ellipse-based photometry. Also, to be less sensitive to pathological segmentation issues, the de-blending is made across saddle points in flux. We apply PROFOUND in a number of simulated and real-world cases, and demonstrate that it behaves reasonably given its stated design goals. In particular, it offers good initial parameter estimation for PROFIT, and also segmentation maps that follow the sometimes complex geometry of resolved sources, whilst capturing nearly all of the flux. A number of bulge-disc decomposition projects are already making use of the PROFOUND and PROFIT pipeline, and adoption is being encouraged by publicly releasing the software for the open source R data analysis platform under an LGPL-3 license on GitHub (github.com/asgr/ProFound).
SBEToolbox: A Matlab Toolbox for Biological Network Analysis
Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.
2013-01-01
We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
SBEToolbox: A Matlab Toolbox for Biological Network Analysis.
Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J
2013-01-01
We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.
NASA Technical Reports Server (NTRS)
Utz, Hans Heinrich
2011-01-01
This talk gives an overview of the the Robot Applications Programmers Interface Delegate (RAPID) as well as the distributed systems middleware Data Distribution Service (DDS). DDS is an open software standard, RAPID is cleared for open-source release under NOSA. RAPID specifies data-structures and semantics for high-level telemetry published by NASA robotic software. These data-structures are supported by multiple robotic platforms at Johnson Space Center (JSC), Jet Propulsion Laboratory (JPL) and Ames Research Center (ARC), providing high-level interoperability between those platforms. DDS is used as the middleware for data transfer. The feature set of the middleware heavily influences the design decision made in the RAPID specification. So it is appropriate to discuss both in this introductory talk.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan
A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source tomore » enable use by others.« less
NASA Astrophysics Data System (ADS)
Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.
2017-09-01
We discuss in this paper the development, including the features and functionalities, of an open source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application that was developed by the Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application made use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damages to infrastructures. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front end interface consisting of web map and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or the "Geo-SAFER Mindanao" Program.
Open Source Tools for Seismicity Analysis
NASA Astrophysics Data System (ADS)
Powers, P.
2010-12-01
The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori Law and the Gutenberg Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The Javascript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
NASA Astrophysics Data System (ADS)
Sandalski, Stou
Smooth particle hydrodynamics is an efficient method for modeling the dynamics of fluids. It is commonly used to simulate astrophysical processes such as binary mergers. We present a newly developed GPU accelerated smooth particle hydrodynamics code for astrophysical simulations. The code is named
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perez, R. Navarro; Schunck, N.; Lasseri, R.
2017-03-09
HFBTHO is a physics computer code that is used to model the structure of the nucleus. It is an implementation of the nuclear energy Density Functional Theory (DFT), where the energy of the nucleus is obtained by integration over space of some phenomenological energy density, which is itself a functional of the neutron and proton densities. In HFBTHO, the energy density derives either from the zero-range Dkyrme or the finite-range Gogny effective two-body interaction between nucleons. Nuclear superfluidity is treated at the Hartree-Fock-Bogoliubov (HFB) approximation, and axial-symmetry of the nuclear shape is assumed. This version is the 3rd release ofmore » the program; the two previous versions were published in Computer Physics Communications [1,2]. The previous version was released at LLNL under GPL 3 Open Source License and was given release code LLNL-CODE-573953.« less
NASA Astrophysics Data System (ADS)
Yver Kwok, C. E.; Müller, D.; Caldow, C.; Lebègue, B.; Mønster, J. G.; Rella, C. W.; Scheutz, C.; Schmidt, M.; Ramonet, M.; Warneke, T.; Broquet, G.; Ciais, P.
2015-07-01
This study presents two methods for estimating methane emissions from a waste water treatment plant (WWTP) along with results from a measurement campaign at a WWTP in Valence, France. These methods, chamber measurements and tracer release, rely on Fourier transform infrared spectroscopy and cavity ring-down spectroscopy instruments. We show that the tracer release method is suitable for quantifying facility- and some process-scale emissions, while the chamber measurements provide insight into individual process emissions. Uncertainties for the two methods are described and discussed. Applying the methods to CH4 emissions of the WWTP, we confirm that the open basins are not a major source of CH4 on the WWTP (about 10 % of the total emissions), but that the pretreatment and sludge treatment are the main emitters. Overall, the waste water treatment plant is representative of an average French WWTP.
Demonstrations that the Solar Wind Is Not Accelerated by Waves
NASA Technical Reports Server (NTRS)
Roberts, Aaron
2008-01-01
The present work uses both observations and theoretical considerations to show that hydromagnetic waves cannot produce the acceleration of the fast solar wind and the related heating of the open solar corona. Waves do exist, and can play a role in the differential heating and acceleration of minor ions, but their amplitudes are not sufficient to power the wind, as demonstrated by extrapolation of magnetic spectra from Helios and Ulysses observations. Dissipation mechanisms invoked to circumvent this conclusion cannot be effective for a variety of reasons. In particular, turbulence does not play a strong role in the corona as shown by both observations of coronal striations and theoretical considerations of line-tying to a nonturbulent photosphere, nonlocality of interactions, and the nature of the kinetic dissipation. In the absence of wave heating and acceleration, the chromosphere and transition region become the natural source of open coronal energization. We suggest a variant of the 'velocity filtration' approach in which the emergence and complex churning of the magnetic flux in the chromosphere and transition region continuously and ubiquitously produces the nonthermal distributions required. These particles are then released by magnetic carpet reconnection at a wide range of scales and produce the wind as described in kinetic approaches. Since the carpet reconnection is not the main source of the energization of the plasma, there is no expectation of an observable release of energy in nanoflares.
Modular Integrated Stackable Layers (MISL) 1.1 Design Specification. Design Guideline Document
NASA Technical Reports Server (NTRS)
Yim, Hester J.
2012-01-01
This document establishes the design guideline of the Modular Instrumentation Data Acquisition (MI-DAQ) system in utilization of several designs available in EV. The MI- DAQ provides the options to the customers depending on their system requirements i.e. a 28V interface power supply, a low power battery operated system, a low power microcontroller, a higher performance microcontroller, a USB interface, a Ethernet interface, a wireless communication, various sensor interfaces, etc. Depending on customer's requirements, the each functional board can be stacked up from a bottom level of power supply to a higher level of stack to provide user interfaces. The stack up of boards are accomplished by a predefined and standardized power bus and data bus connections which are included in this document along with other physical and electrical guidelines. This guideline also provides information for a new design options. This specification is the product of a collaboration between NASA/JSC/EV and Texas A&M University. The goal of the collaboration is to open source the specification and allow outside entities to design, build, and market modules that are compatible with the specification. NASA has designed and is using numerous modules that are compatible to this specification. A limited number of these modules will also be released as open source designs to support the collaboration. The released designs are listed in the Applicable Documents.
Arctic Sea Salt Aerosol from Blowing Snow and Sea Ice Surfaces - a Missing Natural Source in Winter
NASA Astrophysics Data System (ADS)
Frey, M. M.; Norris, S. J.; Brooks, I. M.; Nishimura, K.; Jones, A. E.
2015-12-01
Atmospheric particles in the polar regions consist mostly of sea salt aerosol (SSA). SSA plays an important role in regional climate change through influencing the surface energy balance either directly or indirectly via cloud formation. SSA irradiated by sunlight also releases very reactive halogen radicals, which control concentrations of ozone, a pollutant and greenhouse gas. However, models under-predict SSA concentrations in the Arctic during winter pointing to a missing source. It has been recently suggested that salty blowing snow above sea ice, which is evaporating, to be that source as it may produce more SSA than equivalent areas of open ocean. Participation in the 'Norwegian Young Sea Ice Cruise (N-ICE 2015)' on board the research vessel `Lance' allowed to test this hypothesis in the Arctic sea ice zone during winter. Measurements were carried out from the ship frozen into the pack ice North of 80º N during February to March 2015. Observations at ground level (0.1-2 m) and from the ship's crows nest (30 m) included number concentrations and size spectra of SSA (diameter range 0.3-10 μm) as well as snow particles (diameter range 50-500 μm). During and after blowing snow events significant SSA production was observed. In the aerosol and snow phase sulfate is fractionated with respect to sea water, which confirms sea ice surfaces and salty snow, and not the open ocean, to be the dominant source of airborne SSA. Aerosol shows depletion in bromide with respect to sea water, especially after sunrise, indicating photochemically driven release of bromine. We discuss the SSA source strength from blowing snow in light of environmental conditions (wind speed, atmospheric turbulence, temperature and snow salinity) and recommend improved model parameterisations to estimate regional aerosol production. N-ICE 2015 results are then compared to a similar study carried out previously in the Weddell Sea during the Antarctic winter.
NASA Astrophysics Data System (ADS)
Rieger, C.; Byrne, J. M.
2015-12-01
Citizen science includes networks of ordinary people acting as sensors, observing and recording information for science. OpenStreetMap is one such sensor network which empowers citizens to collaboratively produce a global picture from free geographic information. The success of this open source software is extended by the development of freely used open databases for the user community. Participating citizens do not require a high level of skill. Final results are processed by professionals following quality assurance protocols before map information is released. OpenStreetMap is not only the cheapest source of timely maps in many cases but also often the only source. This is particularly true in developing countries. Emergency responses to the recent earthquake in Nepal illustrates the value for rapidly updated geographical information. This includes emergency management, damage assessment, post-disaster response, and future risk mitigation. Local disaster conditions (landslides, road closings, bridge failures, etc.) were documented for local aid workers by citizen scientists working remotely. Satellites and drones provide digital imagery of the disaster zone and OpenStreetMap participants shared the data from locations around the globe. For the Nepal earthquake, OpenStreetMap provided a team of volunteers on the ground through their Humanitarian OpenStreetMap Team (HOT) which contribute data to the disaster response through smartphones and laptops. This, combined with global citizen science efforts, provided immediate geographically useful maps to assist aid workers, including the Red Cross and Canadian DART Team, and the Nepalese government. As of August 2014, almost 1.7 million users provided over 2.5 billion edits to the OpenStreetMap map database. Due to the increased usage of smartphones, GPS-enabled devices, and the growing participation in citizen science projects, data gathering is proving an effective way to contribute as a global citizen. This paper aims to describe the significance of citizen participation in the case of the Nepal earthquake using OpenStreetMap to respond to disasters as well as its role in future risk mitigation.
Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy
2017-10-06
The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code posted to Github, (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h/mm) using 1 core, and 1:04 (h/mm) hours using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
Plenario: A Spatio-Temporal Platform for Discovery and Exploration of Urban Science Data
NASA Astrophysics Data System (ADS)
Engler, W. H.; Malik, T.; Catlett, C.; Foster, I.; Goldstein, B.
2015-12-01
The past decade has seen the widespread release of open data concerning city services, conditions, and activities by government bodies and public institutions of all sizes. Hundreds of open data portals now host thousands of datasets of many different types. These new data sources represent enormous potential for improved understanding of urban dynamics and processes—and, ultimately, for more livable, efficient, and prosperous communities. However, those who seek to realize this potential quickly discover that discovering and applying those data relevant to any particular question can be extraordinarily difficult, due to decentralized storage, heterogeneous formats, and poor documentation. In this context, we introduce Plenario, a platform designed to automating time-consuming tasks associated with the discovery, exploration, and application of open city data—and, in so doing, reduce barriers to data use for researchers, policymakers, service providers, journalists, and members of the general public. Key innovations include a geospatial data warehouse that allows data from many sources to be registered into a common spatial and temporal frame; simple and intuitive interfaces that permit rapid discovery and exploration of data subsets pertaining to a particular area and time, regardless of type and source; easy export of such data subsets for further analysis; a user-configurable data ingest framework for automated importing and periodic updating of new datasets into the data warehouse; cloud hosting for elastic scaling and rapid creation of new Plenario instances; and an open source implementation to enable community contributions. We describe here the architecture and implementation of the Plenario platform, discuss lessons learned from its use by several communities, and outline plans for future work.
Geologic map of the Richland 1:100,000 quadrangle, Washington
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reidel, S.P.; Fecht, K.R.
1993-09-01
This map of the Richland 1:100,000-scale quadrangle, Washington, shows the geology of one of fifteen complete or partial 1:100,000-scale quadrangles that cover the southeast quadrant of Washington. Geologic maps of these quadrangles have been compiled by geologists with the Washington Division of Geology and Earth Resources (DGER) and Washington State University and are the principal data sources for a 1:250,000-scale geologic map of the southeast quadrant of Washington, which is in preparation. Eleven of these quadrangles are being released as DGER open-file reports. The map of the Wenatchee quadrangle has been published by the US Geological Survey, and the Mosesmore » Lake, Ritzville quadrangles have already been released.« less
Cloud prediction of protein structure and function with PredictProtein for Debian.
Kaján, László; Yachdav, Guy; Vicedo, Esmeralda; Steinegger, Martin; Mirdita, Milot; Angermüller, Christof; Böhm, Ariane; Domke, Simon; Ertl, Julia; Mertes, Christian; Reisinger, Eva; Staniewski, Cedric; Rost, Burkhard
2013-01-01
We report the release of PredictProtein for the Debian operating system and derivatives, such as Ubuntu, Bio-Linux, and Cloud BioLinux. The PredictProtein suite is available as a standard set of open source Debian packages. The release covers the most popular prediction methods from the Rost Lab, including methods for the prediction of secondary structure and solvent accessibility (profphd), nuclear localization signals (predictnls), and intrinsically disordered regions (norsnet). We also present two case studies that successfully utilize PredictProtein packages for high performance computing in the cloud: the first analyzes protein disorder for whole organisms, and the second analyzes the effect of all possible single sequence variants in protein coding regions of the human genome.
Cloud Prediction of Protein Structure and Function with PredictProtein for Debian
Kaján, László; Yachdav, Guy; Vicedo, Esmeralda; Steinegger, Martin; Mirdita, Milot; Angermüller, Christof; Böhm, Ariane; Domke, Simon; Ertl, Julia; Mertes, Christian; Reisinger, Eva; Rost, Burkhard
2013-01-01
We report the release of PredictProtein for the Debian operating system and derivatives, such as Ubuntu, Bio-Linux, and Cloud BioLinux. The PredictProtein suite is available as a standard set of open source Debian packages. The release covers the most popular prediction methods from the Rost Lab, including methods for the prediction of secondary structure and solvent accessibility (profphd), nuclear localization signals (predictnls), and intrinsically disordered regions (norsnet). We also present two case studies that successfully utilize PredictProtein packages for high performance computing in the cloud: the first analyzes protein disorder for whole organisms, and the second analyzes the effect of all possible single sequence variants in protein coding regions of the human genome. PMID:23971032
Exploiting Open Environmental Data using Linked Data and Cloud Computing: the MELODIES project
NASA Astrophysics Data System (ADS)
Blower, Jon; Gonçalves, Pedro; Caumont, Hervé; Koubarakis, Manolis; Perkins, Bethan
2015-04-01
The European Open Data Strategy establishes important new principles that ensure that European public sector data will be released at no cost (or marginal cost), in machine-readable, commonly-understood formats, and with liberal licences enabling wide reuse. These data encompass both scientific data about the environment (from Earth Observation and other fields) and other public sector information, including diverse topics such as demographics, health and crime. Many open geospatial datasets (e.g. land use) are already available through the INSPIRE directive and made available through infrastructures such as the Global Earth Observation System of Systems (GEOSS). The intention of the Open Data Strategy is to stimulate the growth of research and value-adding services that build upon these data streams; however, the potential value inherent in open data, and the benefits that can be gained by combining previously-disparate sources of information are only just starting to become understood. The MELODIES project (Maximising the Exploitation of Linked Open Data In Enterprise and Science) is developing eight innovative and sustainable services, based upon Open Data, for users in research, government, industry and the general public in a broad range of societal and environmental benefit areas. MELODIES (http://melodiesproject.eu) is a European FP7 project that is coordinated by the University of Reading and has sixteen partners (including nine SMEs) from eight European countries. It started in November 2013 and will run for three years. The project is therefore in its early stages and therefore we will value the opportunity that this workshop affords to present our plans and interact with the wider Linked Geospatial Data community. The project is developing eight new services[1] covering a range of domains including agriculture, urban ecosystems, land use management, marine information, desertification, crisis management and hydrology. These services will combine Earth Observation data with other open data sources to produce new information for the benefit of scientists, industry, government decision-makers, public service providers and citizens. The long-term sustainability of the services will be assessed critically throughout the project from a number of angles (technical, political and economic), in order to ensure that the full benefits of the MELODIES project are realised in the long term. The priority of the project, therefore, is to demonstrate that releasing data openly leads to concrete commercial and scientific benefits, and can stimulate the production of new applications and viable services. [1] http://www.melodiesproject.eu/services.html
Observations of the release of non-methane hydrocarbons from fractured shale.
Sommariva, Roberto; Blake, Robert S; Cuss, Robert J; Cordell, Rebecca L; Harrington, Jon F; White, Iain R; Monks, Paul S
2014-01-01
The organic content of shale has become of commercial interest as a source of hydrocarbons, owing to the development of hydraulic fracturing ("fracking"). While the main focus is on the extraction of methane, shale also contains significant amounts of non-methane hydrocarbons (NMHCs). We describe the first real-time observations of the release of NMHCs from a fractured shale. Samples from the Bowland-Hodder formation (England) were analyzed under different conditions using mass spectrometry, with the objective of understanding the dynamic process of gas release upon fracturing of the shale. A wide range of NMHCs (alkanes, cycloalkanes, aromatics, and bicyclic hydrocarbons) are released at parts per million or parts per billion level with temperature- and humidity-dependent release rates, which can be rationalized in terms of the physicochemical characteristics of different hydrocarbon classes. Our results indicate that higher energy inputs (i.e., temperatures) significantly increase the amount of NMHCs released from shale, while humidity tends to suppress it; additionally, a large fraction of the gas is released within the first hour after the shale has been fractured. These findings suggest that other hydrocarbons of commercial interest may be extracted from shale and open the possibility to optimize the "fracking" process, improving gas yields and reducing environmental impacts.
A multi-source dataset of urban life in the city of Milan and the Province of Trentino.
Barlacchi, Gianni; De Nadai, Marco; Larcher, Roberto; Casella, Antonio; Chitic, Cristiana; Torrisi, Giovanni; Antonelli, Fabrizio; Vespignani, Alessandro; Pentland, Alex; Lepri, Bruno
2015-01-01
The study of socio-technical systems has been revolutionized by the unprecedented amount of digital records that are constantly being produced by human activities such as accessing Internet services, using mobile devices, and consuming energy and knowledge. In this paper, we describe the richest open multi-source dataset ever released on two geographical areas. The dataset is composed of telecommunications, weather, news, social networks and electricity data from the city of Milan and the Province of Trentino. The unique multi-source composition of the dataset makes it an ideal testbed for methodologies and approaches aimed at tackling a wide range of problems including energy consumption, mobility planning, tourist and migrant flows, urban structures and interactions, event detection, urban well-being and many others.
A multi-source dataset of urban life in the city of Milan and the Province of Trentino
NASA Astrophysics Data System (ADS)
Barlacchi, Gianni; de Nadai, Marco; Larcher, Roberto; Casella, Antonio; Chitic, Cristiana; Torrisi, Giovanni; Antonelli, Fabrizio; Vespignani, Alessandro; Pentland, Alex; Lepri, Bruno
2015-10-01
The study of socio-technical systems has been revolutionized by the unprecedented amount of digital records that are constantly being produced by human activities such as accessing Internet services, using mobile devices, and consuming energy and knowledge. In this paper, we describe the richest open multi-source dataset ever released on two geographical areas. The dataset is composed of telecommunications, weather, news, social networks and electricity data from the city of Milan and the Province of Trentino. The unique multi-source composition of the dataset makes it an ideal testbed for methodologies and approaches aimed at tackling a wide range of problems including energy consumption, mobility planning, tourist and migrant flows, urban structures and interactions, event detection, urban well-being and many others.
A multi-source dataset of urban life in the city of Milan and the Province of Trentino
Barlacchi, Gianni; De Nadai, Marco; Larcher, Roberto; Casella, Antonio; Chitic, Cristiana; Torrisi, Giovanni; Antonelli, Fabrizio; Vespignani, Alessandro; Pentland, Alex; Lepri, Bruno
2015-01-01
The study of socio-technical systems has been revolutionized by the unprecedented amount of digital records that are constantly being produced by human activities such as accessing Internet services, using mobile devices, and consuming energy and knowledge. In this paper, we describe the richest open multi-source dataset ever released on two geographical areas. The dataset is composed of telecommunications, weather, news, social networks and electricity data from the city of Milan and the Province of Trentino. The unique multi-source composition of the dataset makes it an ideal testbed for methodologies and approaches aimed at tackling a wide range of problems including energy consumption, mobility planning, tourist and migrant flows, urban structures and interactions, event detection, urban well-being and many others. PMID:26528394
Rowe, Aaron A; Bonham, Andrew J; White, Ryan J; Zimmer, Michael P; Yadgar, Ramsin J; Hobza, Tony M; Honea, Jim W; Ben-Yaacov, Ilan; Plaxco, Kevin W
2011-01-01
Although potentiostats are the foundation of modern electrochemical research, they have seen relatively little application in resource poor settings, such as undergraduate laboratory courses and the developing world. One reason for the low penetration of potentiostats is their cost, as even the least expensive commercially available laboratory potentiostats sell for more than one thousand dollars. An inexpensive electrochemical workstation could thus prove useful in educational labs, and increase access to electrochemistry-based analytical techniques for food, drug and environmental monitoring. With these motivations in mind, we describe here the CheapStat, an inexpensive (<$80), open-source (software and hardware), hand-held potentiostat that can be constructed by anyone who is proficient at assembling circuits. This device supports a number of potential waveforms necessary to perform cyclic, square wave, linear sweep and anodic stripping voltammetry. As we demonstrate, it is suitable for a wide range of applications ranging from food- and drug-quality testing to environmental monitoring, rapid DNA detection, and educational exercises. The device's schematics, parts lists, circuit board layout files, sample experiments, and detailed assembly instructions are available in the supporting information and are released under an open hardware license.
CheapStat: An Open-Source, “Do-It-Yourself” Potentiostat for Analytical and Educational Applications
Rowe, Aaron A.; Bonham, Andrew J.; White, Ryan J.; Zimmer, Michael P.; Yadgar, Ramsin J.; Hobza, Tony M.; Honea, Jim W.; Ben-Yaacov, Ilan; Plaxco, Kevin W.
2011-01-01
Although potentiostats are the foundation of modern electrochemical research, they have seen relatively little application in resource poor settings, such as undergraduate laboratory courses and the developing world. One reason for the low penetration of potentiostats is their cost, as even the least expensive commercially available laboratory potentiostats sell for more than one thousand dollars. An inexpensive electrochemical workstation could thus prove useful in educational labs, and increase access to electrochemistry-based analytical techniques for food, drug and environmental monitoring. With these motivations in mind, we describe here the CheapStat, an inexpensive (<$80), open-source (software and hardware), hand-held potentiostat that can be constructed by anyone who is proficient at assembling circuits. This device supports a number of potential waveforms necessary to perform cyclic, square wave, linear sweep and anodic stripping voltammetry. As we demonstrate, it is suitable for a wide range of applications ranging from food- and drug-quality testing to environmental monitoring, rapid DNA detection, and educational exercises. The device's schematics, parts lists, circuit board layout files, sample experiments, and detailed assembly instructions are available in the supporting information and are released under an open hardware license. PMID:21931613
The Ultracool Typing Kit - An Open-Source, Qualitative Spectral Typing GUI for L Dwarfs
NASA Astrophysics Data System (ADS)
Schwab, Ellianna; Cruz, Kelle; Núñez, Alejandro; Burgasser, Adam J.; Rice, Emily; Reid, Neill; Faherty, Jacqueline K.; BDNYC
2018-01-01
The Ultracool Typing Kit (UTK) is an open-source graphical user interface for classifying the NIR spectral types of L dwarfs, including field and low-gravity dwarfs spanning L0-L9. The user is able to input an NIR spectrum and qualitatively compare the input spectrum to a full suite of spectral templates, including low-gravity beta and gamma templates. The user can choose to view the input spectrum as both a band-by-band comparison with the templates and a full bandwidth comparison with NIR spectral standards. Once an optimal qualitative comparison is selected, the user can save their spectral type selection both graphically and to a database. Using UTK to classify 78 previously typed L dwarfs, we show that a band-by-band classification method more accurately agrees with optical spectral typing systems than previous L dwarf NIR classification schemes. UTK is written in python, released on Zenodo with a BSD-3 clause license and publicly available on the BDNYC Github page.
Hawkeye and AMOS: visualizing and assessing the quality of genome assemblies
Schatz, Michael C.; Phillippy, Adam M.; Sommer, Daniel D.; Delcher, Arthur L.; Puiu, Daniela; Narzisi, Giuseppe; Salzberg, Steven L.; Pop, Mihai
2013-01-01
Since its launch in 2004, the open-source AMOS project has released several innovative DNA sequence analysis applications including: Hawkeye, a visual analytics tool for inspecting the structure of genome assemblies; the Assembly Forensics and FRCurve pipelines for systematically evaluating the quality of a genome assembly; and AMOScmp, the first comparative genome assembler. These applications have been used to assemble and analyze dozens of genomes ranging in complexity from simple microbial species through mammalian genomes. Recent efforts have been focused on enhancing support for new data characteristics brought on by second- and now third-generation sequencing. This review describes the major components of AMOS in light of these challenges, with an emphasis on methods for assessing assembly quality and the visual analytics capabilities of Hawkeye. These interactive graphical aspects are essential for navigating and understanding the complexities of a genome assembly, from the overall genome structure down to individual bases. Hawkeye and AMOS are available open source at http://amos.sourceforge.net. PMID:22199379
Apis - a Digital Inventory of Archaeological Heritage Based on Remote Sensing Data
NASA Astrophysics Data System (ADS)
Doneus, M.; Forwagner, U.; Liem, J.; Sevara, C.
2017-08-01
Heritage managers are in need of dynamic spatial inventories of archaeological and cultural heritage that provide them with multipurpose tools to interactively understand information about archaeological heritage within its landscape context. Specifically, linking site information with the respective non-invasive prospection data is of increasing importance as it allows for the assessment of inherent uncertainties related to the use and interpretation of remote sensing data by the educated and knowledgeable heritage manager. APIS, the archaeological prospection information system of the Aerial Archive of the University of Vienna, is specifically designed to meet these needs. It provides storage and easy access to all data concerning aerial photographs and archaeological sites through a single GIS-based application. Furthermore, APIS has been developed in an open source environment, which allows it to be freely distributed and modified. This combination in one single open source system facilitates an easy workflow for data management, interpretation, storage, and retrieval. APIS and a sample dataset will be released free of charge under creative commons license in near future.
LSSGalPy: Interactive Visualization of the Large-scale Environment Around Galaxies
NASA Astrophysics Data System (ADS)
Argudo-Fernández, M.; Duarte Puertas, S.; Ruiz, J. E.; Sabater, J.; Verley, S.; Bergond, G.
2017-05-01
New tools are needed to handle the growth of data in astrophysics delivered by recent and upcoming surveys. We aim to build open-source, light, flexible, and interactive software designed to visualize extensive three-dimensional (3D) tabular data. Entirely written in the Python language, we have developed interactive tools to browse and visualize the positions of galaxies in the universe and their positions with respect to its large-scale structures (LSS). Motivated by a previous study, we created two codes using Mollweide projection and wedge diagram visualizations, where survey galaxies can be overplotted on the LSS of the universe. These are interactive representations where the visualizations can be controlled by widgets. We have released these open-source codes that have been designed to be easily re-used and customized by the scientific community to fulfill their needs. The codes are adaptable to other kinds of 3D tabular data and are robust enough to handle several millions of objects. .
Parser for Sabin-to-Mahoney Transition Model of Quasispecies Replication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ecale Zhou, Carol
2016-01-03
This code is a data parse for preparing output from the Qspp agent-based stochastic simulation model for plotting in Excel. This code is specific to a set of simulations that were run for the purpose of preparing data for a publication. It is necessary to make this code open-source in order to publish the model code (Qspp), which has already been released. There is a necessity of assuring that results from using Qspp for a publication
The Experiment Factory: Standardizing Behavioral Experiments.
Sochat, Vanessa V; Eisenberg, Ian W; Enkavi, A Zeynep; Li, Jamie; Bissett, Patrick G; Poldrack, Russell A
2016-01-01
The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms.
The Experiment Factory: Standardizing Behavioral Experiments
Sochat, Vanessa V.; Eisenberg, Ian W.; Enkavi, A. Zeynep; Li, Jamie; Bissett, Patrick G.; Poldrack, Russell A.
2016-01-01
The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms. PMID:27199843
caGrid 1.0 : an enterprise Grid infrastructure for biomedical research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oster, S.; Langella, S.; Hastings, S.
To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design: An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG{trademark}) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including (1) discovery, (2) integrated and large-scale data analysis, and (3) coordinated study. Measurements: The caGrid is built as a Grid software infrastructure andmore » leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results: The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL:
Upflow bioreactor with septum and pressure release mechanism
Hansen, Conly L.; Hansen, Carl S.; Pack, Kevin; Milligan, John; Benefiel, Bradley C.; Tolman, C. Wayne; Tolman, Kenneth W.
2010-04-20
An upflow bioreactor includes a vessel having an inlet and an outlet configured for upflow operation. A septum is positioned within the vessel and defines a lower chamber and an upper chamber. The septum includes an aperture that provides fluid communication between the upper chamber and lower chamber. The bioreactor also includes means for releasing pressure buildup in the lower chamber. In one configuration, the septum includes a releasable portion having an open position and a closed position. The releasable portion is configured to move to the open position in response to pressure buildup in the lower chamber. In the open position fluid communication between the lower chamber and the upper chamber is increased. Alternatively the lower chamber can include a pressure release line that is selectively actuated by pressure buildup. The pressure release mechanism can prevent the bioreactor from plugging and/or prevent catastrophic damage to the bioreactor caused by high pressures.
NASA Astrophysics Data System (ADS)
Brovelli, M. A.; Oxoli, D.; Zurbarán, M. A.
2016-06-01
During the past years Web 2.0 technologies have caused the emergence of platforms where users can share data related to their activities which in some cases are then publicly released with open licenses. Popular categories for this include community platforms where users can upload GPS tracks collected during slow travel activities (e.g. hiking, biking and horse riding) and platforms where users share their geolocated photos. However, due to the high heterogeneity of the information available on the Web, the sole use of these user-generated contents makes it an ambitious challenge to understand slow mobility flows as well as to detect the most visited locations in a region. Exploiting the available data on community sharing websites allows to collect near real-time open data streams and enables rigorous spatial-temporal analysis. This work presents an approach for collecting, unifying and analysing pointwise geolocated open data available from different sources with the aim of identifying the main locations and destinations of slow mobility activities. For this purpose, we collected pointwise open data from the Wikiloc platform, Twitter, Flickr and Foursquare. The analysis was confined to the data uploaded in Lombardy Region (Northern Italy) - corresponding to millions of pointwise data. Collected data was processed through the use of Free and Open Source Software (FOSS) in order to organize them into a suitable database. This allowed to run statistical analyses on data distribution in both time and space by enabling the detection of users' slow mobility preferences as well as places of interest at a regional scale.
Nuclear Condensation during Mouse Erythropoiesis Requires Caspase-3-Mediated Nuclear Opening.
Zhao, Baobing; Mei, Yang; Schipma, Matthew J; Roth, Eric Wayne; Bleher, Reiner; Rappoport, Joshua Z; Wickrema, Amittha; Yang, Jing; Ji, Peng
2016-03-07
Mammalian erythropoiesis involves chromatin condensation that is initiated in the early stage of terminal differentiation. The mechanisms of chromatin condensation during erythropoiesis are unclear. Here, we show that the mouse erythroblast forms large, transient, and recurrent nuclear openings that coincide with the condensation process. The opening lacks nuclear lamina, nuclear pore complexes, and nuclear membrane, but it is distinct from nuclear envelope changes that occur during apoptosis and mitosis. A fraction of the major histones are released from the nuclear opening and degraded in the cytoplasm. We demonstrate that caspase-3 is required for the nuclear opening formation throughout terminal erythropoiesis. Loss of caspase-3 or ectopic expression of a caspase-3 non-cleavable lamin B mutant blocks nuclear opening formation, histone release, chromatin condensation, and terminal erythroid differentiation. We conclude that caspase-3-mediated nuclear opening formation accompanied by histone release from the opening is a critical step toward chromatin condensation during erythropoiesis in mice. Copyright © 2016 Elsevier Inc. All rights reserved.
Nuclear condensation during mouse erythropoiesis requires caspase-3-mediated nuclear opening
Zhao, Baobing; Mei, Yang; Schipma, Matthew J; Roth, Eric Wayne; Bleher, Reiner; Rappoport, Joshua Z.; Wickrema, Amittha; Yang, Jing; Ji, Peng
2016-01-01
SUMMARY Mammalian erythropoiesis involves chromatin condensation that is initiated in the early stage of terminal differentiation. The mechanisms of chromatin condensation during erythropoiesis are unclear. Here, we show that the mouse erythroblast forms large, transient, and recurrent nuclear openings that coincide with the condensation process. The opening lacks nuclear lamina, nuclear pore complexes, and nuclear membrane, but it is distinct from nuclear envelope changes that occur during apoptosis and mitosis. A fraction of the major histones are released from the nuclear opening and degraded in the cytoplasm. We demonstrate that caspase-3 is required for the nuclear opening formation throughout terminal erythropoiesis. Loss of caspase-3 or ectopic expression of a caspase-3 non-cleavable lamin B mutant blocks nuclear opening formation, histone release, chromatin condensation, and terminal erythroid differentiation. We conclude that caspase-3-mediated nuclear opening formation accompanied by histone release from the opening is a critical step towards chromatin condensation during erythropoiesis in mice. PMID:26954545
NASA Astrophysics Data System (ADS)
De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel
2015-04-01
Aim: The paper introduces an integrated set of open-source tools designed to process medium and high-resolution imagery with the aim to extract vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or buildings regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable to return information at a higher level. Solution: The proposed set of python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: source code, developers-oriented, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014) where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP 7 RASOR project . Conclusion: With the lack of a unified software suite for vulnerability indicators extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platforms can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number: 312972 ) (RASOR: www.rasor-project.eu, grant number: 606888 )
jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.
Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris
2014-07-03
The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this aim, a controlled vocabulary and document structure have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments, called qcML. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml .
Bartoletti, Theodore M.; Jackman, Skyler L.; Babai, Norbert; Mercer, Aaron J.; Kramer, Richard H.
2011-01-01
Light hyperpolarizes cone photoreceptors, causing synaptic voltage-gated Ca2+ channels to open infrequently. To understand neurotransmission under these conditions, we determined the number of L-type Ca2+ channel openings necessary for vesicle fusion at the cone ribbon synapse. Ca2+ currents (ICa) were activated in voltage-clamped cones, and excitatory postsynaptic currents (EPSCs) were recorded from horizontal cells in the salamander retina slice preparation. Ca2+ channel number and single-channel current amplitude were calculated by mean-variance analysis of ICa. Two different comparisons—one comparing average numbers of release events to average ICa amplitude and the other involving deconvolution of both EPSCs and simultaneously recorded cone ICa—suggested that fewer than three Ca2+ channel openings accompanied fusion of each vesicle at the peak of release during the first few milliseconds of stimulation. Opening fewer Ca2+ channels did not enhance fusion efficiency, suggesting that few unnecessary channel openings occurred during strong depolarization. We simulated release at the cone synapse, using empirically determined synaptic dimensions, vesicle pool size, Ca2+ dependence of release, Ca2+ channel number, and Ca2+ channel properties. The model replicated observations when a barrier was added to slow Ca2+ diffusion. Consistent with the presence of a diffusion barrier, dialyzing cones with diffusible Ca2+ buffers did not affect release efficiency. The tight clustering of Ca2+ channels, along with a high-Ca2+ affinity release mechanism and diffusion barrier, promotes a linear coupling between Ca2+ influx and vesicle fusion. This may improve detection of small light decrements when cones are hyperpolarized by bright light. PMID:21880934
Bartoletti, Theodore M; Jackman, Skyler L; Babai, Norbert; Mercer, Aaron J; Kramer, Richard H; Thoreson, Wallace B
2011-12-01
Light hyperpolarizes cone photoreceptors, causing synaptic voltage-gated Ca(2+) channels to open infrequently. To understand neurotransmission under these conditions, we determined the number of L-type Ca(2+) channel openings necessary for vesicle fusion at the cone ribbon synapse. Ca(2+) currents (I(Ca)) were activated in voltage-clamped cones, and excitatory postsynaptic currents (EPSCs) were recorded from horizontal cells in the salamander retina slice preparation. Ca(2+) channel number and single-channel current amplitude were calculated by mean-variance analysis of I(Ca). Two different comparisons-one comparing average numbers of release events to average I(Ca) amplitude and the other involving deconvolution of both EPSCs and simultaneously recorded cone I(Ca)-suggested that fewer than three Ca(2+) channel openings accompanied fusion of each vesicle at the peak of release during the first few milliseconds of stimulation. Opening fewer Ca(2+) channels did not enhance fusion efficiency, suggesting that few unnecessary channel openings occurred during strong depolarization. We simulated release at the cone synapse, using empirically determined synaptic dimensions, vesicle pool size, Ca(2+) dependence of release, Ca(2+) channel number, and Ca(2+) channel properties. The model replicated observations when a barrier was added to slow Ca(2+) diffusion. Consistent with the presence of a diffusion barrier, dialyzing cones with diffusible Ca(2+) buffers did not affect release efficiency. The tight clustering of Ca(2+) channels, along with a high-Ca(2+) affinity release mechanism and diffusion barrier, promotes a linear coupling between Ca(2+) influx and vesicle fusion. This may improve detection of small light decrements when cones are hyperpolarized by bright light.
Observational and Theoretical Challenges to Wave or Turbulence Accelerations of the Fast Solar Wind
NASA Technical Reports Server (NTRS)
Roberts, D. Aaron
2008-01-01
We use both observations and theoretical considerations to show that hydromagnetic waves or turbulence cannot produce the acceleration of the fast solar wind and the related heating of the open solar corona. Waves do exist as shown by Hinode and other observations, and can play a role in the differential heating and acceleration of minor ions but their amplitudes are not sufficient to power the wind, as demonstrated by extrapolation of magnetic spectra from Helios and Ulysses observations. Dissipation mechanisms invoked to circumvent this conclusion cannot be effective for a variety of reasons. In particular, turbulence does not play a strong role in the corona as shown by both eclipse observations of coronal striations and theoretical considerations of line-tying to a nonturbulent photosphere, nonlocality of interactions, and the nature of kinetic dissipation. In the absence of wave heating and acceleration, the chromosphere and transition region become the natural source of open coronal energization. We suggest a variant of the velocity filtration approach in which the emergence and complex churning of the magnetic flux in the chromosphere and transition region continuously and ubiquitously produces the nonthermal distributions required. These particles are then released by magnetic carpet reconnection at a wide range of scales and produce the wind as described in kinetic approaches. Since the carpet reconnection is not the main source of the energization of the plasma, there is no expectation of an observable release of energy in nanoflares.
Rani, Manviri; Shim, Won Joon; Jang, Mi; Han, Gi Myung; Hong, Sang Hee
2017-10-01
Expanded polystyrene (EPS) is a major component of marine debris globally. Recently, hazardous hexabromocyclododecanes (HBCDDs) were detected in EPS buoys used for aquaculture farming. Subsequently, enrichment of HBCDDs was found in nearby marine sediments and mussels growing on EPS buoys. It was suspected that EPS buoys and their debris might be sources of HBCDDs. To confirm this, the release of HBCDDs from EPS spherules detached from a buoy to seawater was investigated under field (open sea surface and closed outdoor chambers with sun exposure and in the dark) and laboratory (particle-size) conditions. In all exposure groups, initial rapid leaching of HBCDDs was followed by slow desorption over time. Abundant release of HBCDDs was observed from EPS spherules exposed to the open sea surface (natural) and on exposure to sunlight irradiation or in the dark in controlled saline water. Water leaching and UV-light/temperature along with possibly biodegradation were responsible for about 37% and 12% of HBCDDs flux, respectively. Crumbled EPS particles (≤1 mm) in samples deployed on the sea surface for 6 months showed a high degree of weathering. This implies that surface erosion and further fragmentation of EPS via environmental weathering could enhance the leaching of HBCDDs from the surface of EPS. Overall, in the marine environment, HBCDDs could be released to a great extent from EPS products and their debris due to the cumulative effects of the movement of large volumes of water (dilution), biodegradation, UV-light/temperature, wave action (shaking), salinity and further fragmentation of EPS spherules. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Alden, Caroline B.; Ghosh, Subhomoy; Coburn, Sean; Sweeney, Colm; Karion, Anna; Wright, Robert; Coddington, Ian; Rieker, Gregory B.; Prasad, Kuldeep
2018-03-01
Advances in natural gas extraction technology have led to increased activity in the production and transport sectors in the United States and, as a consequence, an increased need for reliable monitoring of methane leaks to the atmosphere. We present a statistical methodology in combination with an observing system for the detection and attribution of fugitive emissions of methane from distributed potential source location landscapes such as natural gas production sites. We measure long (> 500 m), integrated open-path concentrations of atmospheric methane using a dual frequency comb spectrometer and combine measurements with an atmospheric transport model to infer leak locations and strengths using a novel statistical method, the non-zero minimum bootstrap (NZMB). The new statistical method allows us to determine whether the empirical distribution of possible source strengths for a given location excludes zero. Using this information, we identify leaking source locations (i.e., natural gas wells) through rejection of the null hypothesis that the source is not leaking. The method is tested with a series of synthetic data inversions with varying measurement density and varying levels of model-data mismatch. It is also tested with field observations of (1) a non-leaking source location and (2) a source location where a controlled emission of 3.1 × 10-5 kg s-1 of methane gas is released over a period of several hours. This series of synthetic data tests and outdoor field observations using a controlled methane release demonstrates the viability of the approach for the detection and sizing of very small leaks of methane across large distances (4+ km2 in synthetic tests). The field tests demonstrate the ability to attribute small atmospheric enhancements of 17 ppb to the emitting source location against a background of combined atmospheric (e.g., background methane variability) and measurement uncertainty of 5 ppb (1σ), when measurements are averaged over 2 min. The results of the synthetic and field data testing show that the new observing system and statistical approach greatly decreases the incidence of false alarms (that is, wrongly identifying a well site to be leaking) compared with the same tests that do not use the NZMB approach and therefore offers increased leak detection and sizing capabilities.
The sources of Antarctic bottom water in a global ice ocean model
NASA Astrophysics Data System (ADS)
Goosse, Hugues; Campin, Jean-Michel; Tartinville, Benoı̂t
Two mechanisms contribute to the formation of Antarctic bottom water (AABW). The first, and probably the most important, is initiated by the brine released on the Antarctic continental shelf during ice formation which is responsible for an increase in salinity. After mixing with ambient water at the shelf break, this salty and dense water sinks along the shelf slope and invades the deepest part of the global ocean. For the second one, the increase of surface water density is due to strong cooling at the ocean-atmosphere interface, together with a contribution from brine release. This induces deep convection and the renewal of deep waters. The relative importance of these two mechanisms is investigated in a global coupled ice-ocean model. Chlorofluorocarbon (CFC) concentrations simulated by the model compare favourably with observations, suggesting a reasonable deep water ventilation in the Southern Ocean, except close to Antarctica where concentrations are too high. Two artificial passive tracers released at surface on the Antarctic continental shelf and in the open-ocean allow to show clearly that the two mechanisms contribute significantly to the renewal of AABW in the model. This indicates that open-ocean convection is overestimated in our simulation. Additional experiments show that the amount of AABW production due to the export of dense shelf waters is quite sensitive to the parameterisation of the effect of downsloping and meso-scale eddies. Nevertheless, shelf waters always contribute significantly to deep water renewal. Besides, increasing the P.R. Gent, J.C. McWilliams [Journal of Physical Oceanography 20 (1990) 150-155] thickness diffusion can nearly suppress the AABW formation by open-ocean convection.
Practical guide: Tools and methodologies for an oil and gas industry emission inventory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, C.C.; Killian, T.L.
1996-12-31
During the preparation of Title V Permit applications, the quantification and speciation of emission sources from oil and gas facilities were reevaluated to determine the {open_quotes}potential-to-emit.{close_quotes} The existing emissions were primarily based on EPA emission factors such as AP-42, for tanks, combustion sources, and fugitive emissions from component leaks. Emissions from insignificant activities and routine operations that are associated with maintenance, startups and shutdowns, and releases to control devices also required quantification. To reconcile EPA emission factors with test data, process knowledge, and manufacturer`s data, a careful review of other estimation options was performed. This paper represents the results ofmore » this analysis of emission sources at oil and gas facilities, including exploration and production, compressor stations and gas plants.« less
The elementary events of Ca2+ release elicited by membrane depolarization in mammalian muscle
Csernoch, L; Zhou, J; Stern, M D; Brum, G; Ríos, E
2004-01-01
Cytosolic [Ca2+] transients elicited by voltage clamp depolarization were examined by confocal line scanning of rat skeletal muscle fibres. Ca2+ sparks were observed in the fibres' membrane-permeabilized ends, but not in responses to voltage in the membrane-intact area. Elementary events of the depolarization-evoked response could be separated either at low voltages (near −50 mV) or at −20mV in partially inactivated cells. These were of lower amplitude, narrower and of much longer duration than sparks, similar to ‘lone embers’ observed in the permeabilized segments. Their average amplitude was 0.19 and spatial half-width 1.3 μm. Other parameters depended on voltage. At −50 mV average duration was 111 ms and latency 185 ms. At −20 mV duration was 203 ms and latency 24 ms. Ca2+ release current, calculated on an average of events, was nearly steady at 0.5–0.6 pA. Accordingly, simulations of the fluorescence event elicited by a subresolution source of 0.5 pA open for 100 ms had morphology similar to the experimental average. Because 0.5 pA is approximately the current measured for single RyR channels in physiological conditions, the elementary fluorescence events in rat muscle probably reflect opening of a single RyR channel. A reconstruction of cell-averaged release flux at −20 mV based on the observed distribution of latencies and calculated elementary release had qualitatively correct but slower kinetics than the release flux in prior whole-cell measurements. The qualitative agreement indicates that global Ca2+ release flux results from summation of these discrete events. The quantitative discrepancies suggest that the partial inactivation strategy may lead to events of greater duration than those occurring physiologically in fully polarized cells. PMID:14990680
McKernan, Kevin Judd
2016-11-01
We sequenced several cannabis genomes in 2011 of June and the first and the longest contigs to emerge were the chloroplast and mitochondrial genomes. Having been a contributor to the Human Genome Project and an eye-witness to the real benefits of immediate data release, I have first hand experience with the potential mal-investment of millions of dollars of tax payer money narrowly averted due to the adopted global rapid data release policy. The policy was vital in reducing duplication of effort and economic waste. As a result, we felt obligated to publish the Cannabis genome data in a similar spirit and placed them immediately on a cloud based Amazon server in August of 2011. While these rapid data release practices were heralded by many in the media, we still find some authors fail to find or reference said work and hope to compel the readership that this omission has more pervasive repercussions than bruised egos and is a regression for our community.
Rhamnogalacturonan-I Based Microcapsules for Targeted Drug Release
Kusic, Anja; De Gobba, Cristian; Larsen, Flemming H.; Sassene, Philip; Zhou, Qi; van de Weert, Marco; Mullertz, Anette; Jørgensen, Bodil; Ulvskov, Peter
2016-01-01
Drug targeting to the colon via the oral administration route for local treatment of e.g. inflammatory bowel disease and colonic cancer has several advantages such as needle-free administration and low infection risk. A new source for delivery is plant-polysaccharide based delivery platforms such as Rhamnogalacturonan-I (RG-I). In the gastro-intestinal tract the RG-I is only degraded by the action of the colonic microflora. For assessment of potential drug delivery properties, RG-I based microcapsules (~1 μm in diameter) were prepared by an interfacial poly-addition reaction. The cross-linked capsules were loaded with a fluorescent dye (model drug). The capsules showed negligible and very little in vitro release when subjected to media simulating gastric and intestinal fluids, respectively. However, upon exposure to a cocktail of commercial RG-I cleaving enzymes, ~ 9 times higher release was observed, demonstrating that the capsules can be opened by enzymatic degradation. The combined results suggest a potential platform for targeted drug delivery in the terminal gastro-intestinal tract. PMID:27992455
NASA Astrophysics Data System (ADS)
Yver-Kwok, C. E.; Müller, D.; Caldow, C.; Lebègue, B.; Mønster, J. G.; Rella, C. W.; Scheutz, C.; Schmidt, M.; Ramonet, M.; Warneke, T.; Broquet, G.; Ciais, P.
2015-03-01
This study presents two methods for estimating methane emissions from a waste water treatment plant (WWTP) along with results from a measurement campaign at a WWTP in Valence, France. These methods, chamber measurements and tracer release, rely on Fourier Transform Infrared (FTIR) spectroscopy and Cavity Ring Down Spectroscopy (CRDS) instruments. We show that the tracer release method is suitable to quantify facility- and some process-scale emissions, while the chamber measurements, provide insight into individual process emissions. Uncertainties for the two methods are described and discussed. Applying the methods to CH4 emissions of the WWTP, we confirm that the open basins are not a major source of CH4 on the WWTP (about 10% of the total emissions), but that the pretreatment and sludge treatment are the main emitters. Overall, the waste water treatment plant represents a small part (about 1.5%) of the methane emissions of the city of Valence and its surroundings, which is lower than the national inventories.
NASA Astrophysics Data System (ADS)
Katata, Genki; Chino, Masamichi; Terada, Hiroaki; Kobayashi, Takuya; Ota, Masakazu; Nagai, Haruyasu; Kajino, Mizuo
2014-05-01
Temporal variations of release amounts of radionuclides during the Fukushima Dai-ichi Nuclear Power Plant (FNPP1) accident and their dispersion process are essential to evaluate the environmental impacts and resultant radiological doses to the public. Here, we estimated a detailed time trend of atmospheric releases during the accident by combining environmental monitoring data and coupling atmospheric and oceanic dispersion simulations by WSPEEDI-II (Worldwide version of System for Prediction of Environmental Emergency Dose Information) and SEA-GEARN developed by the authors. New schemes for wet, dry, and fog depositions of radioactive iodine gas (I2 and CH3I) and other particles (I-131, Te-132, Cs-137, and Cs-134) were incorporated into WSPEEDI-II. The deposition calculated by WSPEEDI-II was used as input data of ocean dispersion calculations by SEA-GEARN. The reverse estimation method based on the simulation by both models assuming unit release rate (1 Bq h-1) was adopted to estimate the source term at the FNPP1 using air dose rate, and air sea surface concentrations. The results suggested that the major release of radionuclides from the FNPP1 occurred in the following periods during March 2011: afternoon on the 12th when the venting and hydrogen explosion occurred at Unit 1, morning on the 13th after the venting event at Unit 3, midnight on the 14th when several openings of SRV (steam relief valve) were conducted at Unit 2, morning and night on the 15th, and morning on the 16th. The modified WSPEEDI-II using the newly estimated source term well reproduced local and regional patterns of air dose rate and surface deposition of I-131 and Cs-137 obtained by airborne observations. Our dispersion simulations also revealed that the highest radioactive contamination areas around FNPP1 were created from 15th to 16th March by complicated interactions among rainfall (wet deposition), plume movements, and phase properties (gas or particle) of I-131 and release rates associated with reactor pressure variations in Units 2 and 3.
NASA Astrophysics Data System (ADS)
Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.
2014-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
Wide-Open: Accelerating public data release by automating detection of overdue datasets
Poon, Hoifung; Howe, Bill
2017-01-01
Open data is a vital pillar of open science and a key enabler for reproducibility, data reuse, and novel discoveries. Enforcement of open-data policies, however, largely relies on manual efforts, which invariably lag behind the increasingly automated generation of biological data. To address this problem, we developed a general approach to automatically identify datasets overdue for public release by applying text mining to identify dataset references in published articles and parse query results from repositories to determine if the datasets remain private. We demonstrate the effectiveness of this approach on 2 popular National Center for Biotechnology Information (NCBI) repositories: Gene Expression Omnibus (GEO) and Sequence Read Archive (SRA). Our Wide-Open system identified a large number of overdue datasets, which spurred administrators to respond directly by releasing 400 datasets in one week. PMID:28594819
Wide-Open: Accelerating public data release by automating detection of overdue datasets.
Grechkin, Maxim; Poon, Hoifung; Howe, Bill
2017-06-01
Open data is a vital pillar of open science and a key enabler for reproducibility, data reuse, and novel discoveries. Enforcement of open-data policies, however, largely relies on manual efforts, which invariably lag behind the increasingly automated generation of biological data. To address this problem, we developed a general approach to automatically identify datasets overdue for public release by applying text mining to identify dataset references in published articles and parse query results from repositories to determine if the datasets remain private. We demonstrate the effectiveness of this approach on 2 popular National Center for Biotechnology Information (NCBI) repositories: Gene Expression Omnibus (GEO) and Sequence Read Archive (SRA). Our Wide-Open system identified a large number of overdue datasets, which spurred administrators to respond directly by releasing 400 datasets in one week.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan
A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source tomore » enable use by others.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan
A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source tomore » enable use by others.« less
Goloborodko, Anton A; Levitsky, Lev I; Ivanov, Mark V; Gorshkov, Mikhail V
2013-02-01
Pyteomics is a cross-platform, open-source Python library providing a rich set of tools for MS-based proteomics. It provides modules for reading LC-MS/MS data, search engine output, protein sequence databases, theoretical prediction of retention times, electrochemical properties of polypeptides, mass and m/z calculations, and sequence parsing. Pyteomics is available under Apache license; release versions are available at the Python Package Index http://pypi.python.org/pyteomics, the source code repository at http://hg.theorchromo.ru/pyteomics, documentation at http://packages.python.org/pyteomics. Pyteomics.biolccc documentation is available at http://packages.python.org/pyteomics.biolccc/. Questions on installation and usage can be addressed to pyteomics mailing list: pyteomics@googlegroups.com.
MSAViewer: interactive JavaScript visualization of multiple sequence alignments.
Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E; Rost, Burkhard; Goldberg, Tatyana
2016-11-15
The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is 'web ready': written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/Supplementary information: Supplementary data are available at Bioinformatics online. msa@bio.sh. © The Author 2016. Published by Oxford University Press.
ProtVista: visualization of protein sequence annotations.
Watkins, Xavier; Garcia, Leyla J; Pundir, Sangya; Martin, Maria J
2017-07-01
ProtVista is a comprehensive visualization tool for the graphical representation of protein sequence features in the UniProt Knowledgebase, experimental proteomics and variation public datasets. The complexity and relationships in this wealth of data pose a challenge in interpretation. Integrative visualization approaches such as provided by ProtVista are thus essential for researchers to understand the data and, for instance, discover patterns affecting function and disease associations. ProtVista is a JavaScript component released as an open source project under the Apache 2 License. Documentation and source code are available at http://ebi-uniprot.github.io/ProtVista/ . martin@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
MSAViewer: interactive JavaScript visualization of multiple sequence alignments
Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E.; Rost, Burkhard; Goldberg, Tatyana
2016-01-01
Summary: The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is ‘web ready’: written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. Availability and Implementation: The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: msa@bio.sh PMID:27412096
Mantle to surface degassing of alkalic magmas at Erebus volcano, Antarctica
Oppenheimer, C.; Moretti, R.; Kyle, P.R.; Eschenbacher, A.; Lowenstern, J. B.; Hervig, R.L.; Dunbar, N.W.
2011-01-01
Continental intraplate volcanoes, such as Erebus volcano, Antarctica, are associated with extensional tectonics, mantle upwelling and high heat flow. Typically, erupted magmas are alkaline and rich in volatiles (especially CO2), inherited from low degrees of partial melting of mantle sources. We examine the degassing of the magmatic system at Erebus volcano using melt inclusion data and high temporal resolution open-path Fourier transform infrared (FTIR) spectroscopic measurements of gas emissions from the active lava lake. Remarkably different gas signatures are associated with passive and explosive gas emissions, representative of volatile contents and redox conditions that reveal contrasting shallow and deep degassing sources. We show that this unexpected degassing signature provides a unique probe for magma differentiation and transfer of CO2-rich oxidised fluids from the mantle to the surface, and evaluate how these processes operate in time and space. Extensive crystallisation driven by CO2 fluxing is responsible for isobaric fractionation of parental basanite magmas close to their source depth. Magma deeper than 4kbar equilibrates under vapour-buffered conditions. At shallower depths, CO2-rich fluids accumulate and are then released either via convection-driven, open-system gas loss or as closed-system slugs that ascend and result in Strombolian eruptions in the lava lake. The open-system gases have a reduced state (below the QFM buffer) whereas the closed-system gases preserve their deep oxidised signatures (close to the NNO buffer). ?? 2011 Elsevier B.V.
Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets.
Clark, Alex M; Dole, Krishna; Coulon-Spektor, Anna; McNutt, Andrew; Grass, George; Freundlich, Joel S; Reynolds, Robert C; Ekins, Sean
2015-06-22
On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operator curve values comparable to those generated previously in prior publications using alternative tools. We have now described how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user's own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting these models in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery.
Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets
2015-01-01
On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operator curve values comparable to those generated previously in prior publications using alternative tools. We have now described how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user’s own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting these models in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery. PMID:25994950
quanTLC, an online open-source solution for videodensitometric quantification.
Fichou, Dimitri; Morlock, Gertrud E
2018-07-27
The image is the key feature of planar chromatography. Videodensitometry by digital image conversion is the fastest way of its evaluation. Instead of scanning single sample tracks one after the other, only few clicks are needed to convert all tracks at one go. A minimalistic software was newly developed, termed quanTLC, that allowed the quantitative evaluation of samples in few minutes. quanTLC includes important assets such as open-source, online, free of charge, intuitive to use and tailored to planar chromatography, as none of the nine existent software for image evaluation covered these aspects altogether. quanTLC supports common image file formats for chromatogram upload. All necessary steps were included, i.e., videodensitogram extraction, preprocessing, automatic peak integration, calibration, statistical data analysis, reporting and data export. The default options for each step are suitable for most analyses while still being tunable, if needed. A one-minute video was recorded to serve as user manual. The software capabilities are shown on the example of a lipophilic dye mixture separation. The quantitative results were verified by comparison with those obtained by commercial videodensitometry software and opto-mechanical slit-scanning densitometry. The data can be exported at each step to be processed in further software, if required. The code was released open-source to be exploited even further. The software itself is online useable without installation and directly accessible at http://shinyapps.ernaehrung.uni-giessen.de/quanTLC. Copyright © 2018 Elsevier B.V. All rights reserved.
OSCAR4: a flexible architecture for chemical text-mining.
Jessop, David M; Adams, Sam E; Willighagen, Egon L; Hawizy, Lezan; Murray-Rust, Peter
2011-10-14
The Open-Source Chemistry Analysis Routines (OSCAR) software, a toolkit for the recognition of named entities and data in chemistry publications, has been developed since 2002. Recent work has resulted in the separation of the core OSCAR functionality and its release as the OSCAR4 library. This library features a modular API (based on reduction of surface coupling) that permits client programmers to easily incorporate it into external applications. OSCAR4 offers a domain-independent architecture upon which chemistry specific text-mining tools can be built, and its development and usage are discussed.
pyam: Python Implementation of YaM
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan
2012-01-01
pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code-base is large. One sidebenefit is the enforcement of a strong module-level encapsulation of a module s functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.
Pierce, Todd P; Issa, Kimona; Gilbert, Benjamin T; Hanly, Brian; Festa, Anthony; McInerney, Vincent K; Scillia, Anthony J
2017-06-01
To compare complications, function, pain, and patient satisfaction after conventional open, percutaneous, or arthroscopic release of the extensor origin for the treatment of lateral epicondylitis. A thorough review of 4 databases-PubMed, EBSCOhost, CINAHL (Cumulative Index to Nursing and Allied Health Literature) Plus, and Scopus-was performed to identify all studies that addressed surgical management of lateral epicondylitis. We included (1) studies published between 2000 and 2015 and (2) studies with clearly defined surgical techniques. We excluded (1) non-English-language manuscripts, (2) isolated case reports, (3) studies with fewer than 10 subjects, (4) animal studies, (5) studies with additional adjunctive procedures aside from release of the extensor origin, (6) clinical or systematic review manuscripts, (7) studies with a follow-up period of 6 months or less, and (8) studies in which less than 80% of patients completed follow-up. Each study was analyzed for complication rates, functional outcomes, pain, and patient satisfaction. Thirty reports were identified that included 848 open, 578 arthroscopic, and 178 percutaneous releases. Patients within each release group had a similar age (46 years vs 46 years vs 48 years; P = .9 and P = .4, respectively), whereas there was a longer follow-up time in patients who underwent surgery by an open technique (49.4 months vs 42.6 months vs 23 months, P < .001). There were no differences in complication rates among these techniques (3.8% vs 2.9% vs 3.9%; P = .5 and P = .9, respectively). However, open techniques were correlated with higher surgical-site infection rates than arthroscopic techniques (0.7% vs 0%, P = .04). Mean Disabilities of the Arm, Shoulder and Hand scores were substantially better with both open and arthroscopic techniques than with percutaneous release (19.9 points vs 21.3 points vs 29 points, P < .001). In addition, there was less pain reported in the arthroscopic and percutaneous release groups as opposed to their open counterparts (1.9 points vs 1.4 points vs 1.3 points, P < .0001). There were no differences among the techniques in patient satisfaction rate (93.7% vs 89% vs 88%; P = .08 and P = .07, respectively). Functional outcomes of open and arthroscopic releases may be superior to those of percutaneous release. In addition, patients may report less pain with arthroscopic and percutaneous techniques. Although the risk of complications is similar regardless of technique, patients may be counseled that their risk of infectious complications may be slightly higher with open releases. However, it is important to note that this statistical difference may not necessarily portend noticeable clinical differences. Level IV, systematic review of Level III and IV evidence. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wittmann, H.; von Blanckenburg, F.; Mohtadi, M.; Christl, M.; Bernhardt, A.
2017-12-01
Meteoric 10Be to stable 9Be ratios combine a cosmogenic nuclide produced in the atmosphere at a rate known from reconstructions of magnetic field strength with a stable isotope that records the present and past continental weathering and erosion flux. In seawater, the 10Be/9Be ratio provides important information on metal release from bottom sediments, called boundary exchange, and the oceanic mixing of reactive trace metals due to the inherently different sources of the two isotopes. When measured in the authigenic phase of marine sediments, the 10Be/9Be ratio allows deriving the feedbacks between erosion, weathering, and climate in the geologic past. At an ocean margin site 37°S offshore Chile, we use the 10Be/9Be ratio to trace changes in terrestrial particulate composition due to exchange with seawater. We analyzed the reactive (sequentially extracted) phase of marine surface sediments along a coast-perpendicular transect, and compared to samples from their riverine source. We find evidence for growth of authigenic rims through co-precipitation, not via reversible adsorption, that incorporate an open ocean 10Be/9Be signature from a deep water source only 30 km from the coast, thereby overprinting terrestrial riverine 10Be/9Be signatures. We show that the measured 10Be/9Be ratios in marine sediments comprise a mixture between seawater-derived and riverine-sourced phases. As 10Be/9Be ratios increase due to exchange with seawater, particulate-bound Fe concentrations increase, which we attribute to release of Fe-rich pore waters during boundary exchange in the sediment. The implications for the use of 10Be/9Be in sedimentary records for paleo-denudation flux reconstructions are that in coast-proximal sites that are neither affected by deeper water nor by narrow boundary currents, the authigenic record will be a direct recorder of terrigenous denudation of the adjacent river catchments. Hence archive location and past oceanic circulation have to be accounted for when reconstructing continental erosion and weathering, and only at open ocean sites that are fully reset by seawater global signals can be reconstructed.
Climate Signals: An On-Line Digital Platform for Mapping Climate Change Impacts in Real Time
NASA Astrophysics Data System (ADS)
Cutting, H.
2016-12-01
Climate Signals is an on-line digital platform for cataloging and mapping the impacts of climate change. The CS platform specifies and details the chains of connections between greenhouse gas emissions and individual climate events. Currently in open-beta release, the platform is designed to to engage and serve the general public, news media, and policy-makers, particularly in real-time during extreme climate events. Climate Signals consists of a curated relational database of events and their links to climate change, a mapping engine, and a gallery of climate change monitors offering real-time data. For each event in the database, an infographic engine provides a custom attribution "tree" that illustrates the connections to climate change. In addition, links to key contextual resources are aggregated and curated for each event. All event records are fully annotated with detailed source citations and corresponding hyper links. The system of attribution used to link events to climate change in real-time is detailed here. This open-beta release is offered for public user testing and engagement. Launched in May 2016, the operation of this platform offers lessons for public engagement in climate change impacts.
BioFVM: an efficient, parallelized diffusive transport solver for 3-D biological simulations
Ghaffarizadeh, Ahmadreza; Friedman, Samuel H.; Macklin, Paul
2016-01-01
Motivation: Computational models of multicellular systems require solving systems of PDEs for release, uptake, decay and diffusion of multiple substrates in 3D, particularly when incorporating the impact of drugs, growth substrates and signaling factors on cell receptors and subcellular systems biology. Results: We introduce BioFVM, a diffusive transport solver tailored to biological problems. BioFVM can simulate release and uptake of many substrates by cell and bulk sources, diffusion and decay in large 3D domains. It has been parallelized with OpenMP, allowing efficient simulations on desktop workstations or single supercomputer nodes. The code is stable even for large time steps, with linear computational cost scalings. Solutions are first-order accurate in time and second-order accurate in space. The code can be run by itself or as part of a larger simulator. Availability and implementation: BioFVM is written in C ++ with parallelization in OpenMP. It is maintained and available for download at http://BioFVM.MathCancer.org and http://BioFVM.sf.net under the Apache License (v2.0). Contact: paul.macklin@usc.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26656933
Coupled Physics Environment (CouPE) library - Design, Implementation, and Release
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay S.
Over several years, high fidelity, validated mono-physics solvers with proven scalability on peta-scale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a unified mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. In this report, we present details on the design decisions and developments on CouPE, an acronym that stands for Coupled Physics Environment that orchestrates a coupled physics solver through the interfaces exposed by MOAB array-based unstructured mesh, both of which are part of SIGMA (Scalable Interfaces for Geometry and Mesh-Based Applications) toolkit.more » The SIGMA toolkit contains libraries that enable scalable geometry and unstructured mesh creation and handling in a memory and computationally efficient implementation. The CouPE version being prepared for a full open-source release along with updated documentation will contain several useful examples that will enable users to start developing their applications natively using the native MOAB mesh and couple their models to existing physics applications to analyze and solve real world problems of interest. An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is also being investigated as part of the NEAMS RPL, to tightly couple neutron transport, thermal-hydraulics and structural mechanics physics under the SHARP framework. This report summarizes the efforts that have been invested in CouPE to bring together several existing physics applications namely PROTEUS (neutron transport code), Nek5000 (computational fluid-dynamics code) and Diablo (structural mechanics code). The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The design of CouPE along with motivations that led to implementation choices are also discussed. The first release of the library will be different from the current version of the code that integrates the components in SHARP and explanation on the need for forking the source base will also be provided. Enhancements in the functionality and improved user guides will be available as part of the release. CouPE v0.1 is scheduled for an open-source release in December 2014 along with SIGMA v1.1 components that provide support for language-agnostic mesh loading, traversal and query interfaces along with scalable solution transfer of fields between different physics codes. The coupling methodology and software interfaces of the library are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the CouPE library.« less
Wenig, Philip; Odermatt, Juergen
2010-07-30
Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which needs to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between the analytical results. To challenge this situation a number of either commercial or non-profit software applications have been developed. These applications provide functionalities to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handle chromatographic data files. The approach can be extended in its functionality such as facilities to detect baselines, to detect, integrate and identify peaks and to compare mass spectra, as well as the ability to internationalize the application. Additionally, filters can be applied on the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations like do, undo and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms. It offers a customizable graphical user interface. The software is independent of the operating system, due to the fact that the Rich Client Platform is written in Java. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions. They can be published using open source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.
Lessons Learned through the Development and Publication of AstroImageJ
NASA Astrophysics Data System (ADS)
Collins, Karen
2018-01-01
As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.
NASA Astrophysics Data System (ADS)
Barlow, P. M.; Filali-Meknassi, Y.; Sanford, W. E.; Winston, R. B.; Kuniansky, E.; Dawson, C.
2015-12-01
UNESCO's HOPE Initiative—the Hydro Free and (or) Open-source Platform of Experts—was launched in June 2013 as part of UNESCO's International Hydrological Programme. The Initiative arose in response to a recognized need to make free and (or) open-source water-resources software more widely accessible to Africa's water sector. A kit of software is being developed to provide African water authorities, teachers, university lecturers, and researchers with a set of programs that can be enhanced and (or) applied to the development of efficient and sustainable management strategies for Africa's water resources. The Initiative brings together experts from the many fields of water resources to identify software that might be included in the kit, to oversee an objective process for selecting software for the kit, and to engage in training and other modes of capacity building to enhance dissemination of the software. To date, teams of experts from the fields of wastewater treatment, groundwater hydrology, surface-water hydrology, and data management have been formed to identify relevant software from their respective fields. An initial version of the HOPE Software Kit was released in late August 2014 and consists of the STOAT model for wastewater treatment developed by the Water Research Center (United Kingdom) and the MODFLOW-2005 model for groundwater-flow simulation developed by the U.S. Geological Survey. The Kit is available on the UNESCO HOPE website (http://www.hope-initiative.net/).Training in the theory and use of MODFLOW-2005 is planned in southern Africa in conjunction with UNESCO's study of the Kalahari-Karoo/Stampriet Transboundary Aquifer, which extends over an area that includes parts of Botswana, Namibia, and South Africa, and in support of the European Commission's Horizon 2020 FREEWAT project (FREE and open source software tools for WATer resource management; see the UNESCO HOPE website).
Yu, Zhanyang; Liu, Ning; Li, Yadan; Xu, Jianfeng; Wang, Xiaoying
2013-08-01
Neuroglobin (Ngb) is an endogenous neuroprotective molecule against hypoxic/ischemic brain injury, but the underlying mechanisms remain largely undefined. Our recent study revealed that Ngb can bind to voltage-dependent anion channel (VDAC), a regulator of mitochondria permeability transition (MPT). In this study we examined the role of Ngb in MPT pore (mPTP) opening following oxygen-glucose deprivation (OGD) in primary cultured mouse cortical neurons. Co-immunoprecipitation (Co-IP) and immunocytochemistry showed that the binding between Ngb and VDAC was increased after OGD compared to normoxia, indicating the OGD-enhanced Ngb-VDAC interaction. Ngb overexpression protected primary mouse cortical neurons from OGD-induced neuronal death, to an extent comparable to mPTP opening inhibitor, cyclosporine A (CsA) pretreatment. We further measured the role of Ngb in OGD-induced mPTP opening using Ngb overexpression and knockdown approaches in primary cultured neurons, and recombinant Ngb exposure to isolated mitochondria. Same as CsA pretreatment, Ngb overexpression significantly reduced OGD-induced mPTP opening markers including mitochondria swelling, mitochondrial NAD(+) release, and cytochrome c (Cyt c) release in primary cultured neurons. Recombinant Ngb incubation significantly reduced OGD-induced NAD(+) release and Cyt c release from isolated mitochondria. In contrast, Ngb knockdown significantly increased OGD-induced neuron death, and increased OGD-induced mitochondrial NAD(+) release and Cyt c release as well, and these outcomes could be rescued by CsA pretreatment. In summary, our results demonstrated that Ngb overexpression can inhibit OGD-induced mPTP opening in primary cultured mouse cortical neurons, which may be one of the molecular mechanisms of Ngb's neuroprotection. Copyright © 2013 Elsevier Inc. All rights reserved.
Helium as a tracer for fluids released from Juan de Fuca lithosphere beneath the Cascadia forearc
McCrory, Patricia A.; Constantz, James E.; Hunt, Andrew G.; Blair, James Luke
2016-01-01
The ratio between helium isotopes (3He/4He) provides an excellent geochemical tracer for investigating the sources of fluids sampled at the Earth's surface. 3He/4He values observed in 25 mineral springs and wells above the Cascadia forearc document a significant component of mantle-derived helium above Juan de Fuca lithosphere, as well as variability in 3He enrichment across the forearc. Sample sites arcward of the forearc mantle corner (FMC) generally yield significantly higher ratios (1.2-4.0 RA) than those seaward of the corner (0.03-0.7 RA). The highest ratios in the Cascadia forearc coincide with slab depths (40-45 km) where metamorphic dehydration of young oceanic lithosphere is expected to release significant fluid and where tectonic tremor occurs, whereas little fluid is expected to be released from the slab depths (25-30 km) beneath sites seaward of the corner.Tremor (considered a marker for high fluid pressure) and high RA values in the forearc are spatially correlated. The Cascadia tremor band is centered on its FMC, and we tentatively postulate that hydrated forearc mantle beneath Cascadia deflects a significant portion of slab-derived fluids updip along the subduction interface, to vent in the vicinity of its corner. Furthermore, high RA values within the tremor band just arcward of the FMC, suggest that the innermost mantle wedge is relatively permeable.Conceptual models require: (1) a deep fluid source as a medium to transport primordial 3He; (2) conduits through the lithosphere which serve to speed fluid ascent to the surface before significant dilution from radiogenic 4He can occur; and (3) near lithostatic fluid pressure to keep conduits open. Our spatial correlation between high RA values and tectonic tremor provides independent evidence that tremor is associated with deep fluids, and it further suggests that high pore pressures associated with tremor may serve to keep fractures open for 3He migration through ductile upper mantle and lower crust.
PlasmaPy: initial development of a Python package for plasma physics
NASA Astrophysics Data System (ADS)
Murphy, Nicholas; Leonard, Andrew J.; Stańczak, Dominik; Haggerty, Colby C.; Parashar, Tulasi N.; Huang, Yu-Min; PlasmaPy Community
2017-10-01
We report on initial development of PlasmaPy: an open source community-driven Python package for plasma physics. PlasmaPy seeks to provide core functionality that is needed for the formation of a fully open source Python ecosystem for plasma physics. PlasmaPy prioritizes code readability, consistency, and maintainability while using best practices for scientific computing such as version control, continuous integration testing, embedding documentation in code, and code review. We discuss our current and planned capabilities, including features presently under development. The development roadmap includes features such as fluid and particle simulation capabilities, a Grad-Shafranov solver, a dispersion relation solver, atomic data retrieval methods, and tools to analyze simulations and experiments. We describe several ways to contribute to PlasmaPy. PlasmaPy has a code of conduct and is being developed under a BSD license, with a version 0.1 release planned for 2018. The success of PlasmaPy depends on active community involvement, so anyone interested in contributing to this project should contact the authors. This work was partially supported by the U.S. Department of Energy.
2MASS Catalog Server Kit Version 2.1
NASA Astrophysics Data System (ADS)
Yamauchi, C.
2013-10-01
The 2MASS Catalog Server Kit is open source software for use in easily constructing a high performance search server for important astronomical catalogs. This software utilizes the open source RDBMS PostgreSQL, therefore, any users can setup the database on their local computers by following step-by-step installation guide. The kit provides highly optimized stored functions for positional searchs similar to SDSS SkyServer. Together with these, the powerful SQL environment of PostgreSQL will meet various user's demands. We released 2MASS Catalog Server Kit version 2.1 in 2012 May, which supports the latest WISE All-Sky catalog (563,921,584 rows) and 9 major all-sky catalogs. Local databases are often indispensable for observatories with unstable or narrow-band networks or severe use, such as retrieving large numbers of records within a small period of time. This software is the best for such purposes, and increasing supported catalogs and improvements of version 2.1 can cover a wider range of applications including advanced calibration system, scientific studies using complicated SQL queries, etc. Official page: http://www.ir.isas.jaxa.jp/~cyamauch/2masskit/
Structure of the skeletal muscle calcium release channel activated with Ca2+ and AMP-PCP.
Serysheva, I I; Schatz, M; van Heel, M; Chiu, W; Hamilton, S L
1999-01-01
The functional state of the skeletal muscle Ca2+ release channel is modulated by a number of endogenous molecules during excitation-contraction. Using electron cryomicroscopy and angular reconstitution techniques, we determined the three-dimensional (3D) structure of the skeletal muscle Ca2+ release channel activated by a nonhydrolyzable analog of ATP in the presence of Ca2+. These ligands together produce almost maximum activation of the channel and drive the channel population toward a predominately open state. The resulting 30-A 3D reconstruction reveals long-range conformational changes in the cytoplasmic region that might affect the interaction of the Ca2+ release channel with the t-tubule voltage sensor. In addition, a central opening and mass movements, detected in the transmembrane domain of both the Ca(2+)- and the Ca2+/nucleotide-activated channels, suggest a mechanism for channel opening similar to opening-closing of the iris in a camera diaphragm. PMID:10512814
National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents
NASA Astrophysics Data System (ADS)
Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.
2014-12-01
The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from 60+ data providers in all 50 states. Free and open source software is used in this federated system where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and applications discovery and availability of the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric and library style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and USGIN site), and user-created tools and scripts. The user-friendly map-centric web-based application has been created to support finding, visualizing, mapping, and acquisition of data based on topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third party software applications such as GoogleEarth, UDig, and ArcGIS. A redistributable, free and open source software package called GINstack (USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions among others.
Youpi: YOUr processing PIpeline
NASA Astrophysics Data System (ADS)
Monnerville, Mathias; Sémah, Gregory
2012-03-01
Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.
Chaste: An Open Source C++ Library for Computational Physiology and Biology
Mirams, Gary R.; Arthurs, Christopher J.; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G.; Harvey, Daniel G.; Marsh, Megan E.; Osborne, James M.; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J.
2013-01-01
Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352
MetaboLights: An Open-Access Database Repository for Metabolomics Data.
Kale, Namrata S; Haug, Kenneth; Conesa, Pablo; Jayseelan, Kalaivani; Moreno, Pablo; Rocca-Serra, Philippe; Nainala, Venkata Chandrasekhar; Spicer, Rachel A; Williams, Mark; Li, Xuefei; Salek, Reza M; Griffin, Julian L; Steinbeck, Christoph
2016-03-24
MetaboLights is the first general purpose, open-access database repository for cross-platform and cross-species metabolomics research at the European Bioinformatics Institute (EMBL-EBI). Based upon the open-source ISA framework, MetaboLights provides Metabolomics Standard Initiative (MSI) compliant metadata and raw experimental data associated with metabolomics experiments. Users can upload their study datasets into the MetaboLights Repository. These studies are then automatically assigned a stable and unique identifier (e.g., MTBLS1) that can be used for publication reference. The MetaboLights Reference Layer associates metabolites with metabolomics studies in the archive and is extensively annotated with data fields such as structural and chemical information, NMR and MS spectra, target species, metabolic pathways, and reactions. The database is manually curated with no specific release schedules. MetaboLights is also recommended by journals for metabolomics data deposition. This unit provides a guide to using MetaboLights, downloading experimental data, and depositing metabolomics datasets using user-friendly submission tools. Copyright © 2016 John Wiley & Sons, Inc.
OpenDrop: An Integrated Do-It-Yourself Platform for Personal Use of Biochips
Alistar, Mirela; Gaudenz, Urs
2017-01-01
Biochips, or digital labs-on-chip, are developed with the purpose of being used by laboratory technicians or biologists in laboratories or clinics. In this article, we expand this vision with the goal of enabling everyone, regardless of their expertise, to use biochips for their own personal purposes. We developed OpenDrop, an integrated electromicrofluidic platform that allows users to develop and program their own bio-applications. We address the main challenges that users may encounter: accessibility, bio-protocol design and interaction with microfluidics. OpenDrop consists of a do-it-yourself biochip, an automated software tool with visual interface and a detailed technique for at-home operations of microfluidics. We report on two years of use of OpenDrop, released as an open-source platform. Our platform attracted a highly diverse user base with participants originating from maker communities, academia and industry. Our findings show that 47% of attempts to replicate OpenDrop were successful, the main challenge remaining the assembly of the device. In terms of usability, the users managed to operate their platforms at home and are working on designing their own bio-applications. Our work provides a step towards a future in which everyone will be able to create microfluidic devices for their personal applications, thereby democratizing parts of health care. PMID:28952524
Nind, Thomas; Galloway, James; McAllister, Gordon; Scobbie, Donald; Bonney, Wilfred; Hall, Christopher; Tramma, Leandro; Reel, Parminder; Groves, Martin; Appleby, Philip; Doney, Alex; Guthrie, Bruce; Jefferson, Emily
2018-05-22
The Health Informatics Centre (HIC) at the University of Dundee provides a service to securely host clinical datasets and extract relevant data for anonymised cohorts to researchers to enable them to answer key research questions. As is common in research using routine healthcare data, the service was historically delivered using ad-hoc processes resulting in the slow provision of data whose provenance was often hidden to the researchers using it. This paper describes the development and evaluation of the Research Data Management Platform (RDMP): an open source tool to load, manage, clean, and curate longitudinal healthcare data for research and provide reproducible and updateable datasets for defined cohorts to researchers. Between 2013 and 2017, RDMP tool implementation tripled the productivity of Data Analysts producing data releases for researchers from 7.1 to 25.3 per month; and reduced the error rate from 12.7% to 3.1%. The effort on data management reduced from a mean of 24.6 to 3.0 hours per data release. The waiting time for researchers to receive data after agreeing a specification reduced from approximately 6 months to less than one week. The software is scalable and currently manages 163 datasets. 1,321 data extracts for research have been produced with the largest extract linking data from 70 different datasets. The tools and processes that encompass the RDMP not only fulfil the research data management requirements of researchers but also support the seamless collaboration of data cleaning, data transformation, data summarisation and data quality assessment activities by different research groups.
Weaver, Steven; Shank, Stephen D; Spielman, Stephanie J; Li, Michael; Muse, Spencer V; Kosakovsky Pond, Sergei L
2018-01-02
Inference of how evolutionary forces have shaped extant genetic diversity is a cornerstone of modern comparative sequence analysis. Advances in sequence generation and increased statistical sophistication of relevant methods now allow researchers to extract ever more evolutionary signal from the data, albeit at an increased computational cost. Here, we announce the release of Datamonkey 2.0, a completely re-engineered version of the Datamonkey web-server for analyzing evolutionary signatures in sequence data. For this endeavor, we leveraged recent developments in open-source libraries that facilitate interactive, robust, and scalable web application development. Datamonkey 2.0 provides a carefully curated collection of methods for interrogating coding-sequence alignments for imprints of natural selection, packaged as a responsive (i.e. can be viewed on tablet and mobile devices), fully interactive, and API-enabled web application. To complement Datamonkey 2.0, we additionally release HyPhy Vision, an accompanying JavaScript application for visualizing analysis results. HyPhy Vision can also be used separately from Datamonkey 2.0 to visualize locally-executed HyPhy analyses. Together, Datamonkey 2.0 and HyPhy Vision showcase how scientific software development can benefit from general-purpose open-source frameworks. Datamonkey 2.0 is freely and publicly available at http://www.datamonkey. org, and the underlying codebase is available from https://github.com/veg/datamonkey-js. © The Author 2018. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
USDA-ARS?s Scientific Manuscript database
Augmentation biocontrol is a commercially viable pest management tactic in enclosed glasshouse environments, but is far less effective in open-field agriculture where newly released enemies rapidly disperse from release sites. We tested the potential for behavior-modifying semiochemicals to increase...
A Versatile and Reproducible Multi-Frequency Electrical Impedance Tomography System
Avery, James; Dowrick, Thomas; Faulkner, Mayo; Goren, Nir; Holder, David
2017-01-01
A highly versatile Electrical Impedance Tomography (EIT) system, nicknamed the ScouseTom, has been developed. The system allows control over current amplitude, frequency, number of electrodes, injection protocol and data processing. Current is injected using a Keithley 6221 current source, and voltages are recorded with a 24-bit EEG system with minimum bandwidth of 3.2 kHz. Custom PCBs interface with a PC to control the measurement process, electrode addressing and triggering of external stimuli. The performance of the system was characterised using resistor phantoms to represent human scalp recordings, with an SNR of 77.5 dB, stable across a four hour recording and 20 Hz to 20 kHz. In studies of both haeomorrhage using scalp electrodes, and evoked activity using epicortical electrode mats in rats, it was possible to reconstruct images matching established literature at known areas of onset. Data collected using scalp electrode in humans matched known tissue impedance spectra and was stable over frequency. The experimental procedure is software controlled and is readily adaptable to new paradigms. Where possible, commercial or open-source components were used, to minimise the complexity in reproduction. The hardware designs and software for the system have been released under an open source licence, encouraging contributions and allowing for rapid replication. PMID:28146122
Metric Evaluation Pipeline for 3d Modeling of Urban Scenes
NASA Astrophysics Data System (ADS)
Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.
2017-05-01
Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state of the art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software is made publicly available to enable further research and planned benchmarking activities.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and... CATEGORIES National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and...
Pore opening dynamics in the exocytosis of serotonin
NASA Astrophysics Data System (ADS)
Ramirez-Santiago, Guillermo; Cercos, Montserrat G.; Martinez-Valencia, Alejandro; Salinas Hernandez, Israel; Rodríguez-Sosa, Leonardo; de-Miguel, Francisco F.
2015-03-01
The current view of the exocytosis of transmitter molecules is that it starts with the formation of a fusion pore that connects the intravesicular and the extracellular spaces, and is completed by the release of the rest of the transmitter contained in the vesicle upon the full fusion and collapse of the vesicle with the plasma membrane. However, under certain circumstances, a rapid closure of the pore before the full vesicle fusion produces only a partial release of the transmitter. Here we show that whole release of the transmitter occurs through fusion pores that remain opened for tens of milliseconds without vesicle collapse. This was demonstrated through amperometric measurements of serotonin release from electrodense vesicles in the axon of leech Retzius neurons and mathematical modelling. By modeling transmitter release with a diffusion equation subjected to boundary conditions that are defined by the experiment, we showed that those pores with a fast half rise time constant remained opened and allowed the full quantum release without vesicle collapse, whereas pores with a slow rise time constant closed rapidly, thus producing partial release. We conclude that a full transmitter release may occur through the fusion pore in the absence of vesicle collapse. This work was founded by a DGAPA-UNAM grants IN200914 and IN118410 CONACYT GRANT 130031, and CONACyT doctoral fellowships.
Transient behavior of a flare-associated solar wind. I - Gas dynamics in a radial open field region
NASA Technical Reports Server (NTRS)
Nagai, F.
1984-01-01
A numerical investigation is conducted into the way in which a solar wind model initially satisfying both steady state and energy balance conditions is disturbed and deformed, under the assumption of heating that correspoonds to the energy release of solar flares of an importance value of approximately 1 which occur in radial open field regions. Flare-associated solar wind transient behavior is modeled for 1-8 solar radii. The coronal temperature around the heat source region rises, and a large thermal conductive flux flows inward to the chromosphere and outward to interplanetary space along field lines. The speed of the front of expanding chromospheric material generated by the impingement of the conduction front on the upper chromosphere exceeds the local sound velocity in a few minutes and eventually exceeds 100 million cm/sec.
Ranson, Matthew; Cox, Brendan; Keenan, Cheryl; Teitelbaum, Daniel
2015-11-03
Between 1991 and 2012, the facilities that reported to the U.S. Environmental Protection Agency's Toxic Release Inventory (TRI) Program conducted 370,000 source reduction projects. We use this data set to conduct the first quasi-experimental retrospective evaluation of how implementing a source reduction (pollution prevention) project affects the quantity of toxic chemicals released to the environment by an average industrial facility. We use a differences-in-differences methodology, which measures how implementing a source reduction project affects a facility's releases of targeted chemicals, relative to releases of (a) other untargeted chemicals from the same facility, or (b) the same chemical from other facilities in the same industry. We find that the average source reduction project causes a 9-16% decrease in releases of targeted chemicals in the year of implementation. Source reduction techniques vary in effectiveness: for example, raw material modification causes a large decrease in releases, while inventory control has no detectable effect. Our analysis suggests that in aggregate, the source reduction projects carried out in the U.S. since 1991 have prevented between 5 and 14 billion pounds of toxic releases.
One-dimensional statistical parametric mapping in Python.
Pataky, Todd C
2012-01-01
Statistical parametric mapping (SPM) is a topological methodology for detecting field changes in smooth n-dimensional continua. Many classes of biomechanical data are smooth and contained within discrete bounds and as such are well suited to SPM analyses. The current paper accompanies release of 'SPM1D', a free and open-source Python package for conducting SPM analyses on a set of registered 1D curves. Three example applications are presented: (i) kinematics, (ii) ground reaction forces and (iii) contact pressure distribution in probabilistic finite element modelling. In addition to offering a high-level interface to a variety of common statistical tests like t tests, regression and ANOVA, SPM1D also emphasises fundamental concepts of SPM theory through stand-alone example scripts. Source code and documentation are available at: www.tpataky.net/spm1d/.
Szentesi, Péter; Szappanos, Henrietta; Szegedi, Csaba; Gönczi, Monika; Jona, István; Cseri, Julianna; Kovács, László; Csernoch, László
2004-03-01
The effects of thymol on steps of excitation-contraction coupling were studied on fast-twitch muscles of rodents. Thymol was found to increase the depolarization-induced release of calcium from the sarcoplasmic reticulum, which could not be attributed to a decreased calcium-dependent inactivation of calcium release channels/ryanodine receptors or altered intramembrane charge movement, but rather to a more efficient coupling of depolarization to channel opening. Thymol increased ryanodine binding to heavy sarcoplasmic reticulum vesicles, with a half-activating concentration of 144 micro M and a Hill coefficient of 1.89, and the open probability of the isolated and reconstituted ryanodine receptors, from 0.09 +/- 0.03 to 0.22 +/- 0.04 at 30 micro M. At higher concentrations the drug induced long-lasting open events on a full conducting state. Elementary calcium release events imaged using laser scanning confocal microscopy in the line-scan mode were reduced in size, 0.92 +/- 0.01 vs. 0.70 +/- 0.01, but increased in duration, 56 +/- 1 vs. 79 +/- 1 ms, by 30 micro M thymol, with an increase in the relative proportion of lone embers. Higher concentrations favored long events, resembling embers in control, with duration often exceeding 500 ms. These findings provide direct experimental evidence that the opening of a single release channel will generate an ember, rather than a spark, in mammalian skeletal muscle.
NASA Astrophysics Data System (ADS)
Singh, Sarvesh Kumar; Rani, Raj
2015-10-01
The study addresses the identification of multiple point sources, emitting the same tracer, from their limited set of merged concentration measurements. The identification, here, refers to the estimation of locations and strengths of a known number of simultaneous point releases. The source-receptor relationship is described in the framework of adjoint modelling by using an analytical Gaussian dispersion model. A least-squares minimization framework, free from an initialization of the release parameters (locations and strengths), is presented to estimate the release parameters. This utilizes the distributed source information observable from the given monitoring design and number of measurements. The technique leads to an exact retrieval of the true release parameters when measurements are noise free and exactly described by the dispersion model. The inversion algorithm is evaluated using the real data from multiple (two, three and four) releases conducted during Fusion Field Trials in September 2007 at Dugway Proving Ground, Utah. The release locations are retrieved, on average, within 25-45 m of the true sources with the distance from retrieved to true source ranging from 0 to 130 m. The release strengths are also estimated within a factor of three to the true release rates. The average deviations in retrieval of source locations are observed relatively large in two release trials in comparison to three and four release trials.
Willighagen, Egon L; Mayfield, John W; Alvarsson, Jonathan; Berg, Arvid; Carlsson, Lars; Jeliazkova, Nina; Kuhn, Stefan; Pluskal, Tomáš; Rojas-Chertó, Miquel; Spjuth, Ola; Torrance, Gilleain; Evelo, Chris T; Guha, Rajarshi; Steinbeck, Christoph
2017-06-06
The Chemistry Development Kit (CDK) is a widely used open source cheminformatics toolkit, providing data structures to represent chemical concepts along with methods to manipulate such structures and perform computations on them. The library implements a wide variety of cheminformatics algorithms ranging from chemical structure canonicalization to molecular descriptor calculations and pharmacophore perception. It is used in drug discovery, metabolomics, and toxicology. Over the last 10 years, the code base has grown significantly, however, resulting in many complex interdependencies among components and poor performance of many algorithms. We report improvements to the CDK v2.0 since the v1.2 release series, specifically addressing the increased functional complexity and poor performance. We first summarize the addition of new functionality, such atom typing and molecular formula handling, and improvement to existing functionality that has led to significantly better performance for substructure searching, molecular fingerprints, and rendering of molecules. Second, we outline how the CDK has evolved with respect to quality control and the approaches we have adopted to ensure stability, including a code review mechanism. This paper highlights our continued efforts to provide a community driven, open source cheminformatics library, and shows that such collaborative projects can thrive over extended periods of time, resulting in a high-quality and performant library. By taking advantage of community support and contributions, we show that an open source cheminformatics project can act as a peer reviewed publishing platform for scientific computing software. Graphical abstract CDK 2.0 provides new features and improved performance.
Percutaneous Trigger Finger Release: A Cost-effectiveness Analysis.
Gancarczyk, Stephanie M; Jang, Eugene S; Swart, Eric P; Makhni, Eric C; Kadiyala, Rajendra Kumar
2016-07-01
Percutaneous trigger finger releases (TFRs) performed in the office setting are becoming more prevalent. This study compares the costs of in-hospital open TFRs, open TFRs performed in ambulatory surgical centers (ASCs), and in-office percutaneous releases. An expected-value decision-analysis model was constructed from the payer perspective to estimate total costs of the three competing treatment strategies for TFR. Model parameters were estimated based on the best available literature and were tested using multiway sensitivity analysis. Percutaneous TFR performed in the office and then, if needed, revised open TFR performed in the ASC, was the most cost-effective strategy, with an attributed cost of $603. The cost associated with an initial open TFR performed in the ASC was approximately 7% higher. Initial open TFR performed in the hospital was the least cost-effective, with an attributed cost nearly twice that of primary percutaneous TFR. An initial attempt at percutaneous TFR is more cost-effective than an open TFR. Currently, only about 5% of TFRs are performed in the office; therefore, a substantial opportunity exists for cost savings in the future. Decision model level II.
NASA Astrophysics Data System (ADS)
Yaqoob, Usman; Chung, Gwiy-Sang
2017-09-01
This study investigates the effect of reduced graphene oxide (rGO) on the energy harvesting performance of poly(vinylidenefluoride-trifluoroethylene)-barium titanate (P(VDF-TrFE)-BTO) nanocomposite devices. Several piezoelectric nanogenerators with different rGO contents were prepared, among them PBR5-NG (rGO = 0.5%) exhibited maximum output performance. PBR5-NG showed a maximum open circuit voltage of 8.5 Vpk-pk and short circuit current of 2 μApk-pk at an applied force of 2 N. Moreover, PBR5-NG displayed an output power of 4.5 μW at 2 MΩ load resistance. To confirm device stability, the fabricated device was subjected to several pressing-releasing cycles. The device had excellent stability, even after 1000 pressing-releasing cycles. Together, our results indicate that our fabricated PBR5-NG is a promising energy source for future flexible electronics.
Release of the gPhoton Database of GALEX Photon Events
NASA Astrophysics Data System (ADS)
Fleming, Scott W.; Million, Chase; Shiao, Bernie; Tucker, Michael; Loyd, R. O. Parke
2016-01-01
The GALEX spacecraft surveyed much of the sky in two ultraviolet bands between 2003 and 2013 with non-integrating microchannel plate detectors. The Mikulski Archive for Space Telescopes (MAST) has made more than one trillion photon events observed by the spacecraft available, stored as a 130 TB database, along with an open-source, python-based software package to query this database and create calibrated lightcurves or images from these data at user-defined spatial and temporal scales. In particular, MAST users can now conduct photometry at the intra-visit level (timescales of seconds and minutes). The software, along with the fully populated database, was officially released in Aug. 2015, and improvements to both software functionality and data calibration are ongoing. We summarize the current calibration status of the gPhoton software, along with examples of early science enabled by gPhoton that include stellar flares, AGN, white dwarfs, exoplanet hosts, novae, and nearby galaxies.
The Emergence of Open-Source Software in China
ERIC Educational Resources Information Center
Pan, Guohua; Bonk, Curtis J.
2007-01-01
The open-source software movement is gaining increasing momentum in China. Of the limited numbers of open-source software in China, "Red Flag Linux" stands out most strikingly, commanding 30 percent share of Chinese software market. Unlike the spontaneity of open-source movement in North America, open-source software development in…
A Study of Clinically Related Open Source Software Projects
Hogarth, Michael A.; Turner, Stuart
2005-01-01
Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056
Ikemoto, Takaaki; Endo, Makoto
2001-01-01
To characterize the effect of clofibric acid (Clof) on the Ca2+ release mechanism in the sarcoplasmic reticulum (SR) of skeletal muscle, we analysed the properties of Clof-induced Ca2+ release under various conditions using chemically skinned skeletal muscle fibres of the mouse.Clof (>0.5 mM) released Ca2+ from the SR under Ca2+-free conditions buffered with 10 mM EGTA (pCa >8).Co-application of ryanodine and Clof at pCa >8 but not ryanodine alone reduced the Ca2+ uptake capacity of the SR. Thus, Ca2+ release induced by Clof at pCa >8 must be a result of the activation of the ryanodine receptor (RyR).At pCa >8, (i) Clof-induced Ca2+ release was inhibited by adenosine monophosphate (AMP), (ii) the inhibitory effect of Mg2+ on the Clof-induced Ca2+ release was saturated at about 1 mM, and (iii) Clof-induced Ca2+ release was not inhibited by procaine (10 mM). These results indicate that Clof may activate the RyR-Ca2+ release channels in a manner different from Ca2+-induced Ca2+ release (CICR).In addition to this unique mode of opening, Clof also enhanced the CICR mode of opening of RyR-Ca2+ release channels.Apart from CICR, a high concentration of Ca2+ might also enhance the unique mode of opening by Clof.These results suggest that some features of Ca2+ release activated by Clof are similar to those of physiological Ca2+ release (PCR) in living muscle cells and raise the possibility that Clof may be useful in elucidating the mechanism of PCR in skeletal muscle. PMID:11606311
2014-11-13
It is about two weeks later in Inca City and the season is officially spring. Numerous changes have occurred. Large blotches of dust cover the araneiforms. Dark spots on the ridge show places where the seasonal polar ice cap has ruptured, releasing gas and fine material from the surface below. At the bottom of the image fans point in more than one direction from a single source, showing that the wind has changed direction while gas and dust were flowing out. Was the flow continuous or has the vent opened and closed? http://photojournal.jpl.nasa.gov/catalog/PIA18893
Action Recommendation for Cyber Resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhury, Sutanay; Rodriguez, Luke R.; Curtis, Darren S.
2015-09-01
This paper presents an unifying graph-based model for representing the infrastructure, behavior and missions of an enterprise. We describe how the model can be used to achieve resiliency against a wide class of failures and attacks. We introduce an algorithm for recommending resilience establishing actions based on dynamic updates to the models. Without loss of generality, we show the effectiveness of the algorithm for preserving latency based quality of service (QoS). Our models and the recommendation algorithms are implemented in a software framework that we seek to release as an open source framework for simulating resilient cyber systems.
SCARF: maximizing next-generation EST assemblies for evolutionary and population genomic analyses.
Barker, Michael S; Dlugosch, Katrina M; Reddy, A Chaitanya C; Amyotte, Sarah N; Rieseberg, Loren H
2009-02-15
Scaffolded and Corrected Assembly of Roche 454 (SCARF) is a next-generation sequence assembly tool for evolutionary genomics that is designed especially for assembling 454 EST sequences against high-quality reference sequences from related species. The program was created to knit together 454 contigs that do not assemble during traditional de novo assembly, using a reference sequence library to orient the 454 sequences. SCARF is freely available at http://msbarker.com/software.htm, and is released under the open source GPLv3 license (http://www.opensource.org/licenses/gpl-3.0.html.
pez: phylogenetics for the environmental sciences.
Pearse, William D; Cadotte, Marc W; Cavender-Bares, Jeannine; Ives, Anthony R; Tucker, Caroline M; Walker, Steve C; Helmus, Matthew R
2015-09-01
pez is an R package that permits measurement, modelling and simulation of phylogenetic structure in ecological data. pez contains the first implementation of many methods in R, and aggregates existing data structures and methods into a single, coherent package. pez is released under the GPL v3 open-source license, available on the Internet from CRAN (http://cran.r-project.org). The package is under active development, and the authors welcome contributions (see http://github.com/willpearse/pez). will.pearse@gmail.com. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Picante: R tools for integrating phylogenies and ecology.
Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O
2010-06-01
Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).
OSCAR4: a flexible architecture for chemical text-mining
2011-01-01
The Open-Source Chemistry Analysis Routines (OSCAR) software, a toolkit for the recognition of named entities and data in chemistry publications, has been developed since 2002. Recent work has resulted in the separation of the core OSCAR functionality and its release as the OSCAR4 library. This library features a modular API (based on reduction of surface coupling) that permits client programmers to easily incorporate it into external applications. OSCAR4 offers a domain-independent architecture upon which chemistry specific text-mining tools can be built, and its development and usage are discussed. PMID:21999457
ERIC Educational Resources Information Center
Krishnamurthy, M.
2008-01-01
Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…
Geologic map of the Priest Rapids 1:100,000 quadrangle, Washington
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reidel, S.P.; Fecht, K.R.
1993-09-01
This map of the Priest Rapids 1:100,000-scale quadrangle, Washington, shows the geology of one of fifteen complete or partial 1:100,000-scale quadrangles that cover the southeast quadrant of Washington. Geologic maps of these quadrangles have been compiled by geologists with the Washington Division of Geology and Earth Resources (DGER) and Washington State University and are the principal data sources for a 1:250,000scale geologic map of the southeast quadrant of Washington, which is in preparation. Eleven of those quadrangles are being released as DGER open-file reports (listed below). The map of the Wenatchee quadrangle has been published by the US Geological Surveymore » (Tabor and others, 1982), and the Moses Lake (Gulick, 1990a), Ritzville (Gulick, 1990b), and Rosalia (Waggoner, 1990) quadrangles have already been released. The geology of the Priest Rapids quadrangle has not previously been compiled at 1:100,000 scale. Furthermore, this is the first 1:100,000 or smaller scale geologic map of the area to incorporate both bedrock and surficial geology. This map was compiled in 1992, using published and unpublished geologic maps as sources of data.« less
Life cycle of PCBs and contamination of the environment and of food products from animal origin.
Weber, Roland; Herold, Christine; Hollert, Henner; Kamphues, Josef; Ungemach, Linda; Blepp, Markus; Ballschmiter, Karlheinz
2018-06-01
This report gives a summary of the historic use, former management and current release of polychlorinated biphenyls (PCBs) in Germany and assesses the impact of the life cycle of PCBs on the contamination of the environment and of food products of animal origin. In Germany 60,000 t of PCBs were used in transformers, capacitors or as hydraulic oils. The use of PCB oils in these "closed applications", has been banned in Germany in 2000. Thirty to 50% of these PCBs were not appropriately managed. In West Germany, 24,000 t of PCBs were used in open applications, mainly as additive (plasticiser, flame retardant) in sealants and paints in buildings and other construction. The continued use in open applications has not been banned, and in 2013, an estimated more than 12,000 t of PCBs were still present in buildings and other constructions. These open PCB applications continuously emit PCBs into the environment with an estimated release of 7-12 t per year. This amount is in agreement with deposition measurements (estimated to 18 t) and emission estimates for Switzerland. The atmospheric PCB releases still have an relevant impact on vegetation and livestock feed. In addition, PCBs in open applications on farms are still a sources of contamination for farmed animals. Furthermore, the historic production, use, recycling and disposal of PCBs have contaminated soils along the lifecycle. This legacy of contaminated soils and contaminated feed, individually or collectively, can lead to exceedance of maximum levels in food products from animals. In beef and chicken, soil levels of 5 ng PCB-TEQ/kg and for chicken with high soil exposure even 2 ng PCB-TEQ/kg can lead to exceedance of EU limits in meat and eggs. Areas at and around industries having produced or used or managed PCBs, or facilities and areas where PCBs were disposed need to be assessed in respect to potential contamination of food-producing animals. For a large share of impacted land, management measures applicable on farm level might be sufficient to continue with food production. Open PCB applications need to be inventoried and better managed. Other persistent and toxic chemicals used as alternatives to PCBs, e.g. short chain chlorinated paraffins (SCCPs), should be assessed in the life cycle for exposure of food-producing animals and humans.
Carbon Sequestration through Sustainably Sourced Algal Fertilizer: Deep Ocean Water.
NASA Astrophysics Data System (ADS)
Sherman, M. T.
2014-12-01
Drawing down carbon from the atmosphere happens in the oceans when marine plants are growing due to the use of carbon dioxide for biological processes and by raising the pH of the water. Macro- and microscopic marine photosynthesizers are limited in their growth by the availability of light and nutrients (nitrogen, phosphorous, iron, etc.) Deep ocean water (DOW), oceanic water from bellow about 1000m, is a natural medium for marine algae, which contains all (except in rare circumstances) necessary components for algal growth and represents over 90% of the volume of the ocean. The introduction of DOW to a tropical or summer sea can increase chlorophyll from near zero to 60 mg per M3 or more. The form of the utilization infrastructure for DOW can roughly be divided into two effective types; the unconstrained release and the open pond system. Unconstrained release has the advantage of having relatively low infrastructure investment and is available to any area of the ocean. The open pond system has high infrastructure costs but enables intensive use of DOW for harvesting macro- and microalgae and sustainable mariculture. It also enables greater concomitant production of DOW's other potential products such as electricity or potable water. However, unlike an unconstrained release the open pond system can capture much of the biomaterial from the water and limits the impact to the surrounding ecosystem. The Tidal Irrigation and Electrical System (TIESystem), is an open pond that is to be constructed on a continental shelf. It harnesses the tidal flux to pump DOW into the pond on the rising tide and then uses the falling tide to pump biologically rich material out of the pond. This biomaterial represents fixed CO2 and can be used for biofuel or fertilizers. The TIESystem benefits from an economy of scale that increases at a rate that is roughly equal to the relationship of the circumference of a circle (the barrier that creates the open pond) to the area of the pond multiplied by the tidal flux on that particular area of the continental shelf. Despite the large construction costs of artificial islands and structures robust enough to withstand the conditions of the continental shelf, the system will become economic as it grows in size. However, extensive research will be required to maximize the output of each subsystem and minimize the risk of pollution.
Metal lost and found: dissipative uses and releases of copper in the United States 1975-2000.
Lifset, Reid J; Eckelman, Matthew J; Harper, E M; Hausfather, Zeke; Urbina, Gonzalo
2012-02-15
Metals are used in a variety of ways, many of which lead to dissipative releases to the environment. Such releases are relevant from both a resource use and an environmental impact perspective. We present a historical analysis of copper dissipative releases in the United States from 1975 to 2000. We situate all dissipative releases in copper's life cycle and introduce a conceptual framework by which copper dissipative releases may be categorized in terms of intentionality of use and release. We interpret our results in the context of larger trends in production and consumption and government policies that have served as drivers of intentional copper releases from the relevant sources. Intentional copper releases are found to be both significant in quantity and highly variable. In 1975, for example, the largest source of intentional releases was from the application of copper-based pesticides, and this decreased more than 50% over the next 25 years; all other sources of intentional releases increased during that period. Overall, intentional copper releases decreased by approximately 15% from 1975 to 2000. Intentional uses that are unintentionally released such as copper from roofing, increased by the same percentage. Trace contaminant sources such as fossil fuel combustion, i.e., sources where both the use and the release are unintended, increased by nearly 50%. Intentional dissipative uses are equivalent to 60% of unintentional copper dissipative releases and more than five times that from trace sources. Dissipative copper releases are revealed to be modest when compared to bulk copper flows in the economy, and we introduce a metric, the dissipation index, which may be considered an economy-wide measure of resource efficiency for a particular substance. We assess the importance of dissipative releases in the calculation of recycling rates, concluding that the inclusion of dissipation in recycling rate calculations has a small, but discernible, influence, and should be included in such calculations. Copyright © 2011 Elsevier B.V. All rights reserved.
General Mission Analysis Tool (GMAT) Architectural Specification. Draft
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Conway, Darrel, J.
2007-01-01
Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirement; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.
Characteristics of sound radiation from turbulent premixed flames
NASA Astrophysics Data System (ADS)
Rajaram, Rajesh
Turbulent combustion processes are inherently unsteady and, thus, a source of acoustic radiation, which occurs due to the unsteady expansion of reacting gases. While prior studies have extensively characterized the total sound power radiated by turbulent flames, their spectral characteristics are not well understood. The objective of this research work is to measure the flow and acoustic properties of an open turbulent premixed jet flame and explain the spectral trends of combustion noise. The flame dynamics were characterized using high speed chemiluminescence images of the flame. A model based on the solution of the wave equation with unsteady heat release as the source was developed and was used to relate the measured chemiluminescence fluctuations to its acoustic emission. Acoustic measurements were performed in an anechoic environment for several burner diameters, flow velocities, turbulence intensities, fuels, and equivalence ratios. The acoustic emissions are shown to be characterized by four parameters: peak frequency (Fpeak), low frequency slope (beta), high frequency slope (alpha) and Overall Sound Pressure Level (OASPL). The peak frequency (Fpeak) is characterized by a Strouhal number based on the mean velocity and a flame length. The transfer function between the acoustic spectrum and the spectrum of heat release fluctuations has an f2 dependence at low frequencies, while it converged to a constant value at high frequencies. Furthermore, the OASPL was found to be characterized by (Fpeak mfH)2, which resembles the source term in the wave equation.
Romanov, Roman A; Lasher, Robert S; High, Brigit; Savidge, Logan E; Lawson, Adam; Rogachevskaja, Olga A; Zhao, Haitian; Rogachevsky, Vadim V; Bystrova, Marina F; Churbanov, Gleb D; Adameyko, Igor; Harkany, Tibor; Yang, Ruibiao; Kidd, Grahame J; Marambaud, Philippe; Kinnamon, John C; Kolesnikov, Stanislav S; Finger, Thomas E
2018-05-08
Conventional chemical synapses in the nervous system involve a presynaptic accumulation of neurotransmitter-containing vesicles, which fuse with the plasma membrane to release neurotransmitters that activate postsynaptic receptors. In taste buds, type II receptor cells do not have conventional synaptic features but nonetheless show regulated release of their afferent neurotransmitter, ATP, through a large-pore, voltage-gated channel, CALHM1. Immunohistochemistry revealed that CALHM1 was localized to points of contact between the receptor cells and sensory nerve fibers. Ultrastructural and super-resolution light microscopy showed that the CALHM1 channels were consistently associated with distinctive, large (1- to 2-μm) mitochondria spaced 20 to 40 nm from the presynaptic membrane. Pharmacological disruption of the mitochondrial respiratory chain limited the ability of taste cells to release ATP, suggesting that the immediate source of released ATP was the mitochondrion rather than a cytoplasmic pool of ATP. These large mitochondria may serve as both a reservoir of releasable ATP and the site of synthesis. The juxtaposition of the large mitochondria to areas of membrane displaying CALHM1 also defines a restricted compartment that limits the influx of Ca 2+ upon opening of the nonselective CALHM1 channels. These findings reveal a distinctive organelle signature and functional organization for regulated, focal release of purinergic signals in the absence of synaptic vesicles. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
Martin, Erika G; Helbig, Natalie; Birkhead, Guthrie S
2015-01-01
Governments are rapidly developing open data platforms to improve transparency and make information more accessible. New York is a leader, with currently the only state platform devoted to health. Although these platforms could build public health departments' capabilities to serve more researchers, agencies have little guidance on releasing meaningful and usable data. Structured focus groups with researchers and practitioners collected stakeholder feedback on potential uses of open health data and New York's open data strategy. Researchers and practitioners attended a 1-day November 2013 workshop on New York State's open health data resources. After learning about the state's open data platform and vision for open health data, participants were organized into 7 focus groups to discuss the essential elements of open data sets, practical challenges to obtaining and using health data, and potential uses of open data. Participants included 33 quantitative health researchers from State University of New York campuses and private partners and 10 practitioners from the New York State Department of Health. There was low awareness of open data, with 67% of researchers reporting never using open data portals prior to the workshop. Participants were interested in data sets that were geocoded, longitudinal, or aggregated to small area granularity and capabilities to link multiple data sets. Multiple environmental conditions and barriers hinder their capacity to use health data for research. Although open data platforms cannot address all barriers, they provide multiple opportunities for public health research and practice, and participants were overall positive about the state's efforts to release open data. Open data are not ideal for some researchers because they do not contain individually identifiable data, indicating a need for tiered data release strategies. However, they do provide important new opportunities to facilitate research and foster collaborations among agencies, researchers, and practitioners.
Okandan, Murat; Nielson, Gregory N
2014-12-09
Accessing a workpiece object in semiconductor processing is disclosed. The workpiece object includes a mechanical support substrate, a release layer over the mechanical support substrate, and an integrated circuit substrate coupled over the release layer. The integrated circuit substrate includes a device layer having semiconductor devices. The method also includes etching through-substrate via (TSV) openings through the integrated circuit substrate that have buried ends at or within the release layer including using the release layer as an etch stop. TSVs are formed by introducing one or more conductive materials into the TSV openings. A die singulation trench is etched at least substantially through the integrated circuit substrate around a perimeter of an integrated circuit die. The integrated circuit die is at least substantially released from the mechanical support substrate.
Rule, Ana M; Evans, Sean L; Silbergeld, Ellen K
2008-01-01
Use of antimicrobial feed additives in food animal production is associated with selection for drug resistance in bacterial pathogens, which can then be released into the environment through occupational exposures, high volume ventilation of animal houses, and land application of animal wastes. We tested the hypothesis that current methods of transporting food animals from farms to slaughterhouses may result in pathogen releases and potential exposures of persons in vehicles traveling on the same road. Air and surface samples were taken from cars driving behind poultry trucks for 17 miles. Air conditioners and fans were turned off and windows fully opened. Background and blank samples were used for quality control. Samples were analyzed for susceptible and drug-resistant strains. Results indicate an increase in the number of total aerobic bacteria including both susceptible and drug-resistant enterococci isolated from air and surface samples, and suggest that food animal transport in open crates introduces a novel route of exposure to harmful microorganisms and may disseminate these pathogens into the general environment. These findings support the need for further exposure characterization, and attention to improving methods of food animal transport, especially in highly trafficked regions of high density farming such as the Delmarva Peninsula.
Comparison of in vitro systems of protein digestion using either mammal or fish proteolytic enzymes.
Moyano, F J; Savoie, L
2001-02-01
Hydrolysis of three different proteins by either crude fish digestive extracts or purified mammal proteases was assayed using two different in vitro systems. The closed system was a modification of the pH-stat method including a previous acid digestion. The open system used a digestion cell containing a semi-permeable membrane which allowed continuous separation of the final products of hydrolysis with a molecular cut-off of 1000 Da. Assays in both systems resulted a similar arrangement of the tested proteins in relation to their ability to be hydrolyzed, with casein>fish meal> or =soybean meal. With the exception of casein, no significant differences were found between results produced by any of the enzyme sources using the closed system. In constrast, significantly higher hydrolysis of all proteins was produced by mammal enzymes under conditions operating in the open system. Differences in the rate of release of amino acids measured in this latter system were related both to the type of protein and the origin of the enzymes. When using purified mammal enzymes, release of lysine or phenylalanine from casein and soybean was high, but low from fishmeal. Isoleucine and valine present in fishmeal were preferentially hydrolyzed by commercial enzymes, but glycine and proline by fish enzymes.
JASPAR 2010: the greatly expanded open-access database of transcription factor binding profiles
Portales-Casamar, Elodie; Thongjuea, Supat; Kwon, Andrew T.; Arenillas, David; Zhao, Xiaobei; Valen, Eivind; Yusuf, Dimas; Lenhard, Boris; Wasserman, Wyeth W.; Sandelin, Albin
2010-01-01
JASPAR (http://jaspar.genereg.net) is the leading open-access database of matrix profiles describing the DNA-binding patterns of transcription factors (TFs) and other proteins interacting with DNA in a sequence-specific manner. Its fourth major release is the largest expansion of the core database to date: the database now holds 457 non-redundant, curated profiles. The new entries include the first batch of profiles derived from ChIP-seq and ChIP-chip whole-genome binding experiments, and 177 yeast TF binding profiles. The introduction of a yeast division brings the convenience of JASPAR to an active research community. As binding models are refined by newer data, the JASPAR database now uses versioning of matrices: in this release, 12% of the older models were updated to improved versions. Classification of TF families has been improved by adopting a new DNA-binding domain nomenclature. A curated catalog of mammalian TFs is provided, extending the use of the JASPAR profiles to additional TFs belonging to the same structural family. The changes in the database set the system ready for more rapid acquisition of new high-throughput data sources. Additionally, three new special collections provide matrix profile data produced by recent alternative high-throughput approaches. PMID:19906716
Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments.
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2016-01-05
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2016-01-01
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. PMID:26745406
Ingargiola, A.; Laurence, T. A.; Boutelle, R.; ...
2015-12-23
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode (SPAD), photomultiplier tube (PMT) or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. Themore » format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. As a result, to encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.« less
The successes and challenges of open-source biopharmaceutical innovation.
Allarakhia, Minna
2014-05-01
Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human capacity development challenges associated with skill development around big data usage; and the data-material access challenge associated with data and material access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. It is the author's opinion that the assessment of when and how value creation will occur, through open-source biopharmaceutical innovation, is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of now big data-based open-source initiatives. The continued focus on the early-stage value creation is not advisable. Instead, it would be more advisable to adopt an approach where stakeholders transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.
Preliminary Geologic Map of the Topanga 7.5' Quadrangle, Southern California: A Digital Database
Yerkes, R.F.; Campbell, R.H.
1995-01-01
INTRODUCTION This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1994). More specific information about the units may be available in the original sources. The content and character of the database and methods of obtaining it are described herein. The geologic map database itself, consisting of three ARC coverages and one base layer, can be obtained over the Internet or by magnetic tape copy as described below. The processes of extracting the geologic map database from the tar file, and importing the ARC export coverages (procedure described herein), will result in the creation of an ARC workspace (directory) called 'topnga.' The database was compiled using ARC/INFO version 7.0.3, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). It is stored in uncompressed ARC export format (ARC/INFO version 7.x) in a compressed UNIX tar (tape archive) file. The tar file was compressed with gzip, and may be uncompressed with gzip, which is available free of charge via the Internet from the gzip Home Page (http://w3.teaser.fr/~jlgailly/gzip). A tar utility is required to extract the database from the tar file. This utility is included in most UNIX systems, and can be obtained free of charge via the Internet from Internet Literacy's Common Internet File Formats Webpage http://www.matisse.net/files/formats.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView (version 1.0 for Windows 3.1 to 3.11 is available for free from ESRI's web site: http://www.esri.com). 1. Different base layer - The original digital database included separates clipped out of the Los Angeles 1:100,000 sheet. This release includes a vectorized scan of a scale-stable negative of the Topanga 7.5 minute quadrangle. 2. Map projection - The files in the original release were in polyconic projection. The projection used in this release is state plane, which allows for the tiling of adjacent quadrangles. 3. File compression - The files in the original release were compressed with UNIX compression. The files in this release are compressed with gzip.
Digital geologic map and GIS database of Venezuela
Garrity, Christopher P.; Hackley, Paul C.; Urbani, Franco
2006-01-01
The digital geologic map and GIS database of Venezuela captures GIS compatible geologic and hydrologic data from the 'Geologic Shaded Relief Map of Venezuela,' which was released online as U.S. Geological Survey Open-File Report 2005-1038. Digital datasets and corresponding metadata files are stored in ESRI geodatabase format; accessible via ArcGIS 9.X. Feature classes in the geodatabase include geologic unit polygons, open water polygons, coincident geologic unit linework (contacts, faults, etc.) and non-coincident geologic unit linework (folds, drainage networks, etc.). Geologic unit polygon data were attributed for age, name, and lithologic type following the Lexico Estratigrafico de Venezuela. All digital datasets were captured from source data at 1:750,000. Although users may view and analyze data at varying scales, the authors make no guarantee as to the accuracy of the data at scales larger than 1:750,000.
ERIC Educational Resources Information Center
Voyles, Bennett
2007-01-01
People know about the Sakai Project (open source course management system); they may even know about Kuali (open source financials). So, what is the next wave in open source software? This article discusses business intelligence (BI) systems. Though open source BI may still be only a rumor in most campus IT departments, some brave early adopters…
Python-Assisted MODFLOW Application and Code Development
NASA Astrophysics Data System (ADS)
Langevin, C.
2013-12-01
The U.S. Geological Survey (USGS) has a long history of developing and maintaining free, open-source software for hydrological investigations. The MODFLOW program is one of the most popular hydrologic simulation programs released by the USGS, and it is considered to be the most widely used groundwater flow simulation code. MODFLOW was written using a modular design and a procedural FORTRAN style, which resulted in code that could be understood, modified, and enhanced by many hydrologists. The code is fast, and because it uses standard FORTRAN it can be run on most operating systems. Most MODFLOW users rely on proprietary graphical user interfaces for constructing models and viewing model results. Some recent efforts, however, have focused on construction of MODFLOW models using open-source Python scripts. Customizable Python packages, such as FloPy (https://code.google.com/p/flopy), can be used to generate input files, read simulation results, and visualize results in two and three dimensions. Automating this sequence of steps leads to models that can be reproduced directly from original data and rediscretized in space and time. Python is also being used in the development and testing of new MODFLOW functionality. New packages and numerical formulations can be quickly prototyped and tested first with Python programs before implementation in MODFLOW. This is made possible by the flexible object-oriented design capabilities available in Python, the ability to call FORTRAN code from Python, and the ease with which linear systems of equations can be solved using SciPy, for example. Once new features are added to MODFLOW, Python can then be used to automate comprehensive regression testing and ensure reliability and accuracy of new versions prior to release.
AWIPS II in the University Community: Unidata's efforts and capabilities of the software
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; James, Michael
2015-04-01
The Advanced Weather Interactive Processing System, version II (AWIPS II) is a weather forecasting, display and analysis tool that is used by the National Oceanic and Atmospheric Administration/National Weather Service (NOAA/NWS) and the National Centers for Environmental Prediction (NCEP) to ingest analyze and disseminate operational weather data. The AWIPS II software is built on a Service Oriented Architecture, takes advantage of open source software, and its design affords expandability, flexibility, and portability. Since many university meteorology programs are eager to use the same tools used by NWS forecasters, Unidata community interest in AWIPS II is high. The Unidata Program Center (UPC) has worked closely with NCEP staff during AWIPS II development in order to devise a way to make it available to the university. The Unidata AWIPS II software was released in beta form in 2014, and it incorporates a number of key changes to the baseline U. S. National Weather Service release to process and display additional data formats and run all components in a single-server standalone configuration. In addition to making available open-source instances of the software libraries that can be downloaded and run at any university, Unidata has also deployed the data-server side of AWIPS II, known as EDEX, in the Amazon Web Service and Microsoft Azure cloud environments. In this set up, universities receive all of the data from remote cloud instances, while they only have to run the AWIPS II client, known as CAVE, to analyze and visualize the data. In this presentation, we will describe Unidata's AWIPS II efforts, including the capabilities of the software in visualizing many different types of real-time meteorological data and its myriad uses in university and other settings.
1993-01-01
A contact interaction is proposed to exist between the voltage sensor of the transverse tubular membrane of skeletal muscle and the calcium release channel of the sarcoplasmic reticulum. This interaction is given a quantitative formulation inspired in the Monod, Wyman, and Changeux model of allosteric transitions in hemoglobin (Monod, J., J. Wyman, and J.-P. Changeux. 1965. Journal of Molecular Biology. 12:88- 118), and analogous to one proposed by Marks and Jones for voltage- dependent Ca channels (Marks, T. N., and S. W. Jones. 1992. Journal of General Physiology. 99:367-390). The allosteric protein is the calcium release channel, a homotetramer, with two accessible states, closed and open. The kinetics and equilibrium of this transition are modulated by voltage sensors (dihydropyridine receptors) pictured as four units per release channel, each undergoing independent voltage-driven transitions between two states (resting and activating). For each voltage sensor that moves to the activating state, the tendency of the channel to open increases by an equal (large) factor. The equilibrium and kinetic equations of the model are solved and shown to reproduce well a number of experimentally measured relationships including: charge movement (Q) vs. voltage, open probability of the release channel (Po) vs. voltage, the transfer function relationship Po vs. Q, and the kinetics of charge movement, release activation, and deactivation. The main consequence of the assumption of allosteric coupling is that primary effects on the release channel are transmitted backward to the voltage sensor and give secondary effects. Thus, the model reproduces well the effects of perchlorate, described in the two previous articles, under the assumption that the primary effect is to increase the intrinsic tendency of the release channel to open, with no direct effects on the voltage sensor. This modification of the open-closed equilibrium of the release channel causes a shift in the equilibrium dependency of charge movement with voltage. The paradoxical slowing of charge movement by perchlorate also results from reciprocal effects of the channel on the allosterically coupled voltage sensors. The observations of the previous articles plus the simulations in this article constitute functional evidence of allosteric transmission. PMID:8245819
NASA Astrophysics Data System (ADS)
Thompson, Drew; Leparoux, Marc; Jaeggi, Christian; Buha, Jelena; Pui, David Y. H.; Wang, Jing
2013-12-01
In this study, the synthesis of silicon carbide (SiC) nanoparticles in a prototype inductively coupled thermal plasma reactor and other supporting processes, such as the handling of precursor material, the collection of nanoparticles, and the cleaning of equipment, were monitored for particle emissions and potential worker exposure. The purpose of this study was to evaluate the effectiveness of engineering controls and best practice guidelines developed for the production and handling of nanoparticles, identify processes which result in a nanoparticle release, characterize these releases, and suggest possible administrative or engineering controls which may eliminate or control the exposure source. No particle release was detected during the synthesis and collection of SiC nanoparticles and the cleaning of the reactor. This was attributed to most of these processes occurring in closed systems operated at slight underpressure. Other tasks occurring in more open spaces, such as the disconnection of a filter assembly from the reactor system and the use of compressed air for the cleaning of filters where synthesized SiC nanoparticles were collected, resulted in releases of submicrometer particles with a mode size of 170-180 nm. Observation of filter samples under scanning electron microscope confirmed that the particles were agglomerates of SiC nanoparticles.
NASA Astrophysics Data System (ADS)
Zeuner, Katharina D.; Paul, Matthias; Lettner, Thomas; Reuterskiöld Hedlund, Carl; Schweickert, Lucas; Steinhauer, Stephan; Yang, Lily; Zichi, Julien; Hammar, Mattias; Jöns, Klaus D.; Zwiller, Val
2018-04-01
The implementation of fiber-based long-range quantum communication requires tunable sources of single photons at the telecom C-band. Stable and easy-to-implement wavelength-tunability of individual sources is crucial to (i) bring remote sources into resonance, (ii) define a wavelength standard, and (iii) ensure scalability to operate a quantum repeater. So far, the most promising sources for true, telecom single photons are semiconductor quantum dots, due to their ability to deterministically and reliably emit single and entangled photons. However, the required wavelength-tunability is hard to attain. Here, we show a stable wavelength-tunable quantum light source by integrating strain-released InAs quantum dots on piezoelectric substrates. We present triggered single-photon emission at 1.55 μm with a multi-photon emission probability as low as 0.097, as well as photon pair emission from the radiative biexciton-exciton cascade. We achieve a tuning range of 0.25 nm which will allow us to spectrally overlap remote quantum dots or tuning distant quantum dots into resonance with quantum memories. This opens up realistic avenues for the implementation of photonic quantum information processing applications at telecom wavelengths.
A collection of open source applications for mass spectrometry data mining.
Gallardo, Óscar; Ovelleiro, David; Gay, Marina; Carrascal, Montserrat; Abian, Joaquin
2014-10-01
We present several bioinformatics applications for the identification and quantification of phosphoproteome components by MS. These applications include a front-end graphical user interface that combines several Thermo RAW formats to MASCOT™ Generic Format extractors (EasierMgf), two graphical user interfaces for search engines OMSSA and SEQUEST (OmssaGui and SequestGui), and three applications, one for the management of databases in FASTA format (FastaTools), another for the integration of search results from up to three search engines (Integrator), and another one for the visualization of mass spectra and their corresponding database search results (JsonVisor). These applications were developed to solve some of the common problems found in proteomic and phosphoproteomic data analysis and were integrated in the workflow for data processing and feeding on our LymPHOS database. Applications were designed modularly and can be used standalone. These tools are written in Perl and Python programming languages and are supported on Windows platforms. They are all released under an Open Source Software license and can be freely downloaded from our software repository hosted at GoogleCode. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf
2016-06-01
This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website ( http://timestudioproject.com ) contains the latest releases of TimeStudio, together with documentation and user forums.
Open source integrated modeling environment Delta Shell
NASA Astrophysics Data System (ADS)
Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.
2012-04-01
In the last decade, integrated modelling has become a very popular topic in environmental modelling since it helps solving problems, which is difficult to model using a single model. However, managing complexity of integrated models and minimizing time required for their setup remains a challenging task. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse separately from each other as well as a part of integrated environment that can run in a command-line or a graphical user interface mode. The most components of the Delta Shell are developed using C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies process of setting up integrated models from the end user and developer perspectives. The first example shows coupling of a rainfall-runoff, a river flow and a run-time control models. The second example shows how coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under LGPL license and accessible via http://oss.deltares.nl.
Wu, Zhenqin; Ramsundar, Bharath; Feinberg, Evan N.; Gomes, Joseph; Geniesse, Caleb; Pappu, Aneesh S.; Leswing, Karl
2017-01-01
Molecular machine learning has been maturing rapidly over the last few years. Improved methods and the presence of larger datasets have enabled machine learning algorithms to make increasingly accurate predictions about molecular properties. However, algorithmic progress has been limited due to the lack of a standard benchmark to compare the efficacy of proposed methods; most new algorithms are benchmarked on different datasets making it challenging to gauge the quality of proposed methods. This work introduces MoleculeNet, a large scale benchmark for molecular machine learning. MoleculeNet curates multiple public datasets, establishes metrics for evaluation, and offers high quality open-source implementations of multiple previously proposed molecular featurization and learning algorithms (released as part of the DeepChem open source library). MoleculeNet benchmarks demonstrate that learnable representations are powerful tools for molecular machine learning and broadly offer the best performance. However, this result comes with caveats. Learnable representations still struggle to deal with complex tasks under data scarcity and highly imbalanced classification. For quantum mechanical and biophysical datasets, the use of physics-aware featurizations can be more important than choice of particular learning algorithm. PMID:29629118
The Commercial Open Source Business Model
NASA Astrophysics Data System (ADS)
Riehle, Dirk
Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of com mercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.
Concierge: Personal Database Software for Managing Digital Research Resources
Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro
2007-01-01
This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNlps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ingargiola, A.; Laurence, T. A.; Boutelle, R.
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode (SPAD), photomultiplier tube (PMT) or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. Themore » format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. As a result, to encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.« less
ERIC Educational Resources Information Center
Giannotti, F.; Cortesi, F.; Cerquiglini, A.; Bernabei, P.
2006-01-01
Long-term effectiveness of controlled-release melatonin in 25 children, aged 2.6-9.6 years with autism without other coexistent pathologies was evaluated openly. Sleep patterns were studied using Children's Sleep Habits Questionnaire (CSHQ) and sleep diaries at baseline, after 1-3-6 months melatonin treatment and 1 month after discontinuation.…
NASA Astrophysics Data System (ADS)
Udell, C.; Selker, J. S.
2017-12-01
The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.
Tue, Nguyen Minh; Goto, Akitoshi; Takahashi, Shin; Itai, Takaaki; Asante, Kwadwo Ansong; Kunisue, Tatsuya; Tanabe, Shinsuke
2016-01-25
Although complex mixtures of dioxin-related compounds (DRCs) can be released from informal e-waste recycling, DRC contamination in African e-waste recycling sites has not been investigated. This study examined the concentrations of DRCs including chlorinated, brominated, mixed halogenated dibenzo-p-dioxins/dibenzofurans (PCDD/Fs, PBDD/Fs, PXDD/Fs) and dioxin-like polychlorinated biphenyls (DL-PCBs) in surface soil samples from the Agbogbloshie e-waste recycling site in Ghana. PCDD/F and PBDD/F concentrations in open burning areas (18-520 and 83-3800 ng/g dry, respectively) were among the highest reported in soils from informal e-waste sites. The concentrations of PCDFs and PBDFs were higher than those of the respective dibenzo-p-dioxins, suggesting combustion and PBDE-containing plastics as principal sources. PXDFs were found as more abundant than PCDFs, and higher brominated analogues occurred at higher concentrations. The median total WHO toxic equivalent (TEQ) concentration in open burning soils was 7 times higher than the U.S. action level (1000 pg/g), with TEQ contributors in the order of PBDFs>PCDD/Fs>PXDFs. DRC emission to soils over the e-waste site as of 2010 was estimated, from surface soil lightness based on the correlations between concentrations and lightness, at 200mg (95% confidence interval 93-540 mg) WHO-TEQ over three years. People living in Agbogbloshie are potentially exposed to high levels of not only chlorinated but also brominated DRCs, and human health implications need to be assessed in future studies. Copyright © 2015 Elsevier B.V. All rights reserved.
Open-source software: not quite endsville.
Stahl, Matthew T
2005-02-01
Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.
SunPy 0.8 - Python for Solar Physics
NASA Astrophysics Data System (ADS)
Inglis, Andrew; Bobra, Monica; Christe, Steven; Hewett, Russell; Ireland, Jack; Mumford, Stuart; Martinez Oliveros, Juan Carlos; Perez-Suarez, David; Reardon, Kevin P.; Savage, Sabrina; Shih, Albert Y.; Ryan, Daniel; Sipocz, Brigitta; Freij, Nabil
2017-08-01
SunPy is a community-developed open-source software library for solar physics. It is written in Python, a free, cross-platform, general-purpose, high-level programming language which is being increasingly adopted throughout the scientific community. Python is one of the top ten most often used programming languages, as such it provides a wide array of software packages, such as numerical computation (NumPy, SciPy), machine learning (scikit-learn), signal processing (scikit-image, statsmodels) to visualization and plotting (matplotlib, mayavi). SunPy aims to provide the software for obtaining and analyzing solar and heliospheric data. This poster introduces a new major release of SunPy (0.8). This release includes two major new functionalities, as well as a number of bug fixes. It is based on 1120 contributions from 34 unique contributors. Fido is the new primary interface to download data. It provides a consistent and powerful search interface to all major data sources provides including VSO, JSOC, as well as individual data sources such as GOES XRS time series and and is fully pluggable to add new data sources, i.e. DKIST. In anticipation of Solar Orbiter and the Parker Solar Probe, SunPy now provides a powerful way of representing coordinates, allowing conversion between coordinate systems and viewpoints of different instruments, including preliminary reprojection capabilities. Other new features including new timeseries capabilities with better support for concatenation and metadata, updated documentation and example gallery. SunPy is distributed through pip and conda and all of its code is publicly available (sunpy.org).
Application of OpenStreetMap (OSM) to Support the Mapping Village in Indonesia
NASA Astrophysics Data System (ADS)
Swasti Kanthi, Nurin; Hery Purwanto, Taufik
2016-11-01
Geospatial Information is a important thing in this era, because the need for location information is needed to know the condition of a region. In 2015 the Indonesian government release detailed mapping in village level and their Parent maps Indonesian state regulatory standards set forth in Rule form Norm Standards, Procedures and Criteria for Mapping Village (NSPK). Over time Web and Mobile GIS was developed with a wide range of applications. The merger between detailed mapping and Web GIS is still rarely performed and not used optimally. OpenStreetMap (OSM) is a WebGIS which can be utilized as Mobile GIS providing sufficient information to the representative levels of the building and can be used for mapping the village.Mapping Village using OSM was conducted using remote sensing approach and Geographical Information Systems (GIS), which's to interpret remote sensing imagery from OSM. The study was conducted to analyzed how far the role of OSM to support the mapping of the village, it's done by entering the house number data, administrative boundaries, public facilities and land use into OSM with reference data and data image Village Plan. The results of the mapping portion villages in OSM as a reference map-making village and analyzed in accordance with NSPK for detailed mapping Rukun Warga (RW) is part of the village mapping. The use of OSM greatly assists the process of mapping the details of the region with data sources in the form of images and can be accessed for Open Source. But still need their care and updating the data source to maintain the validity of the data.
Open-Source Data and the Study of Homicide.
Parkin, William S; Gruenewald, Jeff
2015-07-20
To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to be used as a valid and reliable data source in testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open-sources could recreate the population of homicides and variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data. Also, for every variable measured, the open-sources captured as much, or more, of the information presented in the official data. Also, variables not available in official data, but potentially useful for testing theory, were identified in open-sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David
A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium cooledmore » fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel and sodium fire releases. This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive, and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Serebrov, A. P., E-mail: serebrov@pnpi.spb.ru; Kislitsin, B. V.; Onegin, M. S.
2016-12-15
Results of calculations of energy releases and temperature fields in the ultracold neutron source under design at the WWR-M reactor are presented. It is shown that, with the reactor power of 18 MW, the power of energy release in the 40-L volume of the source with superfluid helium will amount to 28.5 W, while 356 W will be released in a liquid-deuterium premoderator. The lead shield between the reactor core and the source reduces the radiative heat release by an order of magnitude. A thermal power of 22 kW is released in it, which is removed by passage of water.more » The distribution of temperatures in all components of the vacuum structure is presented, and the temperature does not exceed 100°C at full reactor power. The calculations performed make it possible to go to design of the source.« less
Newman, Todd P
2017-10-01
Using the immediate release of the Working Group 1 Summary for Policymakers of the Intergovernmental Panel on Climate Change Fifth Assessment Report as a case study, this article seeks to describe what type of actors were most active during the summary release, the substance of the most propagated tweets during the summary release, and the media sources that attracted the most attention during the summary release. The results from the study suggest that non-elite actors, such as individual bloggers and concerned citizens, accounted for the majority of the most propagated tweets in the sample. This study also finds that the majority of the most propagated tweets in the sample focused on public understanding of the report. Finally, while mainstream media sources were the most frequently discussed media sources, a number of new media and science news and information sources compete for audience attention.
A robust scientific workflow for assessing fire danger levels using open-source software
NASA Astrophysics Data System (ADS)
Vitolo, Claudia; Di Giuseppe, Francesca; Smith, Paul
2017-04-01
Modelling forest fires is theoretically and computationally challenging because it involves the use of a wide variety of information, in large volumes and affected by high uncertainties. In-situ observations of wildfire, for instance, are highly sparse and need to be complemented by remotely sensed data measuring biomass burning to achieve homogeneous coverage at global scale. Fire models use weather reanalysis products to measure energy release and rate of spread but can only assess the potential predictability of fire danger as the actual ignition is due to human behaviour and, therefore, very unpredictable. Lastly, fire forecasting systems rely on weather forecasts to extend the advance warning but are currently calibrated using fire danger thresholds that are defined at global scale and do not take into account the spatial variability of fuel availability. As a consequence, uncertainties sharply increase cascading from the observational to the modelling stage and they might be further inflated by non-reproducible analyses. Although uncertainties in observations will only decrease with technological advances over the next decades, the other uncertainties (i.e. generated during modelling and post-processing) can already be addressed by developing transparent and reproducible analysis workflows, even more if implemented within open-source initiatives. This is because reproducible workflows aim to streamline the processing task as they present ready-made solutions to handle and manipulate complex and heterogeneous datasets. Also, opening the code to the scrutiny of other experts increases the chances to implement more robust solutions and avoids duplication of efforts. In this work we present our contribution to the forest fire modelling community: an open-source tool called "caliver" for the calibration and verification of forest fire model results. This tool is developed in the R programming language and publicly available under an open license. We will present the caliver R package, illustrate the main functionalities and show the results of our preliminary experiments calculating fire danger thresholds for various regions on Earth. We will compare these with the existing global thresholds and, lastly, demonstrate how these newly-calculated regional thresholds can lead to improved calibration of fire forecast models in an operational setting.
Mønster, Jacob G; Samuelsson, Jerker; Kjeldsen, Peter; Rella, Chris W; Scheutz, Charlotte
2014-08-01
Using a dual species methane/acetylene instrument based on cavity ring down spectroscopy (CRDS), the dynamic plume tracer dispersion method for quantifying the emission rate of methane was successfully tested in four measurement campaigns: (1) controlled methane and trace gas release with different trace gas configurations, (2) landfill with unknown emission source locations, (3) landfill with closely located emission sources, and (4) comparing with an Fourier transform infrared spectroscopy (FTIR) instrument using multiple trace gasses for source separation. The new real-time, high precision instrument can measure methane plumes more than 1.2 km away from small sources (about 5 kg h(-1)) in urban areas with a measurement frequency allowing plume crossing at normal driving speed. The method can be used for quantification of total methane emissions from diffuse area sources down to 1 kg per hour and can be used to quantify individual sources with the right choice of wind direction and road distance. The placement of the trace gas is important for obtaining correct quantification and uncertainty of up to 36% can be incurred when the trace gas is not co-located with the methane source. Measurements made at greater distances are less sensitive to errors in trace gas placement and model calculations showed an uncertainty of less than 5% in both urban and open-country for placing the trace gas 100 m from the source, when measurements were done more than 3 km away. Using the ratio of the integrated plume concentrations of tracer gas and methane gives the most reliable results for measurements at various distances to the source, compared to the ratio of the highest concentration in the plume, the direct concentration ratio and using a Gaussian plume model. Under suitable weather and road conditions, the CRDS system can quantify the emission from different sources located close to each other using only one kind of trace gas due to the high time resolution, while the FTIR system can measure multiple trace gasses but with a lower time resolution. Copyright © 2014 Elsevier Ltd. All rights reserved.
Land surface Verification Toolkit (LVT)
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.
2017-01-01
LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation Includes support for a range of in-situ, remote-sensing and other model and reanalysis products. Supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later. Land Information System Verification Toolkit (LVT) NOSA.
MIRA: An R package for DNA methylation-based inference of regulatory activity.
Lawson, John T; Tomazou, Eleni M; Bock, Christoph; Sheffield, Nathan C
2018-03-01
DNA methylation contains information about the regulatory state of the cell. MIRA aggregates genome-scale DNA methylation data into a DNA methylation profile for independent region sets with shared biological annotation. Using this profile, MIRA infers and scores the collective regulatory activity for each region set. MIRA facilitates regulatory analysis in situations where classical regulatory assays would be difficult and allows public sources of open chromatin and protein binding regions to be leveraged for novel insight into the regulatory state of DNA methylation datasets. R package available on Bioconductor: http://bioconductor.org/packages/release/bioc/html/MIRA.html. nsheffield@virginia.edu.
Livermore Big Artificial Neural Network Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Essen, Brian Van; Jacobs, Sam; Kim, Hyojin
2016-07-01
LBANN is a toolkit that is designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantages of key High Performance Computing features to accelerate neural network training. Specifically it is optimized for low-latency, high bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high bandwidth parallel file systems. It is built on top of the open source Elemental distributed-memory dense and spars-direct linear algebra and optimization library that is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.
NetProt: Complex-based Feature Selection.
Goh, Wilson Wen Bin; Wong, Limsoon
2017-08-04
Protein complex-based feature selection (PCBFS) provides unparalleled reproducibility with high phenotypic relevance on proteomics data. Currently, there are five PCBFS paradigms, but not all representative methods have been implemented or made readily available. To allow general users to take advantage of these methods, we developed the R-package NetProt, which provides implementations of representative feature-selection methods. NetProt also provides methods for generating simulated differential data and generating pseudocomplexes for complex-based performance benchmarking. The NetProt open source R package is available for download from https://github.com/gohwils/NetProt/releases/ , and online documentation is available at http://rpubs.com/gohwils/204259 .
Integer sequence discovery from small graphs
Hoppe, Travis; Petrone, Anna
2015-01-01
We have exhaustively enumerated all simple, connected graphs of a finite order and have computed a selection of invariants over this set. Integer sequences were constructed from these invariants and checked against the Online Encyclopedia of Integer Sequences (OEIS). 141 new sequences were added and six sequences were extended. From the graph database, we were able to programmatically suggest relationships among the invariants. It will be shown that we can readily visualize any sequence of graphs with a given criteria. The code has been released as an open-source framework for further analysis and the database was constructed to be extensible to invariants not considered in this work. PMID:27034526
Computer Simulation of the VASIMR Engine
NASA Technical Reports Server (NTRS)
Garrison, David
2005-01-01
The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.
NASA Astrophysics Data System (ADS)
Barber, Duncan Henry
During some postulated accidents at nuclear power stations, fuel cooling may be impaired. In such cases, the fuel heats up and the subsequent increased fission-gas release from the fuel to the gap may result in fuel sheath failure. After fuel sheath failure, the barrier between the coolant and the fuel pellets is lost or impaired, gases and vapours from the fuel-to-sheath gap and other open voids in the fuel pellets can be vented. Gases and steam from the coolant can enter the broken fuel sheath and interact with the fuel pellet surfaces and the fission-product inclusion on the fuel surface (including material at the surface of the fuel matrix). The chemistry of this interaction is an important mechanism to model in order to assess fission-product releases from fuel. Starting in 1995, the computer program SOURCE 2.0 was developed by the Canadian nuclear industry to model fission-product release from fuel during such accidents. SOURCE 2.0 has employed an early thermochemical model of irradiated uranium dioxide fuel developed at the Royal Military College of Canada. To overcome the limitations of computers of that time, the implementation of the RMC model employed lookup tables to pre-calculated equilibrium conditions. In the intervening years, the RMC model has been improved, the power of computers has increased significantly, and thermodynamic subroutine libraries have become available. This thesis is the result of extensive work based on these three factors. A prototype computer program (referred to as SC11) has been developed that uses a thermodynamic subroutine library to calculate thermodynamic equilibria using Gibbs energy minimization. The Gibbs energy minimization requires the system temperature (T) and pressure (P), and the inventory of chemical elements (n) in the system. In order to calculate the inventory of chemical elements in the fuel, the list of nuclides and nuclear isomers modelled in SC11 had to be expanded from the list used by SOURCE 2.0. A benchmark calculation demonstrates the improvement in agreement of the total inventory of those chemical elements included in the RMC fuel model to an ORIGEN-S calculation. ORIGEN-S is the Oak Ridge isotope generation and depletion computer program. The Gibbs energy minimizer requires a chemical database containing coefficients from which the Gibbs energy of pure compounds, gas and liquid mixtures, and solid solutions can be calculated. The RMC model of irradiated uranium dioxide fuel has been converted into the required format. The Gibbs energy minimizer has been incorporated into a new model of fission-product vaporization from the fuel surface. Calculated release fractions using the new code have been compared to results calculated with SOURCE IST 2.0P11 and to results of tests used in the validation of SOURCE 2.0. The new code shows improvements in agreement with experimental releases for a number of nuclides. Of particular significance is the better agreement between experimental and calculated release fractions for 140La. The improved agreement reflects the inclusion in the RMC model of the solubility of lanthanum (III) oxide (La2O3) in the fuel matrix. Calculated lanthanide release fractions from earlier computer programs were a challenge to environmental qualification analysis of equipment for some accident scenarios. The new prototype computer program would alleviate this concern. Keywords: Nuclear Engineering; Material Science; Thermodynamics; Radioactive Material, Gibbs Energy Minimization, Actinide Generation and Depletion, FissionProduct Generation and Depletion.
76 FR 22395 - Federal Advisory Committee Act; Open Internet Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-21
... FEDERAL COMMUNICATIONS COMMISSION Federal Advisory Committee Act; Open Internet Advisory Committee... Advisory Committee, known as the ``Open Internet Advisory Committee'' (hereinafter ``the Committee''), is... effects of the FCC's Open Internet rules (available at http://www.fcc.gov/Daily_Releases/Daily_Business...
ERIC Educational Resources Information Center
Kapor, Mitchell
2005-01-01
Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…
Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne
2011-09-28
Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Common characteristics to open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents.
Zephyr: Open-source Parallel Seismic Waveform Inversion in an Integrated Python-based Framework
NASA Astrophysics Data System (ADS)
Smithyman, B. R.; Pratt, R. G.; Hadden, S. M.
2015-12-01
Seismic Full-Waveform Inversion (FWI) is an advanced method to reconstruct wave properties of materials in the Earth from a series of seismic measurements. These methods have been developed by researchers since the late 1980s, and now see significant interest from the seismic exploration industry. As researchers move towards implementing advanced numerical modelling (e.g., 3D, multi-component, anisotropic and visco-elastic physics), it is desirable to make use of a modular approach, minimizing the effort developing a new set of tools for each new numerical problem. SimPEG (http://simpeg.xyz) is an open source project aimed at constructing a general framework to enable geophysical inversion in various domains. In this abstract we describe Zephyr (https://github.com/bsmithyman/zephyr), which is a coupled research project focused on parallel FWI in the seismic context. The software is built on top of Python, Numpy and IPython, which enables very flexible testing and implementation of new features. Zephyr is an open source project, and is released freely to enable reproducible research. We currently implement a parallel, distributed seismic forward modelling approach that solves the 2.5D (two-and-one-half dimensional) viscoacoustic Helmholtz equation at a range modelling frequencies, generating forward solutions for a given source behaviour, and gradient solutions for a given set of observed data. Solutions are computed in a distributed manner on a set of heterogeneous workers. The researcher's frontend computer may be separated from the worker cluster by a network link to enable full support for computation on remote clusters from individual workstations or laptops. The present codebase introduces a numerical discretization equivalent to that used by FULLWV, a well-known seismic FWI research codebase. This makes it straightforward to compare results from Zephyr directly with FULLWV. The flexibility introduced by the use of a Python programming environment makes extension of the codebase with new methods much more straightforward. This enables comparison and integration of new efforts with existing results.
The SCEC/UseIT Intern Program: Creating Open-Source Visualization Software Using Diverse Resources
NASA Astrophysics Data System (ADS)
Francoeur, H.; Callaghan, S.; Perry, S.; Jordan, T.
2004-12-01
The Southern California Earthquake Center undergraduate IT intern program (SCEC UseIT) conducts IT research to benefit collaborative earth science research. Through this program, interns have developed real-time, interactive, 3D visualization software using open-source tools. Dubbed LA3D, a distribution of this software is now in use by the seismic community. LA3D enables the user to interactively view Southern California datasets and models of importance to earthquake scientists, such as faults, earthquakes, fault blocks, digital elevation models, and seismic hazard maps. LA3D is now being extended to support visualizations anywhere on the planet. The new software, called SCEC-VIDEO (Virtual Interactive Display of Earth Objects), makes use of a modular, plugin-based software architecture which supports easy development and integration of new data sets. Currently SCEC-VIDEO is in beta testing, with a full open-source release slated for the future. Both LA3D and SCEC-VIDEO were developed using a wide variety of software technologies. These, which included relational databases, web services, software management technologies, and 3-D graphics in Java, were necessary to integrate the heterogeneous array of data sources which comprise our software. Currently the interns are working to integrate new technologies and larger data sets to increase software functionality and value. In addition, both LA3D and SCEC-VIDEO allow the user to script and create movies. Thus program interns with computer science backgrounds have been writing software while interns with other interests, such as cinema, geology, and education, have been making movies that have proved of great use in scientific talks, media interviews, and education. Thus, SCEC UseIT incorporates a wide variety of scientific and human resources to create products of value to the scientific and outreach communities. The program plans to continue with its interdisciplinary approach, increasing the relevance of the software and expanding its use in the scientific community.
Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.
Benson, Tim
2016-07-04
Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers and how it can work as a business model in health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.
NASA Astrophysics Data System (ADS)
de Barros, Felipe P. J.
2018-07-01
Quantifying the uncertainty in solute mass discharge at an environmentally sensitive location is key to assess the risks due to groundwater contamination. Solute mass fluxes are strongly affected by the spatial variability of hydrogeological properties as well as release conditions at the source zone. This paper provides a methodological framework to investigate the interaction between the ubiquitous heterogeneity of the hydraulic conductivity and the mass release rate at the source zone on the uncertainty of mass discharge. Through the use of perturbation theory, we derive analytical and semi-analytical expressions for the statistics of the solute mass discharge at a control plane in a three-dimensional aquifer while accounting for the solute mass release rates at the source. The derived solutions are limited to aquifers displaying low-to-mild heterogeneity. Results illustrate the significance of the source zone mass release rate in controlling the mass discharge uncertainty. The relative importance of the mass release rate on the mean solute discharge depends on the distance between the source and the control plane. On the other hand, we find that the solute release rate at the source zone has a strong impact on the variance of the mass discharge. Within a risk context, we also compute the peak mean discharge as a function of the parameters governing the spatial heterogeneity of the hydraulic conductivity field and mass release rates at the source zone. The proposed physically-based framework is application-oriented, computationally efficient and capable of propagating uncertainty from different parameters onto risk metrics. Furthermore, it can be used for preliminary screening purposes to guide site managers to perform system-level sensitivity analysis and better allocate resources.
Open Source Software Development
2011-01-01
Software, 2002, 149(1), 3-17. 3. DiBona , C., Cooper, D., and Stone, M. (Eds.), Open Sources 2.0, 2005, O’Reilly Media, Sebastopol, CA. Also see, C... DiBona , S. Ockman, and M. Stone (Eds.). Open Sources: Vocides from the Open Source Revolution, 1999. O’Reilly Media, Sebastopol, CA. 4. Ducheneaut, N
The Dark Energy Survey First Data Release
NASA Astrophysics Data System (ADS)
Carrasco Kind, Matias
2018-01-01
In this talk I will announce and highlight the main components of the first public data release (DR1) coming from the Dark Energy Survey (DES).In January 2016, the DES survey made available, in a simple unofficial release to the astronomical community, the first set of products. This data was taken and studied during the DES Science Verification period consisting on roughly 250 sq. degrees and 25 million objects at a mean depth of i=23.7 that led to over 80 publications from DES scientist.The DR1 release is the first official release from the main survey and it consists on the observations taken during the first 3 seasons from August 2013 to February 2016 (about 100 nights each season) of the survey which cover the entire DES footprint. All of the Single Epoch Images and the Year 3 Coadded images distributed in 10223 tiles are available for download in this release. The catalogs provide astrometry, photometry and basic classification for near 400M objects in roughly 5000 sq. degrees on the southern hemisphere with a approximate mean depth of i=23.3. Complementary footprint, masking and depth information is also available. All of the software used during the generation of these products are open sourced and have been made available through the Github DES Organization. Images, data and other sub products have been possible through the international and collaborative effort of all 25 institutions involved in DES and are available for exploration and download through the interfaces provided by a partnership between NCSA, NOAO and LIneA.
VizieR Online Data Catalog: The Chandra Source Catalog, Release 1.1 (Evans+ 2012)
NASA Astrophysics Data System (ADS)
Evans, I. N.; Primini, F. A.; Glotfelty, C. S.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G.; Grier, J. D.; Hain, R. M.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Kashyap, V. L.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Mossman, A. E.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2014-01-01
This version of the catalog is release 1.1. It includes the information contained in release 1.0.1, plus point and compact source data extracted from HRC imaging observations, and catch-up ACIS observations released publicly prior to the end of 2009. (1 data file).
Nikolaus, Joerg; Karatekin, Erdem
2016-01-01
In the ubiquitous process of membrane fusion the opening of a fusion pore establishes the first connection between two formerly separate compartments. During neurotransmitter or hormone release via exocytosis, the fusion pore can transiently open and close repeatedly, regulating cargo release kinetics. Pore dynamics also determine the mode of vesicle recycling; irreversible resealing results in transient, "kiss-and-run" fusion, whereas dilation leads to full fusion. To better understand what factors govern pore dynamics, we developed an assay to monitor membrane fusion using polarized total internal reflection fluorescence (TIRF) microscopy with single molecule sensitivity and ~15 msec time resolution in a biochemically well-defined in vitro system. Fusion of fluorescently labeled small unilamellar vesicles containing v-SNARE proteins (v-SUVs) with a planar bilayer bearing t-SNAREs, supported on a soft polymer cushion (t-SBL, t-supported bilayer), is monitored. The assay uses microfluidic flow channels that ensure minimal sample consumption while supplying a constant density of SUVs. Exploiting the rapid signal enhancement upon transfer of lipid labels from the SUV to the SBL during fusion, kinetics of lipid dye transfer is monitored. The sensitivity of TIRF microscopy allows tracking single fluorescent lipid labels, from which lipid diffusivity and SUV size can be deduced for every fusion event. Lipid dye release times can be much longer than expected for unimpeded passage through permanently open pores. Using a model that assumes retardation of lipid release is due to pore flickering, a pore "openness", the fraction of time the pore remains open during fusion, can be estimated. A soluble marker can be encapsulated in the SUVs for simultaneous monitoring of lipid and soluble cargo release. Such measurements indicate some pores may reseal after losing a fraction of the soluble cargo. PMID:27585113
Zaninelli, M; Campagnoli, A; Reyes, M; Rojas, V
2012-11-01
In order to improve the hospital information system of the Chilean University Hospital, the Veterinary Medicine School of Universidad de Chile made a research cooperation with Università San Raffaele Roma to develop and test a new release of the O3-Vet software application. O3-Vet was selected by the Chilean University mainly for two reasons: (1) it uses human medicine standardized technologies such as "Health Level 7" (HL7) and "Integrating the Healthcare Enterprise" (IHE), which allow a good level of data sharing and hospital management; (2) it is open source, which means it can be adapted to specific hospital needs. In the new release, a subset of diagnostic terms was added from the "Systematized Nomenclature of Medicine Clinical Terms" (SNOMED CT), selected by the "American Animal Hospital Association" (AAHA) to standardize the filing of clinical data and its retrieval. Results from a limited survey of veterinarians of the University (n=9) show that the new release improved the management of the Chilean University Hospital and the ability to retrieve useful clinical data. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
HITCal: a software tool for analysis of video head impulse test responses.
Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás
2015-09-01
The developed software (HITCal) may be a useful tool in the analysis and measurement of the saccadic video head impulse test (vHIT) responses and with the experience obtained during its use the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. To develop a (software) method to analyze and explore the vHIT responses, mainly saccades. HITCal was written using a computational development program; the function to access a vHIT file was programmed; extended head impulse exploration and measurement tools were created and an automated saccade analysis was developed using an experimental algorithm. For pre-release HITCal laboratory tests, a database of head impulse tests (HITs) was created with the data collected retrospectively in three reference centers. This HITs database was evaluated by humans and was also computed with HITCal. The authors have successfully built HITCal and it has been released as open source software; the developed software was fully operative and all the proposed characteristics were incorporated in the released version. The automated saccades algorithm implemented in HITCal has good concordance with the assessment by human observers (Cohen's kappa coefficient = 0.7).
Prahm, Cosima; Eckstein, Korbinian; Ortiz-Catalan, Max; Dorffner, Georg; Kaniusas, Eugenijus; Aszmann, Oskar C
2016-08-31
Controlling a myoelectric prosthesis for upper limbs is increasingly challenging for the user as more electrodes and joints become available. Motion classification based on pattern recognition with a multi-electrode array allows multiple joints to be controlled simultaneously. Previous pattern recognition studies are difficult to compare, because individual research groups use their own data sets. To resolve this shortcoming and to facilitate comparisons, open access data sets were analysed using components of BioPatRec and Netlab pattern recognition models. Performances of the artificial neural networks, linear models, and training program components were compared. Evaluation took place within the BioPatRec environment, a Matlab-based open source platform that provides feature extraction, processing and motion classification algorithms for prosthetic control. The algorithms were applied to myoelectric signals for individual and simultaneous classification of movements, with the aim of finding the best performing algorithm and network model. Evaluation criteria included classification accuracy and training time. Results in both the linear and the artificial neural network models demonstrated that Netlab's implementation using scaled conjugate training algorithm reached significantly higher accuracies than BioPatRec. It is concluded that the best movement classification performance would be achieved through integrating Netlab training algorithms in the BioPatRec environment so that future prosthesis training can be shortened and control made more reliable. Netlab was therefore included into the newest release of BioPatRec (v4.0).
Open-source hardware for medical devices
2016-01-01
Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 of open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528
Open-source hardware for medical devices.
Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold
2016-04-01
Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 of open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.
The case for open-source software in drug discovery.
DeLano, Warren L
2005-02-01
Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.
Vargas, Roger I; Long, Jay; Miller, Neil W; Delate, Kathleen; Jackson, Charles G; Uchida, Grant K; Bautista, Renato C; Harris, Ernie J
2004-10-01
Ivy gourd, Coccinia grandis (L.) Voigt, patches throughout Kailua-Kona, Hawaii Island, HI, were identified as persistent sources of melon fly, Bactrocera cucurbitae (Coquillett). These patches had a low incidence of Psyttalia fletcheri (Silvestri), its major braconid parasitoid natural enemy in Hawaii, and were used to evaluate augmentative releases of P. fletcheri against melon fly. In field cage studies of releases, numbers of melon flies emerging from ivy gourd fruit placed inside treatment cages were reduced up to 21-fold, and numbers of parasitoids were increased 11-fold. In open field releases of P. fletcheri into ivy gourd patches, parasitization rates were increased 4.7 times in release plots compared with those in control plots. However, there was no significant reduction in emergence of melon flies from fruit. In subsequent cage tests with sterile melon flies and P. fletcheri, combinations of sterile flies and P. fletcheri produced the greatest reduction (9-fold) in melon fly emergence from zucchini, Cucurbita pepo L. Reductions obtained with sterile flies alone or in combination with parasitoids were significantly greater than those in the control, whereas those for parasitoids alone were not. Although these results suggest that the effects of sterile flies were greater than those for parasitoids, from a multitactic melon fly management strategy, sterile flies would complement the effects of P. fletcheri. Cost and sustainability of these nonchemical approaches will be examined further in an ongoing areawide pest management program for melon fly in Hawaii.
Data Release of UV to Submillimeter Broadband Fluxes for Simulated Galaxies from the EAGLE Project
NASA Astrophysics Data System (ADS)
Camps, Peter; Trčka, Ana; Trayford, James; Baes, Maarten; Theuns, Tom; Crain, Robert A.; McAlpine, Stuart; Schaller, Matthieu; Schaye, Joop
2018-02-01
We present dust-attenuated and dust emission fluxes for sufficiently resolved galaxies in the EAGLE suite of cosmological hydrodynamical simulations, calculated with the SKIRT radiative transfer code. The post-processing procedure includes specific components for star formation regions, stellar sources, and diffuse dust and takes into account stochastic heating of dust grains to obtain realistic broadband fluxes in the wavelength range from ultraviolet to submillimeter. The mock survey includes nearly half a million simulated galaxies with stellar masses above {10}8.5 {M}ȯ across six EAGLE models. About two-thirds of these galaxies, residing in 23 redshift bins up to z = 6, have a sufficiently resolved metallic gas distribution to derive meaningful dust attenuation and emission, with the important caveat that the same dust properties were used at all redshifts. These newly released data complement the already publicly available information about the EAGLE galaxies, which includes intrinsic properties derived by aggregating the properties of the smoothed particles representing matter in the simulation. We further provide an open-source framework of Python procedures for post-processing simulated galaxies with the radiative transfer code SKIRT. The framework allows any third party to calculate synthetic images, spectral energy distributions, and broadband fluxes for EAGLE galaxies, taking into account the effects of dust attenuation and emission.
The Chandra Source Catalog 2.0
NASA Astrophysics Data System (ADS)
Evans, Ian N.; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; McLaughlin, Warren; Miller, Joseph; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula
2018-01-01
The current version of the Chandra Source Catalog (CSC) continues to be well utilized by the astronomical community. Usage over the past year has continued to average more than 15,000 searches per month. Version 1.1 of the CSC, released in 2010, includes properties and data for 158,071 detections, corresponding to 106,586 distinct X-ray sources on the sky. The second major release of the catalog, CSC 2.0, will be made available to the user community in early 2018, and preliminary lists of detections and sources are available now. Release 2.0 will roughly triple the size of the current version of the catalog to an estimated 375,000 detections, corresponding to ~315,000 unique X-ray sources. Compared to release 1.1, the limiting sensitivity for compact sources in CSC 2.0 is significantly enhanced. This improvement is achieved by using a two-stage approach that involves stacking (co-adding) multiple observations of the same field prior to source detection, and then using an improved source detection approach that enables us to detect point source down to ~5 net counts on-axis for exposures shorter than ~15 ks. In addition to enhanced source detection capabilities, improvements to the Bayesian aperture photometry code included in release 2.0 provides robust photometric probability density functions (PDFs) in crowded fields even for low count detections. All post-aperture photometry properties (e.g., hardness ratios, source variability) work directly from the PDFs in release 2.0. CSC 2.0 also adds a Bayesian Blocks analysis of the multi-band aperture photometry PDFs to identify multiple observations of the same source that have similar photometric properties, and therefore can be analyzed simultaneously to improve S/N.We briefly describe these and other updates that significantly enhance the scientific utility of CSC 2.0 when compared to the earlier catalog release.This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
Choosing Open Source ERP Systems: What Reasons Are There For Doing So?
NASA Astrophysics Data System (ADS)
Johansson, Björn; Sudzina, Frantisek
Enterprise resource planning (ERP) systems attract a high attention and open source software does it as well. The question is then if, and if so, when do open source ERP systems take off. The paper describes the status of open source ERP systems. Based on literature review of ERP system selection criteria based on Web of Science articles, it discusses reported reasons for choosing open source or proprietary ERP systems. Last but not least, the article presents some conclusions that could act as input for future research. The paper aims at building up a foundation for the basic question: What are the reasons for an organization to adopt open source ERP systems.
caGrid 1.0: An Enterprise Grid Infrastructure for Biomedical Research
Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel
2008-01-01
Objective To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. Measurements The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. Conclusions While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community. PMID:18096909
caGrid 1.0: an enterprise Grid infrastructure for biomedical research.
Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel
2008-01-01
To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community.
Gaseous and particulate emissions from prescribed burning in Georgia.
Lee, Sangil; Baumann, Karsten; Schauer, James J; Sheesley, Rebecca J; Naeher, Luke P; Meinardi, Simone; Blake, Donald R; Edgerton, Eric S; Russell, Armistead G; Clements, Mark
2005-12-01
Prescribed burning is a significant source of fine particulate matter (PM2.5) in the southeastern United States. However, limited data exist on the emission characteristics from this source. Various organic and inorganic compounds both in the gas and particle phase were measured in the emissions of prescribed burnings conducted at two pine-dominated forest areas in Georgia. The measurements of volatile organic compounds (VOCs) and PM2.5 allowed the determination of emission factors for the flaming and smoldering stages of prescribed burnings. The VOC emission factors from smoldering were distinctly higher than those from flaming except for ethene, ethyne, and organic nitrate compounds. VOC emission factors show that emissions of certain aromatic compounds and terpenes such as alpha and beta-pinenes, which are important precursors for secondary organic aerosol (SOA), are much higher from active prescribed burnings than from fireplace wood and laboratory open burning studies. Levoglucosan is the major particulate organic compound (POC) emitted for all these studies, though its emission relative to total organic carbon (mg/g OC) differs significantly. Furthermore, cholesterol, an important fingerprint for meat cooking, was observed only in our in situ study indicating a significant release from the soil and soil organisms during open burning. Source apportionment of ambient primary fine particulate OC measured at two urban receptor locations 20-25 km downwind yields 74 +/- 11% during and immediately after the burns using our new in situ profile. In comparison with the previous source profile from laboratory simulations, however, this OC contribution is on average 27 +/- 5% lower.
In 2006, EPA published an inventory of sources and environmental releases of dioxin-like compounds in the United States. This draft report presents an update and revision to that dioxin source inventory. It also presents updated estimates of environmental releases of dioxin-like...
Developing open-source codes for electromagnetic geophysics using industry support
NASA Astrophysics Data System (ADS)
Key, K.
2017-12-01
Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.
Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo
2013-07-01
The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.
Behind Linus's Law: Investigating Peer Review Processes in Open Source
ERIC Educational Resources Information Center
Wang, Jing
2013-01-01
Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…
ERIC Educational Resources Information Center
Kisworo, Marsudi Wahyu
2016-01-01
Information and Communication Technology (ICT)-supported learning using free and open source platform draws little attention as open source initiatives were focused in secondary or tertiary educations. This study investigates possibilities of ICT-supported learning using open source platform for primary educations. The data of this study is taken…
An Analysis of Open Source Security Software Products Downloads
ERIC Educational Resources Information Center
Barta, Brian J.
2014-01-01
Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…
Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years, the cloud computing technology has a rapid development, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups by the advantages of open source and low cost, have now become a large-scale promotion and application. In this paper, firstly we briefly introduced the main functions and architecture of the open source cloud computing OpenStack tools, and then discussed deeply the core problems of computer labs in colleges and universities. Combining with this research, it is not that the specific application and deployment of university computer rooms with OpenStack tool. The experimental results show that the application of OpenStack tool can efficiently and conveniently deploy cloud of university computer room, and its performance is stable and the functional value is good.
eQuilibrator--the biochemical thermodynamics calculator.
Flamholz, Avi; Noor, Elad; Bar-Even, Arren; Milo, Ron
2012-01-01
The laws of thermodynamics constrain the action of biochemical systems. However, thermodynamic data on biochemical compounds can be difficult to find and is cumbersome to perform calculations with manually. Even simple thermodynamic questions like 'how much Gibbs energy is released by ATP hydrolysis at pH 5?' are complicated excessively by the search for accurate data. To address this problem, eQuilibrator couples a comprehensive and accurate database of thermodynamic properties of biochemical compounds and reactions with a simple and powerful online search and calculation interface. The web interface to eQuilibrator (http://equilibrator.weizmann.ac.il) enables easy calculation of Gibbs energies of compounds and reactions given arbitrary pH, ionic strength and metabolite concentrations. The eQuilibrator code is open-source and all thermodynamic source data are freely downloadable in standard formats. Here we describe the database characteristics and implementation and demonstrate its use.
eQuilibrator—the biochemical thermodynamics calculator
Flamholz, Avi; Noor, Elad; Bar-Even, Arren; Milo, Ron
2012-01-01
The laws of thermodynamics constrain the action of biochemical systems. However, thermodynamic data on biochemical compounds can be difficult to find and is cumbersome to perform calculations with manually. Even simple thermodynamic questions like ‘how much Gibbs energy is released by ATP hydrolysis at pH 5?’ are complicated excessively by the search for accurate data. To address this problem, eQuilibrator couples a comprehensive and accurate database of thermodynamic properties of biochemical compounds and reactions with a simple and powerful online search and calculation interface. The web interface to eQuilibrator (http://equilibrator.weizmann.ac.il) enables easy calculation of Gibbs energies of compounds and reactions given arbitrary pH, ionic strength and metabolite concentrations. The eQuilibrator code is open-source and all thermodynamic source data are freely downloadable in standard formats. Here we describe the database characteristics and implementation and demonstrate its use. PMID:22064852
esATAC: An Easy-to-use Systematic pipeline for ATAC-seq data analysis.
Wei, Zheng; Zhang, Wei; Fang, Huan; Li, Yanda; Wang, Xiaowo
2018-03-07
ATAC-seq is rapidly emerging as one of the major experimental approaches to probe chromatin accessibility genome-wide. Here, we present "esATAC", a highly integrated easy-to-use R/Bioconductor package, for systematic ATAC-seq data analysis. It covers essential steps for full analyzing procedure, including raw data processing, quality control and downstream statistical analysis such as peak calling, enrichment analysis and transcription factor footprinting. esATAC supports one command line execution for preset pipelines, and provides flexible interfaces for building customized pipelines. esATAC package is open source under the GPL-3.0 license. It is implemented in R and C ++. Source code and binaries for Linux, MAC OS X and Windows are available through Bioconductor https://www.bioconductor.org/packages/release/bioc/html/esATAC.html). xwwang@tsinghua.edu.cn. Supplementary data are available at Bioinformatics online.
Is it time to tackle PM(2.5) air pollutions in China from biomass-burning emissions?
Zhang, Yan-Lin; Cao, Fang
2015-07-01
An increase in haze days has been observed in China over the past two decades due to the rapid industrialization, urbanization and energy consumptions. To address this server issue, Chinese central government has recently released the Action Plan on Prevention and Control of Air Pollution, which mainly focuses on regulation of indusial and transport-related emissions with major energy consumption from fossil fuels. This comprehensive and toughest plan is definitely a major step in the right direction aiming at beautiful and environmental-friendly China; however, based on recent source apportionment results, we suggest that strengthening regulation emissions from biomass-burning sources in both urban and rural areas is needed to meet a rigorous reduction target. Here, household biofuel and open biomass burning are highlighted, as impacts of these emissions can cause local and regional pollution. Copyright © 2015 Elsevier Ltd. All rights reserved.
Contaminant transport from point source on water surface in open channel flow with bed absorption
NASA Astrophysics Data System (ADS)
Guo, Jinlan; Wu, Xudong; Jiang, Weiquan; Chen, Guoqian
2018-06-01
Studying solute dispersion in channel flows is of significance for environmental and industrial applications. Two-dimensional concentration distribution for a most typical case of a point source release on the free water surface in a channel flow with bed absorption is presented by means of Chatwin's long-time asymptotic technique. Five basic characteristics of Taylor dispersion and vertical mean concentration distribution with skewness and kurtosis modifications are also analyzed. The results reveal that bed absorption affects both the longitudinal and vertical concentration distributions and causes the contaminant cloud to concentrate in the upper layer. Additionally, the cross-sectional concentration distribution shows an asymptotic Gaussian distribution at large time which is unaffected by the bed absorption. The vertical concentration distribution is found to be nonuniform even at large time. The obtained results are essential for practical implements with strict environmental standards.
Software support for SBGN maps: SBGN-ML and LibSBGN.
van Iersel, Martijn P; Villéger, Alice C; Czauderna, Tobias; Boyd, Sarah E; Bergmann, Frank T; Luna, Augustin; Demir, Emek; Sorokin, Anatoly; Dogrusoz, Ugur; Matsuoka, Yukiko; Funahashi, Akira; Aladjem, Mirit I; Mi, Huaiyu; Moodie, Stuart L; Kitano, Hiroaki; Le Novère, Nicolas; Schreiber, Falk
2012-08-01
LibSBGN is a software library for reading, writing and manipulating Systems Biology Graphical Notation (SBGN) maps stored using the recently developed SBGN-ML file format. The library (available in C++ and Java) makes it easy for developers to add SBGN support to their tools, whereas the file format facilitates the exchange of maps between compatible software applications. The library also supports validation of maps, which simplifies the task of ensuring compliance with the detailed SBGN specifications. With this effort we hope to increase the adoption of SBGN in bioinformatics tools, ultimately enabling more researchers to visualize biological knowledge in a precise and unambiguous manner. Milestone 2 was released in December 2011. Source code, example files and binaries are freely available under the terms of either the LGPL v2.1+ or Apache v2.0 open source licenses from http://libsbgn.sourceforge.net. sbgn-libsbgn@lists.sourceforge.net.
BioCIDER: a Contextualisation InDEx for biological Resources discovery
Horro, Carlos; Cook, Martin; Attwood, Teresa K.; Brazas, Michelle D.; Hancock, John M.; Palagi, Patricia; Corpas, Manuel; Jimenez, Rafael
2017-01-01
Abstract Summary The vast, uncoordinated proliferation of bioinformatics resources (databases, software tools, training materials etc.) makes it difficult for users to find them. To facilitate their discovery, various services are being developed to collect such resources into registries. We have developed BioCIDER, which, rather like online shopping ‘recommendations’, provides a contextualization index to help identify biological resources relevant to the content of the sites in which it is embedded. Availability and Implementation BioCIDER (www.biocider.org) is an open-source platform. Documentation is available online (https://goo.gl/Klc51G), and source code is freely available via GitHub (https://github.com/BioCIDER). The BioJS widget that enables websites to embed contextualization is available from the BioJS registry (http://biojs.io/). All code is released under an MIT licence. Contact carlos.horro@earlham.ac.uk or rafael.jimenez@elixir-europe.org or manuel@repositive.io PMID:28407033
What can the programming language Rust do for astrophysics?
NASA Astrophysics Data System (ADS)
Blanco-Cuaresma, Sergi; Bolmont, Emeline
2017-06-01
The astrophysics community uses different tools for computational tasks such as complex systems simulations, radiative transfer calculations or big data. Programming languages like Fortran, C or C++ are commonly present in these tools and, generally, the language choice was made based on the need for performance. However, this comes at a cost: safety. For instance, a common source of error is the access to invalid memory regions, which produces random execution behaviors and affects the scientific interpretation of the results. In 2015, Mozilla Research released the first stable version of a new programming language named Rust. Many features make this new language attractive for the scientific community, it is open source and it guarantees memory safety while offering zero-cost abstraction. We explore the advantages and drawbacks of Rust for astrophysics by re-implementing the fundamental parts of Mercury-T, a Fortran code that simulates the dynamical and tidal evolution of multi-planet systems.
2011-01-01
Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics to open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914
15 CFR 734.7 - Published information and software.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Ready availability at libraries open to the public or at university libraries (See supplement No. 1 to this part, Question A(6)); (3) Patents and open (published) patent applications available at any patent office; and (4) Release at an open conference, meeting, seminar, trade show, or other open gathering. (i...
15 CFR 734.7 - Published information and software.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Ready availability at libraries open to the public or at university libraries (See Supplement No. 1 to this part, Question A(6)); (3) Patents and open (published) patent applications available at any patent office; and (4) Release at an open conference, meeting, seminar, trade show, or other open gathering. (i...
NASA Astrophysics Data System (ADS)
Katata, G.; Chino, M.; Kobayashi, T.; Terada, H.; Ota, M.; Nagai, H.; Kajino, M.; Draxler, R.; Hort, M. C.; Malo, A.; Torii, T.; Sanada, Y.
2015-01-01
Temporal variations in the amount of radionuclides released into the atmosphere during the Fukushima Daiichi Nuclear Power Station (FNPS1) accident and their atmospheric and marine dispersion are essential to evaluate the environmental impacts and resultant radiological doses to the public. In this paper, we estimate the detailed atmospheric releases during the accident using a reverse estimation method which calculates the release rates of radionuclides by comparing measurements of air concentration of a radionuclide or its dose rate in the environment with the ones calculated by atmospheric and oceanic transport, dispersion and deposition models. The atmospheric and oceanic models used are WSPEEDI-II (Worldwide version of System for Prediction of Environmental Emergency Dose Information) and SEA-GEARN-FDM (Finite difference oceanic dispersion model), both developed by the authors. A sophisticated deposition scheme, which deals with dry and fog-water depositions, cloud condensation nuclei (CCN) activation, and subsequent wet scavenging due to mixed-phase cloud microphysics (in-cloud scavenging) for radioactive iodine gas (I2 and CH3I) and other particles (CsI, Cs, and Te), was incorporated into WSPEEDI-II to improve the surface deposition calculations. The results revealed that the major releases of radionuclides due to the FNPS1 accident occurred in the following periods during March 2011: the afternoon of 12 March due to the wet venting and hydrogen explosion at Unit 1, midnight of 14 March when the SRV (safety relief valve) was opened three times at Unit 2, the morning and night of 15 March, and the morning of 16 March. According to the simulation results, the highest radioactive contamination areas around FNPS1 were created from 15 to 16 March by complicated interactions among rainfall, plume movements, and the temporal variation of release rates. The simulation by WSPEEDI-II using the new source term reproduced the local and regional patterns of cumulative surface deposition of total 131I and 137Cs and air dose rate obtained by airborne surveys. The new source term was also tested using three atmospheric dispersion models (Modèle Lagrangien de Dispersion de Particules d'ordre zéro: MLDP0, Hybrid Single Particle Lagrangian Integrated Trajectory Model: HYSPLIT, and Met Office's Numerical Atmospheric-dispersion Modelling Environment: NAME) for regional and global calculations, and the calculated results showed good agreement with observed air concentration and surface deposition of 137Cs in eastern Japan.
The 2017 Bioinformatics Open Source Conference (BOSC)
Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather
2017-01-01
The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year’s theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest. PMID:29118973
The 2017 Bioinformatics Open Source Conference (BOSC).
Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather
2017-01-01
The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year's theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest.
The HYPE Open Source Community
NASA Astrophysics Data System (ADS)
Strömbäck, L.; Pers, C.; Isberg, K.; Nyström, K.; Arheimer, B.
2013-12-01
The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model. It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided in up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes) considering turn-over and transformation on the way towards the sea. HYPE has been successfully used in many hydrological applications at SMHI. For Europe, we currently have three different models; The S-HYPE model for Sweden; The BALT-HYPE model for the Baltic Sea; and the E-HYPE model for the whole Europe. These models simulate hydrological conditions and nutrients for their respective areas and are used for characterization, forecasts, and scenario analyses. Model data can be downloaded from hypeweb.smhi.se. In addition, we provide models for the Arctic region, the Arab (Middle East and Northern Africa) region, India, the Niger River basin, the La Plata Basin. This demonstrates the applicability of the HYPE model for large scale modeling in different regions of the world. An important goal with our work is to make our data and tools available as open data and services. For this aim we created the HYPE Open Source Community (OSC) that makes the source code of HYPE available for anyone interested in further development of HYPE. The HYPE OSC (hype.sourceforge.net) is an open source initiative under the Lesser GNU Public License taken by SMHI to strengthen international collaboration in hydrological modeling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. The code is transparent and can be changed and learnt from. New versions of the main code are delivered frequently. HYPE OSC is open to everyone interested in hydrology, hydrological modeling and code development - e.g. scientists, authorities, and consultancies. By joining the HYPE OSC you get access a state-of-the-art operational hydrological model. The HYPE source code is designed to efficiently handle large scale modeling for forecast, hindcast and climate applications. The code is under constant development to improve the hydrological processes, efficiency and readability. In the beginning of 2013 we released a version with new and better modularization based on hydrological processes. This will make the code easier to understand and further develop for a new user. An important challenge in this process is to produce code that is easy for anyone to understand and work with, but still maintain the properties that make the code efficient enough for large scale applications. Input from the HYPE Open Source Community is an important source for future improvements of the HYPE model. Therefore, by joining the community you become an active part of the development, get access to the latest features and can influence future versions of the model.
The Efficient Utilization of Open Source Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baty, Samuel R.
These are a set of slides on the efficient utilization of open source information. Open source information consists of a vast set of information from a variety of sources. Not only does the quantity of open source information pose a problem, the quality of such information can hinder efforts. To show this, two case studies are mentioned: Iran and North Korea, in order to see how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information has no guarantee of accuracy. Open source information can provide keymore » insights either directly or indirectly: looking at supporting factors (flow of scientists, products and waste from mines, government budgets, etc.); direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows for a more complete picture to be formed. Overlapping sources allow for more precise bounds on times, weights, temperatures, yields or other issues of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual, but rather requires the utilization of a wide range of skill sets held by a team of people.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salinger, Andrew; Phipps, Eric; Ostien, Jakob
2016-01-13
The Albany code is a general-purpose finite element code for solving partial differential equations (PDEs). Albany is a research code that demonstrates how a PDE code can be built by interfacing many of the open-source software libraries that are released under Sandia's Trilinos project. Part of the mission of Albany is to be a testbed for new Trilinos libraries, to refine their methods, usability, and interfaces. Albany includes hooks to optimization and uncertainty quantification algorithms, including those in Trilinos as well as those in the Dakota toolkit. Because of this, Albany is a desirable starting point for new code developmentmore » efforts that wish to make heavy use of Trilinos. Albany is both a framework and the host for specific finite element applications. These applications have project names, and can be controlled by configuration option when the code is compiled, but are all developed and released as part of the single Albany code base, These include LCM, QCAD, FELIX, Aeras, and ATO applications.« less
Dog Movie Stars and Dog Breed Popularity: A Case Study in Media Influence on Choice
Ghirlanda, Stefano; Acerbi, Alberto; Herzog, Harold
2014-01-01
Fashions and fads are important phenomena that influence many individual choices. They are ubiquitous in human societies, and have recently been used as a source of data to test models of cultural dynamics. Although a few statistical regularities have been observed in fashion cycles, their empirical characterization is still incomplete. Here we consider the impact of mass media on popular culture, showing that the release of movies featuring dogs is often associated with an increase in the popularity of featured breeds, for up to 10 years after movie release. We also find that a movie's impact on breed popularity correlates with the estimated number of viewers during the movie's opening weekend—a proxy of the movie's reach among the general public. Movies' influence on breed popularity was strongest in the early 20th century, and has declined since. We reach these conclusions through a new, widely applicable method to measure the cultural impact of events, capable of disentangling the event's effect from ongoing cultural trends. PMID:25208271
Dog movie stars and dog breed popularity: a case study in media influence on choice.
Ghirlanda, Stefano; Acerbi, Alberto; Herzog, Harold
2014-01-01
Fashions and fads are important phenomena that influence many individual choices. They are ubiquitous in human societies, and have recently been used as a source of data to test models of cultural dynamics. Although a few statistical regularities have been observed in fashion cycles, their empirical characterization is still incomplete. Here we consider the impact of mass media on popular culture, showing that the release of movies featuring dogs is often associated with an increase in the popularity of featured breeds, for up to 10 years after movie release. We also find that a movie's impact on breed popularity correlates with the estimated number of viewers during the movie's opening weekend--a proxy of the movie's reach among the general public. Movies' influence on breed popularity was strongest in the early 20th century, and has declined since. We reach these conclusions through a new, widely applicable method to measure the cultural impact of events, capable of disentangling the event's effect from ongoing cultural trends.
A Padawan Programmer's Guide to Developing Software Libraries.
Yurkovich, James T; Yurkovich, Benjamin J; Dräger, Andreas; Palsson, Bernhard O; King, Zachary A
2017-11-22
With the rapid adoption of computational tools in the life sciences, scientists are taking on the challenge of developing their own software libraries and releasing them for public use. This trend is being accelerated by popular technologies and platforms, such as GitHub, Jupyter, R/Shiny, that make it easier to develop scientific software and by open-source licenses that make it easier to release software. But how do you build a software library that people will use? And what characteristics do the best libraries have that make them enduringly popular? Here, we provide a reference guide, based on our own experiences, for developing software libraries along with real-world examples to help provide context for scientists who are learning about these concepts for the first time. While we can only scratch the surface of these topics, we hope that this article will act as a guide for scientists who want to write great software that is built to last. Copyright © 2017 Elsevier Inc. All rights reserved.
Yao, Qingzhen; Wang, Xiaojing; Jian, Huimin; Chen, Hongtao; Yu, Zhigang
2016-02-15
Suspended particulate matter (SPM) samples were collected along a salinity gradient in the Changjiang Estuary in June 2011. A custom-built water elutriation apparatus was used to separate the suspended sediments into five size fractions. The results indicated that Cr and Pb originated from natural weathering processes, whereas Cu, Zn, and Cd originated from other sources. The distribution of most trace metals in different particle sizes increased with decreasing particle size. The contents of Fe/Mn and organic matter were confirmed to play an important role in increasing the level of heavy metal contents. The Cu, Pb, Zn, and Cd contents varied significantly with increasing salinity in the medium-low salinity region, thus indicating the release of Cu, Pb, Zn, and Cd particles. Thus, the transfer of polluted fine particles into the open sea is probably accompanied by release of pollutants into the dissolved compartment, thereby amplifying the potential harmful effects to marine organisms. Copyright © 2015 Elsevier Ltd. All rights reserved.
Brookes, Emre; Rocco, Mattia
2018-03-28
The UltraScan SOlution MOdeller (US-SOMO) is a comprehensive, public domain, open-source suite of computer programs centred on hydrodynamic modelling and small-angle scattering (SAS) data analysis and simulation. We describe here the advances that have been implemented since its last official release (#3087, 2017), which are available from release #3141 for Windows, Linux and Mac operating systems. A major effort has been the transition from the legacy Qt3 cross platform software development and user interface library to the modern Qt5 release. Apart from improved graphical support, this has allowed the direct implementation of the newest, almost two-orders of magnitude faster version of the ZENO hydrodynamic computation algorithm for all operating systems. Coupled with the SoMo-generated bead models with overlaps, ZENO provides the most accurate translational friction computations from atomic-level structures available (Rocco and Byron Eur Biophys J 44:417-431, 2015a), with computational times comparable with or faster than those of other methods. In addition, it has allowed us to introduce the direct representation of each atom in a structure as a (hydrated) bead, opening interesting new modelling possibilities. In the small-angle scattering (SAS) part of the suite, an indirect Fourier transform Bayesian algorithm has been implemented for the computation of the pairwise distance distribution function from SAS data. Finally, the SAS HPLC module, recently upgraded with improved baseline correction and Gaussian decomposition of not baseline-resolved peaks and with advanced statistical evaluation tools (Brookes et al. J Appl Cryst 49:1827-1841, 2016), now allows automatic top-peak frame selection and averaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul L. Wichlacz
2003-09-01
This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventorymore » of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information are needed (i.e., research needs) are also identified.« less
Verraedt, Els; Braem, Annabel; Chaudhari, Amol; Thevissen, Karin; Adams, Erwin; Van Mellaert, Lieve; Cammue, Bruno P A; Duyck, Joke; Anné, Jozef; Vleugels, Jef; Martens, Johan A
2011-10-31
Amorphous microporous silica (AMS) serving as a reservoir for controlled release of a bioactive agent was applied in the open porosity of a titanium coating on a Ti-6Al-4V metal substrate. The pores of the AMS emptied by calcination were loaded with chlorhexidine diacetate (CHX) via incipient wetness impregnation with CHX solution, followed by solvent evaporation. Using this CHX loaded AMS system on titanium substrate sustained release of CHX into physiological medium was obtained over a 10 day-period. CHX released from the AMS coating was demonstrated to be effective in killing planktonic cultures of the human pathogens Candida albicans and Staphylococcus epidermidis. This surface modification of titanium bodies with AMS controlled release functionality for a bioactive compound potentially can be applied on dental and orthopaedic implants to abate implant-associated microbial infection. Copyright © 2011 Elsevier B.V. All rights reserved.
Ultrasound-assisted endoscopic partial plantar fascia release.
Ohuchi, Hiroshi; Ichikawa, Ken; Shinga, Kotaro; Hattori, Soichi; Yamada, Shin; Takahashi, Kazuhisa
2013-01-01
Various surgical treatment procedures for plantar fasciitis, such as open surgery, percutaneous release, and endoscopic surgery, exist. Skin trouble, nerve disturbance, infection, and persistent pain associated with prolonged recovery time are complications of open surgery. Endoscopic partial plantar fascia release offers the surgeon clear visualization of the anatomy at the surgical site. However, the primary medial portal and portal tract used for this technique have been shown to be in close proximity to the posterior tibial nerves and their branches, and there is always the risk of nerve damage by introducing the endoscope deep to the plantar fascia. By performing endoscopic partial plantar fascia release under ultrasound assistance, we could dynamically visualize the direction of the endoscope and instrument introduction, thus preventing nerve damage from inadvertent insertion deep to the fascia. Full-thickness release of the plantar fascia at the ideal position could also be confirmed under ultrasound imaging. We discuss the technique for this new procedure.
Ultrasound-Assisted Endoscopic Partial Plantar Fascia Release
Ohuchi, Hiroshi; Ichikawa, Ken; Shinga, Kotaro; Hattori, Soichi; Yamada, Shin; Takahashi, Kazuhisa
2013-01-01
Various surgical treatment procedures for plantar fasciitis, such as open surgery, percutaneous release, and endoscopic surgery, exist. Skin trouble, nerve disturbance, infection, and persistent pain associated with prolonged recovery time are complications of open surgery. Endoscopic partial plantar fascia release offers the surgeon clear visualization of the anatomy at the surgical site. However, the primary medial portal and portal tract used for this technique have been shown to be in close proximity to the posterior tibial nerves and their branches, and there is always the risk of nerve damage by introducing the endoscope deep to the plantar fascia. By performing endoscopic partial plantar fascia release under ultrasound assistance, we could dynamically visualize the direction of the endoscope and instrument introduction, thus preventing nerve damage from inadvertent insertion deep to the fascia. Full-thickness release of the plantar fascia at the ideal position could also be confirmed under ultrasound imaging. We discuss the technique for this new procedure. PMID:24265989
The 2015 Bioinformatics Open Source Conference (BOSC 2015).
Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica
2016-02-01
The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.
Open Source, Openness, and Higher Education
ERIC Educational Resources Information Center
Wiley, David
2006-01-01
In this article David Wiley provides an overview of how the general expansion of open source software has affected the world of education in particular. In doing so, Wiley not only addresses the development of open source software applications for teachers and administrators, he also discusses how the fundamental philosophy of the open source…
The Emergence of Open-Source Software in North America
ERIC Educational Resources Information Center
Pan, Guohua; Bonk, Curtis J.
2007-01-01
Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…
Polymersome magneto-valves for reversible capture and release of nanoparticles
van Rhee, P.G.; Rikken, R.S.M.; Abdelmohsen, L.K.E.A.; Maan, J.C.; Nolte, R.J.M.; van Hest, J.C.M.; Christianen, P.C.M.; Wilson, D.A.
2014-01-01
Stomatocytes are polymersomes with an infolded bowl-shaped architecture. This internal cavity is connected to the outside environment via a small ‘mouth’ region. Stomatocytes are assembled from diamagnetic amphiphilic block-copolymers with a highly anisotropic magnetic susceptibility, which permits to magnetically align and deform the polymeric self-assemblies. Here we show the reversible opening and closing of the mouth region of stomatocytes in homogeneous magnetic fields. The control over the size of the opening yields magneto-responsive supramolecular valves that are able to reversibly capture and release cargo. Furthermore, the increase in the size of the opening is gradual and starts at fields below 10 T, which opens the possibility of using these structures for delivery and nanoreactor applications. PMID:25248402
An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.
Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D
2016-05-01
Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. Particularly, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be avoided. Data, extraction algorithms and evaluation routines were released as part of the fecgsyn toolbox on Physionet under an GNU GPL open-source license. This contribution provides a standard framework for benchmarking and regulatory testing of NI-FECG extraction algorithms.
Lewan, Michael; Sonnenfeld, Mark D.
2017-01-01
Low-temperature hydrous pyrolysis (LTHP) at 300°C (572°F) for 24 h released retained oils from 12- to 20-meshsize samples of mature Niobrara marly chalk and marlstone cores. The released oil accumulated on the water surface of the reactor, and is compositionally similar to oil produced from the same well. The quantities of oil released from the marly chalk and marlstone by LTHP are respectively 3.4 and 1.6 times greater than those determined by tight rock analyses (TRA) on aliquots of the same samples. Gas chromatograms indicated this difference is a result of TRA oils losing more volatiles and volatilizing less heavy hydrocarbons during collection than LTHP oils. Characterization of the rocks before and after LTPH by programmable open-system pyrolysis (HAWK) indicate that under LTHP conditions no significant oil is generated and only preexisting retained oil is released. Although LTHP appears to provide better predictions of quantity and quality of retained oil in a mature source rock, it is not expected to replace the more time and sample-size efficacy of TRA. However, LTHP can be applied to composited samples from key intervals or lithologies originally recognized by TRA. Additional studies on duration, temperature, and sample size used in LTHP may further optimize its utility.
The Open Microscopy Environment: open image informatics for the biological sciences
NASA Astrophysics Data System (ADS)
Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.
2016-07-01
Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).
Open data for water-related operational services, the SWITCH-ON approach
NASA Astrophysics Data System (ADS)
Mazzoli, Paolo; Bagli, Stefano; Valerio, Luzzi; Broccoli, Davide; Piccinini, Francesca
2017-04-01
Recently, a collaborative project started called SWITCH-ON (EU FP7 project No 603587) coordinated by SMHI (http://water-switch-on.eu/) as part of the contemporary European movement imposed by the INSPIRE directive and the Open Data Strategy. Among It's R&D activities GECOsistema develops and expands inside SWITCH-ON a set of online services to tackle major water related issues, from reservoir and irrigation supply, to hydrological change adaptation and hydropower potential mapping. Here we present major releases of APRIL, HyCAW and High-resolution European HydroPower Atlas; all of which make intense use of open data. APRIL is a tool for seasonal run-off forecasts, that takes advantage of open datasets or low-cost data and performs forecasts through calibrated machine learning algorithms. HyCAW is a wizard that supports the assessment of adaptation options to cope with change in the temporal distribution of water availability as well as in the total water quantity. EU HPA provides all relevant information necessary to appraise the feasibility of a micro-hydropower plant at a specific site, taking into account hydrological as well as technical and economic factors. All the tools share a common vision of the project to address water concerns and currently untapped potential of open data for improved water management across the EU. Users are guided through a Web GIS interface, created using open source Web Mapping Applications, Open-Layers and Map Server, to explore available hydrological information in the area of interest, plot available data, perform analysis, and get reports and statistics.
Allison, Thomas L.
2015-10-06
A door opening spring assistance apparatus is set forth that will automatically apply a door opening assistance force using a combination of rods and coil springs. The release of the rods by the coil springs reduces the force required to set the door in motion.
Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on
2011-01-01
Background The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to development of many useful resources freely available to the chemistry community. PMID:21999342
Open Genetic Code: on open source in the life sciences.
Deibel, Eric
2014-01-01
The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach of genetic engineering. The first section discusses the greater flexibly in regard of patenting and the relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life that is understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.
The Open Source Teaching Project (OSTP): Research Note.
ERIC Educational Resources Information Center
Hirst, Tony
The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…
Free for All: Open Source Software
ERIC Educational Resources Information Center
Schneider, Karen
2008-01-01
Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…
Ni, Hong-Gang; Lu, Shao-You; Mo, Ting; Zeng, Hui
2016-07-01
Based on the most widely used plastics in China, five plastic wastes were selected for investigation of brominated flame retardant (BFR) emission behaviors during open burning. Considerable variations were observed in the emission factors (EF) of polybrominated diphenyl ethers (PBDEs) and hexabromocyclododecanes (HBCDs) from the combustion of different plastic wastes. Distribution of BFR output mass showed that ΣPBDE was emitted mainly by the airborne particle (51%), followed by residual ash (44%) and the gas phase (5.1%); these values for ΣHBCD were 62%, 24%, and 14%, respectively. A lack of mass balance after the burning of the plastic wastes for some congeners (output/input mass ratios>1) suggested that formation and survival exceeded PBDE decomposition during the burns. However, that was not the case for HBCD. A comparison with literature data showed that the open burning of plastic waste is major source of PBDE compared to regulated combustion activities. Even for state-of-the-art waste incinerators equipped with sophisticated complex air pollution control technologies, BFRs are released on a small scale to the environment. According to our estimate, ΣPBDE release to the air and land from municipal solid waste (MSW) incineration plants in China in 2015 were 105 kg/year and 7124 kg/year. These data for ΣHBCD were 25.5 and 71.7 kg/year, respectively. Considering the fact that a growing number of cities in China are switching to incineration as the preferred method for MSW treatment, our estimate is especially important. This study provides the first data on the environmental exposure of BFRs emitted from MSW incineration in China. Copyright © 2016 Elsevier Ltd. All rights reserved.
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.
NASA Astrophysics Data System (ADS)
López García, Álvaro; Fernández del Castillo, Enol; Orviz Fernández, Pablo
In this document we present an implementation of the Open Grid Forum's Open Cloud Computing Interface (OCCI) for OpenStack, namely ooi (Openstack occi interface, 2015) [1]. OCCI is an open standard for management tasks over cloud resources, focused on interoperability, portability and integration. ooi aims to implement this open interface for the OpenStack cloud middleware, promoting interoperability with other OCCI-enabled cloud management frameworks and infrastructures. ooi focuses on being non-invasive with a vanilla OpenStack installation, not tied to a particular OpenStack release version.
A garage sale bargain: A leaking 2.2 GBq source, Phase III - The radiological cleanup
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vitkus, T.; Beck, W.L.; Freeman, B.
1996-06-01
As described in a previous paper, a private residence in Bristol, Tennessee, was extensively contaminated when the owner unknowingly handled a leaking {sup 226}Ra source of about 60 mCi. Contamination was found on both floors of the two-story house and in the yard. The most extensively contaminated area was a back porch where the owner initially opened the source containment box. Contamination was probably spread most by {open_quotes}tracking{close_quotes} by the owner, his wife, and several pet cats. One of the most contaminated objects found was a pillow, a favorite napping place for the cats, which read 25 mR h{sup -1}more » at contact. Several decon techniques were tried including stripable paints, washing with various agents, etc. However, decontamination was primarily accomplished by physical removal techniques, such as removing carpeting, scraping and sanding wooden surfaces, and by disposal of contaminated objects (much to the owners` dismay). The house and the yard were cleaned up to meet the recommended guidelines for unrestricted release with the expenditure of about 550 hours of effort. The groups assisting the Tennessee Division of Radiological Health included the Department of Energy, Tennessee Valley Authority`s Watts Bar Nuclear Plant, Oak Ridge Institute for Science and Education, Oak Ridge National Laboratory, and Scientific Ecology Group. All of the assistance was provided at no cost to the home owner or the State of Tennessee.« less
The Herschel-ATLAS Data Release 1 - II. Multi-wavelength counterparts to submillimetre sources
NASA Astrophysics Data System (ADS)
Bourne, N.; Dunne, L.; Maddox, S. J.; Dye, S.; Furlanetto, C.; Hoyos, C.; Smith, D. J. B.; Eales, S.; Smith, M. W. L.; Valiante, E.; Alpaslan, M.; Andrae, E.; Baldry, I. K.; Cluver, M. E.; Cooray, A.; Driver, S. P.; Dunlop, J. S.; Grootes, M. W.; Ivison, R. J.; Jarrett, T. H.; Liske, J.; Madore, B. F.; Popescu, C. C.; Robotham, A. G.; Rowlands, K.; Seibert, M.; Thompson, M. A.; Tuffs, R. J.; Viaene, S.; Wright, A. H.
2016-10-01
This paper is the second in a pair of papers presenting data release 1 (DR1) of the Herschel Astrophysical Terahertz Large Area Survey (H-ATLAS), the largest single open-time key project carried out with the Herschel Space Observatory. The H-ATLAS is a wide-area imaging survey carried out in five photometric bands at 100, 160, 250, 350 and 500 μm covering a total area of 600 deg2. In this paper, we describe the identification of optical counterparts to submillimetre sources in DR1, comprising an area of 161 deg2 over three equatorial fields of roughly 12 × 4.5 deg centred at 9h, 12h and 14{^h.}5, respectively. Of all the H-ATLAS fields, the equatorial regions benefit from the greatest overlap with current multi-wavelength surveys spanning ultraviolet (UV) to mid-infrared regimes, as well as extensive spectroscopic coverage. We use a likelihood ratio technique to identify Sloan Digital Sky Survey counterparts at r < 22.4 for 250-μm-selected sources detected at ≥4σ (≈28 mJy). We find `reliable' counterparts (reliability R ≥ 0.8) for 44 835 sources (39 per cent), with an estimated completeness of 73.0 per cent and contamination rate of 4.7 per cent. Using redshifts and multi-wavelength photometry from GAMA and other public catalogues, we show that H-ATLAS-selected galaxies at z < 0.5 span a wide range of optical colours, total infrared (IR) luminosities and IR/UV ratios, with no strong disposition towards mid-IR-classified active galactic nuclei in comparison with optical selection. The data described herein, together with all maps and catalogues described in the companion paper, are available from the H-ATLAS website at www.h-atlas.org.
Winslow, Luke; Zwart, Jacob A.; Batt, Ryan D.; Dugan, Hilary; Woolway, R. Iestyn; Corman, Jessica; Hanson, Paul C.; Read, Jordan S.
2016-01-01
Metabolism is a fundamental process in ecosystems that crosses multiple scales of organization from individual organisms to whole ecosystems. To improve sharing and reuse of published metabolism models, we developed LakeMetabolizer, an R package for estimating lake metabolism from in situ time series of dissolved oxygen, water temperature, and, optionally, additional environmental variables. LakeMetabolizer implements 5 different metabolism models with diverse statistical underpinnings: bookkeeping, ordinary least squares, maximum likelihood, Kalman filter, and Bayesian. Each of these 5 metabolism models can be combined with 1 of 7 models for computing the coefficient of gas exchange across the air–water interface (k). LakeMetabolizer also features a variety of supporting functions that compute conversions and implement calculations commonly applied to raw data prior to estimating metabolism (e.g., oxygen saturation and optical conversion models). These tools have been organized into an R package that contains example data, example use-cases, and function documentation. The release package version is available on the Comprehensive R Archive Network (CRAN), and the full open-source GPL-licensed code is freely available for examination and extension online. With this unified, open-source, and freely available package, we hope to improve access and facilitate the application of metabolism in studies and management of lentic ecosystems.
ProteoWizard: open source software for rapid proteomics tools development.
Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag
2008-11-01
The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness
ERIC Educational Resources Information Center
Committee for Economic Development, 2006
2006-01-01
Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…
Bayesian estimation of a source term of radiation release with approximately known nuclide ratios
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek
2016-04-01
We are concerned with estimation of a source term in case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises of a mixture of nuclide. The gamma dose rate measurements do not provide a direct information on the source term composition. However, physical properties of respective nuclide (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from known reactor inventory. The proposed method is based on linear inverse model where the observation vector y arise as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that nuclide ratios of the release is known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using multivariate truncated Gaussian distribution. Following Bayesian approach, we estimate all parameters of the model from the data so that y, M, and known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on simulated 6 hour power plant release where 3 nuclide are released and 2 nuclide ratios are approximately known. The comparison with method with unknown nuclide ratios will be given to prove the usefulness of the proposed approach. This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
The 2015 Bioinformatics Open Source Conference (BOSC 2015)
Harris, Nomi L.; Cock, Peter J. A.; Lapp, Hilmar
2016-01-01
The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included “Data Science;” “Standards and Interoperability;” “Open Science and Reproducibility;” “Translational Bioinformatics;” “Visualization;” and “Bioinformatics Open Source Project Updates”. In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled “Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community,” that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653
NASA Astrophysics Data System (ADS)
Kundu, Sumit; Fowler, Michael W.; Simon, Leonardo C.; Abouatallah, Rami; Beydokhti, Natasha
Fuel cell material durability is an area of extensive research today. Chemical degradation of the ionomer membrane is one important degradation mechanism leading to overall failure of fuel cells. This study examined the effects of relative humidity on the chemical degradation of the membrane during open circuit voltage testing. Five Gore™ PRIMEA ® series 5510 catalyst coated membranes were degraded at 100%, 75%, 50%, and 20% RH. Open circuit potential and cumulative fluoride release were monitored over time. Additionally scanning electron microscopy images were taken at end of the test. The results showed that with decreasing RH fluoride release rate increased as did performance degradation. This was attributed to an increase in gas crossover with a decrease in RH. Further, it is also shown that interruptions in testing may heavily influence cumulative fluoride release measurements where frequent stoppages in testing will cause fluoride release to be underestimated. SEM analysis shows that degradation occurred in the ionomer layer close to the cathode catalyst. A chemical degradation model of the ionomer membrane was used to model the results. The model was able to predict fluoride release trends, including the effects of interruptions, showing that changes in gas crossover with RH could explain the experimental results.
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Šindelářová, Kateřina; Hýža, Miroslav; Stohl, Andreas
2017-10-01
In the fall of 2011, iodine-131 (131I) was detected at several radionuclide monitoring stations in central Europe. After investigation, the International Atomic Energy Agency (IAEA) was informed by Hungarian authorities that 131I was released from the Institute of Isotopes Ltd. in Budapest, Hungary. It was reported that a total activity of 342 GBq of 131I was emitted between 8 September and 16 November 2011. In this study, we use the ambient concentration measurements of 131I to determine the location of the release as well as its magnitude and temporal variation. As the location of the release and an estimate of the source strength became eventually known, this accident represents a realistic test case for inversion models. For our source reconstruction, we use no prior knowledge. Instead, we estimate the source location and emission variation using only the available 131I measurements. Subsequently, we use the partial information about the source term available from the Hungarian authorities for validation of our results. For the source determination, we first perform backward runs of atmospheric transport models and obtain source-receptor sensitivity (SRS) matrices for each grid cell of our study domain. We use two dispersion models, FLEXPART and Hysplit, driven with meteorological analysis data from the global forecast system (GFS) and from European Centre for Medium-range Weather Forecasts (ECMWF) weather forecast models. Second, we use a recently developed inverse method, least-squares with adaptive prior covariance (LS-APC), to determine the 131I emissions and their temporal variation from the measurements and computed SRS matrices. For each grid cell of our simulation domain, we evaluate the probability that the release was generated in that cell using Bayesian model selection. The model selection procedure also provides information about the most suitable dispersion model for the source term reconstruction. Third, we select the most probable location of the release with its associated source term and perform a forward model simulation to study the consequences of the iodine release. Results of these procedures are compared with the known release location and reported information about its time variation. We find that our algorithm could successfully locate the actual release site. The estimated release period is also in agreement with the values reported by IAEA and the reported total released activity of 342 GBq is within the 99 % confidence interval of the posterior distribution of our most likely model.
Mitochondrial Reactive Oxygen Species (ROS) and ROS-Induced ROS Release
Zorov, Dmitry B.; Juhaszova, Magdalena; Sollott, Steven J.
2014-01-01
Byproducts of normal mitochondrial metabolism and homeostasis include the buildup of potentially damaging levels of reactive oxygen species (ROS), Ca2+, etc., which must be normalized. Evidence suggests that brief mitochondrial permeability transition pore (mPTP) openings play an important physiological role maintaining healthy mitochondria homeostasis. Adaptive and maladaptive responses to redox stress may involve mitochondrial channels such as mPTP and inner membrane anion channel (IMAC). Their activation causes intra- and intermitochondrial redox-environment changes leading to ROS release. This regenerative cycle of mitochondrial ROS formation and release was named ROS-induced ROS release (RIRR). Brief, reversible mPTP opening-associated ROS release apparently constitutes an adaptive housekeeping function by the timely release from mitochondria of accumulated potentially toxic levels of ROS (and Ca2+). At higher ROS levels, longer mPTP openings may release a ROS burst leading to destruction of mitochondria, and if propagated from mitochondrion to mitochondrion, of the cell itself. The destructive function of RIRR may serve a physiological role by removal of unwanted cells or damaged mitochondria, or cause the pathological elimination of vital and essential mitochondria and cells. The adaptive release of sufficient ROS into the vicinity of mitochondria may also activate local pools of redox-sensitive enzymes involved in protective signaling pathways that limit ischemic damage to mitochondria and cells in that area. Maladaptive mPTP- or IMAC-related RIRR may also be playing a role in aging. Because the mechanism of mitochondrial RIRR highlights the central role of mitochondria-formed ROS, we discuss all of the known ROS-producing sites (shown in vitro) and their relevance to the mitochondrial ROS production in vivo. PMID:24987008
The 2016 Bioinformatics Open Source Conference (BOSC).
Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather
2016-01-01
Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.
Menze, Michael A; Hutchinson, Kirk; Laborde, Susan M; Hand, Steven C
2005-07-01
When mammalian mitochondria are exposed to high calcium and phosphate, a massive swelling, uncoupling of respiration, and release of cytochrome c occur. These changes are mediated by opening of the mitochondrial permeability transition pore (MPTP). Activation of the MPTP in vivo in response to hypoxic and oxidative stress leads to necrotic and apoptotic cell death. Considering that embryos of the brine shrimp Artemia franciscana tolerate anoxia for years, we investigated the MPTP in this crustacean to reveal whether pore opening occurs. Minimum molecular constituents of the regulated MPTP in mammals are believed to be the voltage-dependent anion channel, the adenine nucleotide translocators, and cyclophilin D. Western blot analysis revealed that mitochondria from A. franciscana possess all three required components. When measured with a calcium-sensitive fluorescent probe, rat liver mitochondria are shown to release matrix calcium after addition of >/=100 microM extramitochondrial calcium (MPTP opening), whereas brine shrimp mitochondria continue to take up extramitochondrial calcium and do not release internal stores even up to 1.0 mM exogenously added calcium (no MPTP opening). Furthermore, no swelling of A. franciscana mitochondria in response to added calcium was observed, and no release of cytochrome c could be detected. HgCl(2)-dependent swelling and cytochrome c release were readily confirmed, which is consistent with the presence of an "unregulated pore." Although the absence of a regulated MPTP in A. franciscana mitochondria could contribute to the extreme hypoxia tolerance in this species, we speculate that absence of the regulated MPTP may be a general feature of invertebrates.
ERIC Educational Resources Information Center
Villano, Matt
2006-01-01
This article presents an interview with Jim Hirsch, an associate superintendent for technology at Piano Independent School District in Piano, Texas. Hirsch serves as a liaison for the open technologies committee of the Consortium for School Networking. In this interview, he shares his opinion on the significance of open source in K-12.
Integration of tools for binding archetypes to SNOMED CT.
Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan
2008-10-27
The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.
Integration of tools for binding archetypes to SNOMED CT
Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Åhlfeldt, Hans; Rector, Alan
2008-01-01
Background The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Methods Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. Results An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Conclusion Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail. PMID:19007444
Microscale Obstacle Resolving Air Quality Model Evaluation with the Michelstadt Case
Rakai, Anikó; Kristóf, Gergely
2013-01-01
Modelling pollutant dispersion in cities is challenging for air quality models as the urban obstacles have an important effect on the flow field and thus the dispersion. Computational Fluid Dynamics (CFD) models with an additional scalar dispersion transport equation are a possible way to resolve the flowfield in the urban canopy and model dispersion taking into consideration the effect of the buildings explicitly. These models need detailed evaluation with the method of verification and validation to gain confidence in their reliability and use them as a regulatory purpose tool in complex urban geometries. This paper shows the performance of an open source general purpose CFD code, OpenFOAM for a complex urban geometry, Michelstadt, which has both flow field and dispersion measurement data. Continuous release dispersion results are discussed to show the strengths and weaknesses of the modelling approach, focusing on the value of the turbulent Schmidt number, which was found to give best statistical metric results with a value of 0.7. PMID:24027450
Microscale obstacle resolving air quality model evaluation with the Michelstadt case.
Rakai, Anikó; Kristóf, Gergely
2013-01-01
Modelling pollutant dispersion in cities is challenging for air quality models as the urban obstacles have an important effect on the flow field and thus the dispersion. Computational Fluid Dynamics (CFD) models with an additional scalar dispersion transport equation are a possible way to resolve the flowfield in the urban canopy and model dispersion taking into consideration the effect of the buildings explicitly. These models need detailed evaluation with the method of verification and validation to gain confidence in their reliability and use them as a regulatory purpose tool in complex urban geometries. This paper shows the performance of an open source general purpose CFD code, OpenFOAM for a complex urban geometry, Michelstadt, which has both flow field and dispersion measurement data. Continuous release dispersion results are discussed to show the strengths and weaknesses of the modelling approach, focusing on the value of the turbulent Schmidt number, which was found to give best statistical metric results with a value of 0.7.
Novel inter and intra prediction tools under consideration for the emerging AV1 video codec
NASA Astrophysics Data System (ADS)
Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil
2017-09-01
Google started the WebM Project in 2010 to develop open source, royalty- free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next edition codec AV1, in a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter and combined inter-intra prediction. Results are presented on standard test sets.
OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale
NASA Astrophysics Data System (ADS)
Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason
2015-03-01
The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based proprietary file format conversion tool and OMERO, an enterprise data management platform under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio- Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful for use in imaging applications like digital pathology, high content screening and light sheet microscopy that create routinely large datasets that must be managed and analyzed.
Youpi: A Web-based Astronomical Image Processing Pipeline
NASA Astrophysics Data System (ADS)
Monnerville, M.; Sémah, G.
2010-12-01
Youpi stands for “YOUpi is your processing PIpeline”. It is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. It is built on top of open source processing tools that are released to the community by Terapix, in order to organize your data on a computer cluster, to manage your processing jobs in real time and to facilitate teamwork by allowing fine-grain sharing of results and data. On the server side, Youpi is written in the Python programming language and uses the Django web framework. On the client side, Ajax techniques are used along with the Prototype and script.aculo.us Javascript librairies.
The Ozone Widget Framework: towards modularity of C2 human interfaces
NASA Astrophysics Data System (ADS)
Hellar, David Benjamin; Vega, Laurian C.
2012-05-01
The Ozone Widget Framework (OWF) is a common webtop environment for distribution across the enterprise. A key mission driver for OWF is to enable rapid capability delivery by lowering time-to-market with lightweight components. OWF has been released as Government Open Source Software and has been deployed in a variety of C2 net-centric contexts ranging from real-time analytics, cyber-situational awareness, to strategic and operational planning. This paper discusses the current and future evolution of OWF including the availability of the OZONE Marketplace (OMP), useractivity driven metrics, and architecture enhancements for accessibility. Together, OWF is moving towards the rapid delivery of modular human interfaces supporting modern and future command and control contexts.
UpSetR: an R package for the visualization of intersecting sets and their properties.
Conway, Jake R; Lex, Alexander; Gehlenborg, Nils
2017-09-15
Venn and Euler diagrams are a popular yet inadequate solution for quantitative visualization of set intersections. A scalable alternative to Venn and Euler diagrams for visualizing intersecting sets and their properties is needed. We developed UpSetR, an open source R package that employs a scalable matrix-based visualization to show intersections of sets, their size, and other properties. UpSetR is available at https://github.com/hms-dbmi/UpSetR/ and released under the MIT License. A Shiny app is available at https://gehlenborglab.shinyapps.io/upsetr/ . nils@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
EMISSIONS OF ORGANIC AIR TOXICS FROM OPEN ...
A detailed literature search was performed to collect and collate available data reporting emissions of toxic organic substances into the air from open burning sources. Availability of data varied according to the source and the class of air toxics of interest. Volatile organic compound (VOC) and polycyclic aromatic hydrocarbon (PAH) data were available for many of the sources. Data on semivolatile organic compounds (SVOCs) that are not PAHs were available for several sources. Carbonyl and polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofuran (PCDD/F) data were available for only a few sources. There were several sources for which no emissions data were available at all. Several observations were made including: 1) Biomass open burning sources typically emitted less VOCs than open burning sources with anthropogenic fuels on a mass emitted per mass burned basis, particularly those where polymers were concerned; 2) Biomass open burning sources typically emitted less SVOCs and PAHs than anthropogenic sources on a mass emitted per mass burned basis. Burning pools of crude oil and diesel fuel produced significant amounts of PAHs relative to other types of open burning. PAH emissions were highest when combustion of polymers was taking place; and 3) Based on very limited data, biomass open burning sources typically produced higher levels of carbonyls than anthropogenic sources on a mass emitted per mass burned basis, probably due to oxygenated structures r
Arctic sea ice melt leads to atmospheric new particle formation.
Dall Osto, M; Beddows, D C S; Tunved, P; Krejci, R; Ström, J; Hansson, H-C; Yoon, Y J; Park, Ki-Tae; Becagli, S; Udisti, R; Onasch, T; O Dowd, C D; Simó, R; Harrison, Roy M
2017-06-12
Atmospheric new particle formation (NPF) and growth significantly influences climate by supplying new seeds for cloud condensation and brightness. Currently, there is a lack of understanding of whether and how marine biota emissions affect aerosol-cloud-climate interactions in the Arctic. Here, the aerosol population was categorised via cluster analysis of aerosol size distributions taken at Mt Zeppelin (Svalbard) during a 11 year record. The daily temporal occurrence of NPF events likely caused by nucleation in the polar marine boundary layer was quantified annually as 18%, with a peak of 51% during summer months. Air mass trajectory analysis and atmospheric nitrogen and sulphur tracers link these frequent nucleation events to biogenic precursors released by open water and melting sea ice regions. The occurrence of such events across a full decade was anti-correlated with sea ice extent. New particles originating from open water and open pack ice increased the cloud condensation nuclei concentration background by at least ca. 20%, supporting a marine biosphere-climate link through sea ice melt and low altitude clouds that may have contributed to accelerate Arctic warming. Our results prompt a better representation of biogenic aerosol sources in Arctic climate models.
2011-09-01
a NSS that lies in this negative explosion positive CLVD quadrant due to the large degree of tectonic release in this event that reversed the phase...Mellman (1986) in their analysis of fundamental model Love and Rayleigh wave amplitude and phase for nuclear and tectonic release source terms, and...1986). Estimating explosion and tectonic release source parameters of underground nuclear explosions from Rayleigh and Love wave observations, Air
Exposure Assessment of a High-energy Tensile Test With Large Carbon Fiber Reinforced Polymer Cables.
Schlagenhauf, Lukas; Kuo, Yu-Ying; Michel, Silvain; Terrasi, Giovanni; Wang, Jing
2015-01-01
This study investigated the particle and fiber release from two carbon fiber reinforced polymer cables that underwent high-energy tensile tests until rupture. The failing event was the source of a large amount of dust whereof a part was suspected to be containing possibly respirable fibers that could cause adverse health effects. The released fibers were suspected to migrate through small openings to the experiment control room and also to an adjacent machine hall where workers were active. To investigate the fiber release and exposure risk of the affected workers, the generated particles were measured with aerosol devices to obtain the particle size and particle concentrations. Furthermore, particles were collected on filter samples to investigate the particle shape and the fiber concentration. Three situations were monitored for the control room and the machine hall: the background concentrations, the impact of the cable failure, and the venting of the exposed rooms afterward. The results showed four important findings: The cable failure caused the release of respirable fibers with diameters below 3 μm and an average length of 13.9 μm; the released particles did migrate to the control room and to the machine hall; the measured peak fiber concentration of 0.76 fibers/cm(3) and the overall fiber concentration of 0.07 fibers/cm(3) in the control room were below the Permissible Exposure Limit (PEL) for fibers without indication of carcinogenicity; and the venting of the rooms was fast and effective. Even though respirable fibers were released, the low fiber concentration and effective venting indicated that the suspected health risks from the experiment on the affected workers was low. However, the effect of long-term exposure is not known therefore additional control measures are recommended.
18 CFR 157.34 - Notice of open season.
Code of Federal Regulations, 2011 CFR
2011-04-01
... including postings on Internet Web sites, press releases, direct mail solicitations, and other advertising... open season or allocation of capacity that is not posted on the open season Internet Web site or that... due to economic, engineering, design, capacity or operational constraints, or accommodating the...
FACTORS RELATING TO THE RELEASE OF STACHYBOTRYS CHARTARUM SPORES FROM CONTAMINATED SOURCES
The paper describes preliminary results of a research project to determine the factors that control the release of S. chartarum spores from a contaminated source and test ways to reduce spore release and thus exposure. As anticipated, S. chartarum spore emissions from gypsum boar...
BORON RELEASE FROM WEATHERING ILLITES, SERPENTINE, SHALES, AND ILLITIC/PALYGORSKITIC SOILS
Despite extensive research on B adsorption and release from soils, mineral sources of B within natively high B soils remain poorly under- stood. The objectives of this study were to identify source minerals contributing to the continued B release after extraction of soluble B and...
OpenStereo: Open Source, Cross-Platform Software for Structural Geology Analysis
NASA Astrophysics Data System (ADS)
Grohmann, C. H.; Campanha, G. A.
2010-12-01
Free and open source software (FOSS) are increasingly seen as synonyms of innovation and progress. Freedom to run, copy, distribute, study, change and improve the software (through access to the source code) assure a high level of positive feedback between users and developers, which results in stable, secure and constantly updated systems. Several software packages for structural geology analysis are available to the user, with commercial licenses or that can be downloaded at no cost from the Internet. Some provide basic tools of stereographic projections such as plotting poles, great circles, density contouring, eigenvector analysis, data rotation etc, while others perform more specific tasks, such as paleostress or geotechnical/rock stability analysis. This variety also means a wide range of data formating for input, Graphical User Interface (GUI) design and graphic export format. The majority of packages is built for MS-Windows and even though there are packages for the UNIX-based MacOS, there aren't native packages for *nix (UNIX, Linux, BSD etc) Operating Systems (OS), forcing the users to run these programs with emulators or virtual machines. Those limitations lead us to develop OpenStereo, an open source, cross-platform software for stereographic projections and structural geology. The software is written in Python, a high-level, cross-platform programming language and the GUI is designed with wxPython, which provide a consistent look regardless the OS. Numeric operations (like matrix and linear algebra) are performed with the Numpy module and all graphic capabilities are provided by the Matplolib library, including on-screen plotting and graphic exporting to common desktop formats (emf, eps, ps, pdf, png, svg). Data input is done with simple ASCII text files, with values of dip direction and dip/plunge separated by spaces, tabs or commas. The user can open multiple file at the same time (or the same file more than once), and overlay different elements of each dataset (poles, great circles etc). The GUI shows the opened files in a tree structure, similar to “layers” of many illustration software, where the vertical order of the files in the tree reflects the drawing order of the selected elements. At this stage, the software performs plotting operations of poles to planes, lineations, great circles, density contours and rose diagrams. A set of statistics is calculated for each file and its eigenvalues and eigenvectors are used to suggest if the data is clustered about a mean value or distributed along a girdle. Modified Flinn, Triangular and histograms plots are also available. Next step of development will focus on tools as merging and rotation of datasets, possibility to save 'projects' and paleostress analysis. In its current state, OpenStereo requires Python, wxPython, Numpy and Matplotlib installed in the system. We recommend installing PythonXY or the Enthought Python Distribution on MS-Windows and MacOS machines, since all dependencies are provided. Most Linux distributions provide an easy way to install all dependencies through software repositories. OpenStereo is released under the GNU General Public License. Programmers willing to contribute are encouraged to contact the authors directly. FAPESP Grant #09/17675-5
Open-Source 3D-Printable Optics Equipment
Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.
2013-01-01
Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform are illustrated as control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make its clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation both as research and teaching platforms than previous proprietary methods. PMID:23544104
Open-source 3D-printable optics equipment.
Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M
2013-01-01
Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform are illustrated as control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make its clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation both as research and teaching platforms than previous proprietary methods.
Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...
Coal fly ash as a source of iron in atmospheric dust.
Chen, Haihan; Laskin, Alexander; Baltrusaitis, Jonas; Gorski, Christopher A; Scherer, Michelle M; Grassian, Vicki H
2012-02-21
Anthropogenic coal fly ash (FA) aerosol may represent a significant source of bioavailable iron in the open ocean. Few measurements have been made that compare the solubility of atmospheric iron from anthropogenic aerosols and other sources. We report here an investigation of iron dissolution for three FA samples in acidic aqueous solutions and compare the solubilities with that of Arizona test dust (AZTD), a reference material for mineral dust. The effects of pH, simulated cloud processing, and solar radiation on iron solubility have been explored. Similar to previously reported results on mineral dust, iron in aluminosilicate phases provides the predominant component of dissolved iron. Iron solubility of FA is substantially higher than of the crystalline minerals comprising AZTD. Simulated atmospheric processing elevates iron solubility due to significant changes in the morphology of aluminosilicate glass, a dominant material in FA particles. Iron is continuously released into the aqueous solution as FA particles break up into smaller fragments. These results suggest that the assessment of dissolved atmospheric iron deposition fluxes and their effect on the biogeochemistry at the ocean surface should be constrained by the source, environmental pH, iron speciation, and solar radiation.
Defining Openness: Updating the Concept of "Open" for a Connected World
ERIC Educational Resources Information Center
McAndrew, Patrick
2010-01-01
The release of free resources by the education sector has led to reconsideration of how the open approach implied by Open Educational Resources (OER) impacts on the educator and the learner. However this work has tended to consider the replication of standard campus based approaches and the characteristics of content that will encourage other…
Vegetation in group selection openings: ecology and manipulation
Philip M. McDonald; Gary O. Fiddler
1991-01-01
Group selection openings ranging from 0.1 to 2.0 acres in mixed conifer stands in northern and central California were evaluated for effect of site preparation, opening size, kind and amount of vegetation, and release treatment. Small openings, in general, are characterized by less sunlight and lower temperature extremes than clearcuttings. Roots from border trees...
30 CFR 57.22105 - Smoking and open flames (IV mines).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Smoking and open flames (IV mines). 57.22105... Standards for Methane in Metal and Nonmetal Mines Fire Prevention and Control § 57.22105 Smoking and open flames (IV mines). Smoking or open flames shall not be permitted in a face or raise, or during release of...
30 CFR 57.22105 - Smoking and open flames (IV mines).
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Smoking and open flames (IV mines). 57.22105... Standards for Methane in Metal and Nonmetal Mines Fire Prevention and Control § 57.22105 Smoking and open flames (IV mines). Smoking or open flames shall not be permitted in a face or raise, or during release of...
30 CFR 57.22105 - Smoking and open flames (IV mines).
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Smoking and open flames (IV mines). 57.22105... Standards for Methane in Metal and Nonmetal Mines Fire Prevention and Control § 57.22105 Smoking and open flames (IV mines). Smoking or open flames shall not be permitted in a face or raise, or during release of...
30 CFR 57.22105 - Smoking and open flames (IV mines).
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Smoking and open flames (IV mines). 57.22105... Standards for Methane in Metal and Nonmetal Mines Fire Prevention and Control § 57.22105 Smoking and open flames (IV mines). Smoking or open flames shall not be permitted in a face or raise, or during release of...
Current trend in drug delivery considerations for subcutaneous insulin depots to treat diabetes.
P V, Jayakrishnapillai; Nair, Shantikumar V; Kamalasanan, Kaladhar
2017-05-01
Diabetes mellitus (DM) is a metabolic disorder due to irregularities in glucose metabolism, as a result of insulin disregulation. Chronic DM (Type 1) is treated by daily insulin injections by subcutaneous route. Daily injections cause serious patient non-compliance and medication non-adherence. Insulin Depots (ID) are parenteral formulations designed to release the insulin over a specified period of time, to control the plasma blood glucose level for intended duration. Physiologically, pancreas produces and secretes insulin in basal and pulsatile mode into the blood. Delivery systems mimicking basal release profiles are known as open-loop systems and current marketed products are open-loop systems. Future trend in open-loop systems is to reduce the number of injections per week by enhancing duration of action, by modifying the depot properties. The next generation technologies are closed-loop systems that mimic the pulsatile mode of delivery by pancreas. In closed-loop systems insulin will be released in response to plasma glucose. This review focuses on future trend in open-loop systems; by understanding (a) the secretion of insulin from pancreas, (b) the insulin regulation normal and in DM, (c) insulin depots and (d) the recent progress in open-loop depot technology particularly with respect to nanosystems. Copyright © 2017 Elsevier B.V. All rights reserved.
Elzinga, Kate E; Curran, Matthew W T; Morhart, Michael J; Chan, K Ming; Olson, Jaret L
2016-07-01
Reconstruction of the suprascapular nerve (SSN) after brachial plexus injury often involves nerve grafting or a nerve transfer. To restore shoulder abduction and external rotation, a branch of the spinal accessory nerve is commonly transferred to the SSN. To allow reinnervation of the SSN, any potential compression points should be released to prevent a possible double crush syndrome. For that reason, the authors perform a release of the superior transverse scapular ligament at the suprascapular notch in all patients undergoing reconstruction of the upper trunk of the brachial plexus. Performing the release through a standard anterior open supraclavicular approach to the brachial plexus avoids the need for an additional posterior incision or arthroscopic procedure. Copyright © 2016 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Jin, Jin; Zimmerman, Andrew R; Norton, Stuart B; Annable, Michael D; Harris, Willie G
2016-05-01
While aquifer storage and recovery (ASR) is becoming widely accepted as a way to address water supply shortages, there are concerns that it may lead to release of harmful trace elements such as arsenic (As). Thus, mechanisms of As release from limestone during ASR operations were investigated using 110-day laboratory incubations of core material collected from the Floridan Aquifer, with treatment additions of labile or refractory dissolved organic matter (DOM) or microbes. During the first experimental phase, core materials were equilibrated with native groundwater lacking in DO to simulate initial non-perturbed anaerobic aquifer conditions. Then, ASR was simulated by replacing the native groundwater in the incubations vessels with DO-rich ASR source water, with DOM or microbes added to some treatments. Finally, the vessels were opened to the atmosphere to mimic oxidizing conditions during later stages of ASR. Arsenic was released from aquifer materials, mainly during transitional periods at the beginning of each incubation stage. Most As released was during the initial anaerobic experimental phase via reductive dissolution of Fe oxides in the core materials, some or all of which may have formed during the core storage or sample preparation period. Oxidation of As-bearing Fe sulfides released smaller amounts of As during the start of later aerobic experimental phases. Additions of labile DOM fueled microbially-mediated reactions that mobilized As, while the addition of refractory DOM did not, probably due to mineral sorption of DOM that made it unavailable for microbial utilization or metal chelation. The results suggest that oscillations of groundwater redox conditions, such as might be expected to occur during an ASR operation, are the underlying cause of enhanced As release in these systems. Further, ASR operations using DOM-rich surface waters may not necessarily lead to additional As releases. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Saxena, A.; Choi, E.; Powell, C. A.
2017-12-01
The mechanism behind the seismicity of the New Madrid Seismic Zone (NMSZ), the major intraplate earthquake source in the Central and Eastern US (CEUS), is still debated but new insights are being provided by recent tomographic studies involving USArray. A high-resolution tomography study by Nyamwandha et al. (2016) in the NMSZ indicates the presence of low (3 % - 5 %) upper mantle Vp and Vs anomalies in the depth range 100 to 250 km. The elevated anomaly magnitudes are difficult to explain by temperature alone. As the low-velocity anomalies beneath the northeast China are attributed to fluids released from the stagnant Pacific slab, water released from the stagnant Laramide Slab, presently located at transition zone depths beneath the CEUS might be contributing to the low velocity features in this region's upper mantle. Here, we investigate the potential impact of the slab-released fluids on the stresses at seismogenic depths using numerical modeling. We convert the tomographic results into temperature field under various assumed values of spatially uniform water content. In more realistic cases, water content is added only when the converted temperature exceeds the melting temperature of olivine. Viscosities are then computed based on the temperature and water content and given to our geodynamic models created by Pylith, an open source software for crustal dynamics. The model results show that increasing water content weakens the upper mantle more than temperature alone and thus elevates the differential stress in the upper crust. These results can better explain the tomography results and seismicity without invoking melting. We also invert the tomography results for volume fraction of orthopyroxene and temperature and compare the resultant stresses with those for pure olivine. To enhance the reproducibility, selected models in this study will be made available in the form of sharable and reproducible packages enabled by EarthCube Building block project, GeoTrust.
The 2016 Bioinformatics Open Source Conference (BOSC)
Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather
2016-01-01
Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science. PMID:27781083
46 CFR 308.532 - Release of surety bond, Form MA-312.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 8 2014-10-01 2014-10-01 false Release of surety bond, Form MA-312. 308.532 Section 308.532 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Open Policy War Risk Cargo Insurance § 308.532 Release of surety bond...
Rawification and the careful generation of open government data.
Denis, Jérôme; Goëta, Samuel
2017-10-01
Drawing on a two-year ethnographic study within several French administrations involved in open data programs, this article aims to investigate the conditions of the release of government data - the rawness of which open data policies require. This article describes two sets of phenomena. First, far from being taken for granted, open data emerge in administrations through a progressive process that entails uncertain collective inquiries and extraction work. Second, the opening process draws on a series of transformations, as data are modified to satisfy an important criterion of open data policies: the need for both human and technical intelligibility. There are organizational consequences of these two points, which can notably lead to the visibilization or the invisibilization of data labour. Finally, the article invites us to reconsider the apparent contradiction between the process of data release and the existence of raw data. Echoing the vocabulary of one of the interviewees, the multiple operations can be seen as a 'rawification' process by which open government data are carefully generated. Such a notion notably helps to build a relational model of what counts as data and what counts as work.
Global biogeochemical implications of mercury discharges from rivers and sediment burial.
Amos, Helen M; Jacob, Daniel J; Kocman, David; Horowitz, Hannah M; Zhang, Yanxu; Dutkiewicz, Stephanie; Horvat, Milena; Corbitt, Elizabeth S; Krabbenhoft, David P; Sunderland, Elsie M
2014-08-19
Rivers are an important source of mercury (Hg) to marine ecosystems. Based on an analysis of compiled observations, we estimate global present-day Hg discharges from rivers to ocean margins are 27 ± 13 Mmol a(-1) (5500 ± 2700 Mg a(-1)), of which 28% reaches the open ocean and the rest is deposited to ocean margin sediments. Globally, the source of Hg to the open ocean from rivers amounts to 30% of atmospheric inputs. This is larger than previously estimated due to accounting for elevated concentrations in Asian rivers and variability in offshore transport across different types of estuaries. Riverine inputs of Hg to the North Atlantic have decreased several-fold since the 1970s while inputs to the North Pacific have increased. These trends have large effects on Hg concentrations at ocean margins but are too small in the open ocean to explain observed declines of seawater concentrations in the North Atlantic or increases in the North Pacific. Burial of Hg in ocean margin sediments represents a major sink in the global Hg biogeochemical cycle that has not been previously considered. We find that including this sink in a fully coupled global biogeochemical box model helps to balance the large anthropogenic release of Hg from commercial products recently added to global inventories. It also implies that legacy anthropogenic Hg can be removed from active environmental cycling on a faster time scale (centuries instead of millennia). Natural environmental Hg levels are lower than previously estimated, implying a relatively larger impact from human activity.
OpenMx: An Open Source Extended Structural Equation Modeling Framework
ERIC Educational Resources Information Center
Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John
2011-01-01
OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…
a Framework for AN Open Source Geospatial Certification Model
NASA Astrophysics Data System (ADS)
Khan, T. U. R.; Davis, P.; Behr, F.-J.
2016-06-01
The geospatial industry is forecasted to have an enormous growth in the forthcoming years and an extended need for well-educated workforce. Hence ongoing education and training play an important role in the professional life. Parallel, in the geospatial and IT arena as well in the political discussion and legislation Open Source solutions, open data proliferation, and the use of open standards have an increasing significance. Based on the Memorandum of Understanding between International Cartographic Association, OSGeo Foundation, and ISPRS this development led to the implementation of the ICA-OSGeo-Lab imitative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, i.e., GIS Certification Institute, GeoAcademy, ASPRS, and software vendors, i. e., Esri, Oracle, and RedHat. They focus different fields of expertise and have different levels and ways of examination which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, i.e., NCGIA Core Curriculum, URISA Body Of Knowledge, USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know", currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US American oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and influenced essentially the framework of certification. In addition to the theoretical analysis of existing resources the geospatial community was integrated twofold. An online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide. 15 interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with in total eleven sub-categories, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be flanked by proofs of professional career paths and achievements which need a peer qualification evaluation. After a couple of years a recertification is required. The concept seeks the accreditation by the OSGeo Foundation (and other bodies) and international support by a group of geospatial scientific institutions to achieve wide and international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model is examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.
RLINE: A Line Source Dispersion Model for Near-Surface Releases
This paper describes the formulation and evaluation of RLINE, a Research LINE source model for near surface releases. The model is designed to simulate mobile source pollutant dispersion to support the assessment of human exposures in near-roadway environments where a significant...
Teh, K K; Ng, E S; Choon, D S K
2009-08-01
This cadaveric study evaluates the margin of safety and technical efficacy of mini open carpal tunnel release performed using Knifelight (Stryker Instruments) through a transverse 1 cm wrist incision. A single investigator released 32 wrists in 17 cadavers. The wrists were then explored to assess the completeness of release and damage to vital structures including the superficial palmar arch, palmar cutaneous branch and recurrent branch of the median nerve. All the releases were complete and no injury to the median nerve and other structures were observed. The mean distance of the recurrent motor branch to the ligamentous divisions was 5.7 +/- 2.4 mm, superficial palmar arch was 8.7 +/- 3.1 mm and palmar cutaneous branch to the ligamentous division was 7.2 +/- 2.4 mm. The mean length of the transverse carpal ligament was 29.3 +/- 3.7 mm. Guyon's canal was preserved in all cases.
ERIC Educational Resources Information Center
Guhlin, Miguel
2007-01-01
Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…
Functional Consequence of Distal Brachioradialis Tendon Release: A Biomechanical Study
Tirrell, Timothy F.; Franko, Orrin I.; Bhola, Siddharth; Hentzen, Eric R.; Abrams, Reid A.; Lieber, Richard L.
2013-01-01
Purpose Open reduction and internal fixation of distal radius fractures often necessitates release of the brachioradialis from the radial styloid. However, this common procedure has the potential to decrease elbow flexion strength. To determine the potential morbidity associated with brachioradialis release, we measured the change in elbow torque as a function of incremental release of the brachioradialis insertion footprint. Methods In 5 upper extremity cadaveric specimens, the brachioradialis tendon was systematically released from the radius, and the resultant effect on brachioradialis elbow flexion torque was measured. Release distance was defined as the distance between the release point and the tip of the radial styloid. Results Brachioradialis elbow flexion torque dropped to 95%, 90% and 86% of its original value at release distances of 27mm, 46mm, and 52mm, respectively. Importantly, brachioradialis torque remained above 80% of its original value at release distances up to 7 centimeters. Conclusions Our data demonstrate that release of the brachioradialis tendon from its insertion has minor effects on its ability to transmit force to the distal radius. Clinical Relevance These data may imply that release of the distal brachioradialis tendon during distal radius open reduction internal fixation can be performed without meaningful functional consequences to elbow flexion torque. Even at large release distances, overall elbow flexion torque loss after brachioradialis release would be expected to be less than 5% due to the much larger contributions of the biceps and brachialis. Use of the brachioradialis as a tendon transfer donor should not be limited by concerns of elbow flexion loss, and the tendon could be considered as an autograft donor. PMID:23528425
Epigenome data release: a participant-centered approach to privacy protection.
Dyke, Stephanie O M; Cheung, Warren A; Joly, Yann; Ammerpohl, Ole; Lutsik, Pavlo; Rothstein, Mark A; Caron, Maxime; Busche, Stephan; Bourque, Guillaume; Rönnblom, Lars; Flicek, Paul; Beck, Stephan; Hirst, Martin; Stunnenberg, Henk; Siebert, Reiner; Walter, Jörn; Pastinen, Tomi
2015-07-17
Large-scale epigenome mapping by the NIH Roadmap Epigenomics Project, the ENCODE Consortium and the International Human Epigenome Consortium (IHEC) produces genome-wide DNA methylation data at one base-pair resolution. We examine how such data can be made open-access while balancing appropriate interpretation and genomic privacy. We propose guidelines for data release that both reduce ambiguity in the interpretation of open-access data and limit immediate access to genetic variation data that are made available through controlled access.
Bring out your codes! Bring out your codes! (Increasing Software Visibility and Re-use)
NASA Astrophysics Data System (ADS)
Allen, A.; Berriman, B.; Brunner, R.; Burger, D.; DuPrie, K.; Hanisch, R. J.; Mann, R.; Mink, J.; Sandin, C.; Shortridge, K.; Teuben, P.
2013-10-01
Progress is being made in code discoverability and preservation, but as discussed at ADASS XXI, many codes still remain hidden from public view. With the Astrophysics Source Code Library (ASCL) now indexed by the SAO/NASA Astrophysics Data System (ADS), the introduction of a new journal, Astronomy & Computing, focused on astrophysics software, and the increasing success of education efforts such as Software Carpentry and SciCoder, the community has the opportunity to set a higher standard for its science by encouraging the release of software for examination and possible reuse. We assembled representatives of the community to present issues inhibiting code release and sought suggestions for tackling these factors. The session began with brief statements by panelists; the floor was then opened for discussion and ideas. Comments covered a diverse range of related topics and points of view, with apparent support for the propositions that algorithms should be readily available, code used to produce published scientific results should be made available, and there should be discovery mechanisms to allow these to be found easily. With increased use of resources such as GitHub (for code availability), ASCL (for code discovery), and a stated strong preference from the new journal Astronomy & Computing for code release, we expect to see additional progress over the next few years.
Cortes-Hernandez, Paulina
2017-01-01
Periplasmic Binding Proteins (PBPs) trap nutrients for their internalization into bacteria by ABC transporters. Ligand binding triggers PBP closure by bringing its two domains together like a Venus flytrap. The atomic determinants that control PBP opening and closure for nutrient capture and release are not known, although it is proposed that opening and ligand release occur while in contact with the ABC transporter for concurrent substrate translocation. In this paper we evaluated the effect of the isomerization of a conserved proline, located near the binding site, on the propensity of PBPs to open and close. ArgT/LAO from Salmonella typhimurium and HisJ from Escherichia coli were studied through molecular mechanics at two different temperatures: 300 and 323 K. Eight microseconds were simulated per protein to analyze protein opening and closure in the absence of the ABC transporter. We show that when the studied proline is in trans, closed empty LAO and HisJ can open. In contrast, with the proline in cis, opening transitions were much less frequent and characterized by smaller changes. The proline in trans also renders the open trap prone to close over a ligand. Our data suggest that the isomerization of this conserved proline modulates the PBP mechanism: the proline in trans allows the exploration of conformational space to produce trap opening and closure, while in cis it restricts PBP movement and could limit ligand release until in productive contact with the ABC transporter. This is the first time that a proline isomerization has been related to the control of a large conformational change like the PBP flytrap mechanism. PMID:29190818
NASA Astrophysics Data System (ADS)
Banks, Michael
2012-08-01
The UK government has "widely accepted" the recommendations of a major report into open-access publishing that was released in June by a 15-strong working group led by the British sociologist Janet Finch.
When Free Isn't Free: The Realities of Running Open Source in School
ERIC Educational Resources Information Center
Derringer, Pam
2009-01-01
Despite the last few years' growth in awareness of open-source software in schools and the potential savings it represents, its widespread adoption is still hampered. Randy Orwin, technology director of the Bainbridge Island School District in Washington State and a strong open-source advocate, cautions that installing an open-source…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kyriakou, I., E-mail: ikyriak@cc.uoi.gr; Šefl, M.; Department of Dosimetry and Application of Ionizing Radiation, Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University in Prague, 115 19 Prague
The most recent release of the open source and general purpose Geant4 Monte Carlo simulation toolkit (Geant4 10.2 release) contains a new set of physics models in the Geant4-DNA extension for improving the modelling of low-energy electron transport in liquid water (<10 keV). This includes updated electron cross sections for excitation, ionization, and elastic scattering. In the present work, the impact of these developments to track-structure calculations is examined for providing the first comprehensive comparison against the default physics models of Geant4-DNA. Significant differences with the default models are found for the average path length and penetration distance, as well asmore » for dose-point-kernels for electron energies below a few hundred eV. On the other hand, self-irradiation absorbed fractions for tissue-like volumes and low-energy electron sources (including some Auger emitters) reveal rather small differences (up to 15%) between these new and default Geant4-DNA models. The above findings indicate that the impact of the new developments will mainly affect those applications where the spatial pattern of interactions and energy deposition of very-low energy electrons play an important role such as, for example, the modelling of the chemical and biophysical stage of radiation damage to cells.« less
Li, Shou-Nan; Chang, Chin-Ta; Shih, Hui-Ya; Tang, Andy; Li, Alen; Chen, Yin-Yung
2003-01-01
A mobile extractive Fourier transform infrared (FTIR) spectrometer was successfully used to locate, identify, and quantify the "odor" sources inside the cleanroom of a semiconductor manufacturing plant. It was found that ozone (O(3)) gas with a peak concentration of 120 ppm was unexpectedly releasing from a headspace of a drain for transporting used ozonized water and that silicon tetrafluoride (SiF(4)) with a peak concentration of 3 ppm was off-gassed from silicon wafers after dry-etching processing. When the sources of the odors was pinpointed by the FTIR, engineering control measures were applied. For O(3) control, a water-sealed pipeline was added to prevent the O(3) gas (emitting from the ozonized water) from entering the mixing unit. A ventilation system also was applied to the mixing unit in case of O(3) release. For SiF(4) mitigation, before the wafer-out chamber was opened, N(2) gas with a flow rate of 150 L/min was used for 100 sec to purge the wafer-out chamber, and a vacuum system was simultaneously activated to pump away the purging N(2). The effectiveness of the control measures was assured by using the FTIR. In addition, the FTIR was used to monitor the potential hazardous gas emissions during preventative maintenance of the semiconductor manufacturing equipment.
78 FR 68502 - Notice of Open Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... U.S.-CHINA ECONOMIC AND SECURITY REVIEW COMMISSION Notice of Open Public Meeting AGENCY: U.S.-China Economic and Security Review Commission. ACTION: Notice of Official Public Release of the... recommendations of its 2013 Annual Report to Congress are open to the public. Topics To Be Discussed The...
Development of Virtual Blade Model for Modelling Helicopter Rotor Downwash in OpenFOAM
2013-12-01
UNCLASSIFIED Development of Virtual Blade Model for Modelling Helicopter Rotor Downwash in OpenFOAM Stefano Wahono Aerospace...Georgia Institute of Technology. The OpenFOAM predicted result was also shown to compare favourably with ANSYS Fluent predictions. RELEASE...UNCLASSIFIED Development of Virtual Blade Model for Modelling Helicopter Rotor Downwash in OpenFOAM Executive Summary The Infrared
Correlates to colonizations of new patches by translocated populations of bighorn sheep
Singer, F.J.; Moses, M.E.; Bellew, S.; Sloan, W.
2000-01-01
By 1950, bighorn sheep were extirpated from large areas of their range. Most extant populations of bighorn sheep (Ovis canadensis) in the Intermountain West consist of <100 individuals occurring in a fragmented distribution across the landscape. Dispersal and successful colonizations of unoccupied habitat patches has been rarely reported, and, in particular, translocated populations have been characterized by limited population growth and limited dispersal rates. Restoration of the species is greatly assisted by dispersal and successful colonization of new patches within a metapopulation structure versus the existing scenario of negligible dispersal and fragmented, small populations. We investigated the correlates for the rate of colonizations of 79 suitable, but unoccupied, patches by 31 translocated populations of bighorn sheep released into nearby patches of habitat. Population growth rates of bighorn sheep in the release patches were correlated to Ne of the founder group, and early contact with a second released population in a nearby release patch (logistic regression, p = 0.08). Largest population size of all extant released populations in 1994 was correlated to potential Ne of the founder group, the number of different source populations represented in the founder, and early contact with a second released population (p = 0.016). Dispersal rates were 100% higher in rams than ewes (p = 0.001). Successful colonizations of unoccupied patches (n = 24 of 79 were colonized) were associated with rapid growth rates in the released population, years since release, larger area of suitable habitat in the release patch, larger population sizes, and a seasonal migratory tendency in the released population (p = 0.05). Fewer water barriers, more open vegetation and more rugged, broken terrain in the intervening habitat were also associated with colonizations (p = <0.05). We concluded that high dispersal rates and rapid reoccupation of large areas could occur if bighorn sheep are placed in large patches of habitat with few barriers to movements to other patches and with no domestic sheep present. Many restorations in the past that did not meet these criteria may have contributed to an insular population structure of bighorn sheep with limited observations of dispersal.
Correlates to colonizations of new patches by translocated populations of bighorn sheep
Singer, F.J.; Moses, M.E.; Bellew, S.; Sloan, W.
2000-01-01
By 1950, bighorn sheep were extirpated from large areas of their range. Most extant populations of bighorn sheep (Ovis canadensis) in the Intermountain West consist of <100 individuals occurring in a fragmented distribution across the landscape. Dispersal and successful colonizations of unoccupied habitat patches has been rarely reported, and, in particular, translocated populations have been characterized by limited population growth and limited dispersal rates. Restoration of the species is greatly assisted by dispersal and successful colonization of new patches within a metapopulation structure versus the existing scenario of negligible dispersal and fragmented, small populations. We investigated the correlates for the rate of colonizations of 79 suitable, but unoccupied, patches by 31 translocated populations of bighorn sheep released into nearby patches of habitat. Population growth rates of bighorn sheep in the release patches were correlated to Ne of the founder group, and early contact with a second released population in a nearby release patch (logistic regression, p = 0.08). Largest population size of all extant released populations in 1994 was correlated to potential Ne of the founder group, the number of different source populations represented in the founder, and early contact with a second released population (p = 0.016). Dispersal rates were 100% higher in rams than ewes (p = 0.001). Successful colonizations of unoccupied patches (n = 24 of 79 were colonized) were associated with rapid growth rates in the released population, years since release, larger area of suitable habitat in the release patch, larger population sizes, and a seasonal migratory tendency in the released population (p = 0.05). Fewer water barriers, more open vegetation and more rugged, broken terrain in the intervening habitat were also associated with colonizations (p = <0.05). We concluded that high dispersal rates and rapid reoccupation of large areas could occur if bighorn sheep are placed in large patches of habitat with few barriers to movements to other patches and with no domestic sheep present. Many restorations in the past that did not meet these criteria may have contributed to an insular population structure of bighorn sheep with limited observations of dispersal.
Chromatin condensation during terminal erythropoiesis.
Zhao, Baobing; Yang, Jing; Ji, Peng
2016-09-02
Mammalian terminal erythropoiesis involves gradual but dramatic chromatin condensation steps that are essential for cell differentiation. Chromatin and nuclear condensation is followed by a unique enucleation process, which is believed to liberate more spaces for hemoglobin enrichment and enable the generation of a physically flexible mature red blood cell. Although these processes have been known for decades, the mechanisms are still unclear. Our recent study reveals an unexpected nuclear opening formation during mouse terminal erythropoiesis that requires caspase-3 activity. Major histones, except H2AZ, are partially released from the opening, which is important for chromatin condensation. Block of the nuclear opening through caspase inhibitor or knockdown of caspase-3 inhibits chromatin condensation and enucleation. We also demonstrate that nuclear opening and histone release are cell cycle regulated. These studies reveal a novel mechanism for chromatin condensation in mammalia terminal erythropoiesis.
OMPC: an Open-Source MATLAB®-to-Python Compiler
Jurica, Peter; van Leeuwen, Cees
2008-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
Transparent Global Seismic Hazard and Risk Assessment
NASA Astrophysics Data System (ADS)
Smolka, Anselm; Schneider, John; Pinho, Rui; Crowley, Helen
2013-04-01
Vulnerability to earthquakes is increasing, yet advanced reliable risk assessment tools and data are inaccessible to most, despite being a critical basis for managing risk. Also, there are few, if any, global standards that allow us to compare risk between various locations. The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange, and leverages the knowledge of leading experts for the benefit of society. Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. Guided by the needs and experiences of governments, companies and citizens at large, they work in continuous interaction with the wider community. A continuously expanding public-private partnership constitutes the GEM Foundation, which drives the collaborative GEM effort. An integrated and holistic approach to risk is key to GEM's risk assessment platform, OpenQuake, that integrates all above-mentioned contributions and will become available towards the end of 2014. Stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and to share their findings for joint learning. Homogenized information on hazard can be combined with data on exposure (buildings, population) and data on their vulnerability, for loss assessment around the globe. Furthermore, for a true integrated view of seismic risk, users can add social vulnerability and resilience indices to maps and estimate the costs and benefits of different risk management measures. The following global data, models and methodologies will be available in the platform. Some of these will be released to the public already before, such as the ISC-GEM global instrumental catalogue (released January 2013). Datasets: • Global Earthquake History Catalogue [1000-1903] • Global Instrumental Catalogue [1900-2009] • Global Geodetic Strain Rate Model • Global Active Fault Database • Tectonic Regionalisation • Buildings and Population Database • Earthquake Consequences Database • Physical Vulnerability Database • Socio-Economic Vulnerability and Resilience Indicators Models: • Seismic Source Models • Ground Motion (Attenuation) Models • Physical Exposure Models • Physical Vulnerability Models • Composite Index Models (social vulnerability, resilience, indirect loss) The aforementioned models developed under the GEM framework will be combined to produce estimates of hazard and risk at a global scale. Furthermore, building on many ongoing efforts and knowledge of scientists worldwide, GEM will integrate state-of-the-art data, models, results and open-source tools into a single platform that is to serve as a "clearinghouse" on seismic risk. The platform will continue to increase in value, in particular for use in local contexts, through contributions and collaborations with scientists and organisations worldwide.
Using stable isotope systematics and trace metals to constrain the dispersion of fish farm pollution
NASA Astrophysics Data System (ADS)
Torchinsky, A.; Shiel, A. E.; Price, M.; Weis, D. A.
2010-12-01
Fish farming is a growing industry of great economic importance to coastal communities. Unfortunately, open-net fish farming is associated with the release of organic and metal pollution, which has the potential to adversely affect the coastal marine environment. The dispersion of fish farm pollution and its environmental impact are not well understood/quantified. Pollutants released by fish farms include organic products such as uneaten feed pellets and fish feces, as well as chemicals and pharmaceuticals, all of which may enter marine ecosystems. In this study, we took advantage of bioaccumulation in passive suspension feeding Manila Clams collected at varying distances from an open-net salmon farm located in the Discovery Islands of British Columbia. Measurements of stable C and N isotopes, as well as trace metal concentrations, in the clams were used to investigate the spread of pollutants by detecting the presence of fish farm waste in the clams’ diet. Lead isotopic measurements were used to identify other significant anthropogenic pollution sources, which may impact the study area. Clams located within the areal extent of waste discharged by a fish farm are expected to exhibit anomalous light stable isotope ratios and metal concentrations, reflecting the presence of pollutants accumulated directly from seawater and from their diet. Clams were collected in the Discovery Islands from three sites in the Octopus Islands, located 850 m, 2100 m and 3000 m north of the Cyrus Rocks salmon farm (near Quadra Island) and from a reference site on Penn Island. Light stable isotope ratios (δN = ~10‰, with little variation between sites, and δC from -14.5 to -17.3‰) of the clams suggest that the most distal site (i.e., 3000 m away) is most impacted by organic fish farm waste (i.e., food pellets and feces) and that contributions of organic waste actually decrease closer to the farm. Not surprisingly, the smallest contribution of organic waste was detected in clams from the reference site. It is thought that resuspension of particulate waste could be responsible for concentrating waste far from the fish farm. No pattern was observed in the trace metal concentration measurements (Cu = 4.11 - 9.64 ppm, Zn 40.0 - 107 ppm and Pb 0.008 - 0.086 ppm) of the clams suggesting differences in the dispersion of metal contaminants and organic waste. Lead isotope ratios (1.14874 to 1.74100 for 206Pb /207Pb and 2.07579 to 2.10615 for 208Pb /206Pb) indicate the importance of anthropogenic Pb sources in the study area (i.e., unleaded gasoline and diesel fuel consumption and metal smelting), however, the anthropogenic Pb sources are unlikely to be associated with the open-net salmon farm. Waste dispersion from open-net fish farms is complicated by physical oceanographic conditions, which characterize individual study areas, this must be taken into account when interpreting results and designing future studies.
ERIC Educational Resources Information Center
Villano, Matt
2006-01-01
Increasingly, colleges and universities are turning to open source as a way to meet their technology infrastructure and application needs. Open source has changed life for visionary CIOs and their campus communities nationwide. The author discusses what these technologists see as the benefits--and the considerations.
Effects of rogue ryanodine receptors on Ca2+ sparks in cardiac myocytes
Chen, Xudong; Feng, Yundi; Tan, Wenchang
2018-01-01
Ca2+ sparks and Ca2+ quarks, arising from clustered and rogue ryanodine receptors (RyRs), are significant Ca2+ release events from the junctional sarcoplasmic reticulum (JSR). Based on the anomalous subdiffusion of Ca2+ in the cytoplasm, a mathematical model was developed to investigate the effects of rogue RyRs on Ca2+ sparks in cardiac myocytes. Ca2+ quarks and sparks from the stochastic opening of rogue and clustered RyRs are numerically reproduced and agree with experimental measurements. It is found that the stochastic opening Ca2+ release units (CRUs) of clustered RyRs are regulated by free Ca2+ concentration in the JSR lumen (i.e. [Ca2+]lumen). The frequency of spontaneous Ca2+ sparks is remarkably increased by the rogue RyRs opening at high [Ca2+]lumen, but not at low [Ca2+]lumen. Hence, the opening of rogue RyRs contributes to the formation of Ca2+ sparks at high [Ca2+]lumen. The interplay of Ca2+ sparks and Ca2+ quarks has been discussed in detail. This work is of significance to provide insight into understanding Ca2+ release mechanisms in cardiac myocytes. PMID:29515864
Effects of rogue ryanodine receptors on Ca2+ sparks in cardiac myocytes.
Chen, Xudong; Feng, Yundi; Huo, Yunlong; Tan, Wenchang
2018-02-01
Ca 2+ sparks and Ca 2+ quarks, arising from clustered and rogue ryanodine receptors (RyRs), are significant Ca 2+ release events from the junctional sarcoplasmic reticulum (JSR). Based on the anomalous subdiffusion of Ca 2+ in the cytoplasm, a mathematical model was developed to investigate the effects of rogue RyRs on Ca 2+ sparks in cardiac myocytes. Ca 2+ quarks and sparks from the stochastic opening of rogue and clustered RyRs are numerically reproduced and agree with experimental measurements. It is found that the stochastic opening Ca 2+ release units (CRUs) of clustered RyRs are regulated by free Ca 2+ concentration in the JSR lumen (i.e. [Ca 2+ ] lumen ). The frequency of spontaneous Ca 2+ sparks is remarkably increased by the rogue RyRs opening at high [Ca 2+ ] lumen , but not at low [Ca 2+ ] lumen . Hence, the opening of rogue RyRs contributes to the formation of Ca 2+ sparks at high [Ca 2+ ] lumen . The interplay of Ca 2+ sparks and Ca 2+ quarks has been discussed in detail. This work is of significance to provide insight into understanding Ca 2+ release mechanisms in cardiac myocytes.
Releasing the cohesin ring: A rigid scaffold model for opening the DNA exit gate by Pds5 and Wapl.
Ouyang, Zhuqing; Yu, Hongtao
2017-04-01
The ring-shaped ATPase machine, cohesin, regulates sister chromatid cohesion, transcription, and DNA repair by topologically entrapping DNA. Here, we propose a rigid scaffold model to explain how the cohesin regulators Pds5 and Wapl release cohesin from chromosomes. Recent studies have established the Smc3-Scc1 interface as the DNA exit gate of cohesin, revealed a requirement for ATP hydrolysis in ring opening, suggested regulation of the cohesin ATPase activity by DNA and Smc3 acetylation, and provided insights into how Pds5 and Wapl open this exit gate. We hypothesize that Pds5, Wapl, and SA1/2 form a rigid scaffold that docks on Scc1 and anchors the N-terminal domain of Scc1 (Scc1N) to the Smc1 ATPase head. Relative movements between the Smc1-3 ATPase heads driven by ATP and Wapl disrupt the Smc3-Scc1 interface. Pds5 binds the dissociated Scc1N and prolongs this open state of cohesin, releasing DNA. We review the evidence supporting this model and suggest experiments that can further test its key principles. © 2017 WILEY Periodicals, Inc.
Experiences Supporting the Lunar Reconnaissance Orbiter Camera: the Devops Model
NASA Astrophysics Data System (ADS)
Licht, A.; Estes, N. M.; Bowman-Cisnesros, E.; Hanger, C. D.
2013-12-01
Introduction: The Lunar Reconnaissance Orbiter Camera (LROC) Science Operations Center (SOC) is responsible for instrument targeting, product processing, and archiving [1]. The LROC SOC maintains over 1,000,000 observations with over 300 TB of released data. Processing challenges compound with the acquisition of over 400 Gbits of observations daily creating the need for a robust, efficient, and reliable suite of specialized software. Development Environment: The LROC SOC's software development methodology has evolved over time. Today, the development team operates in close cooperation with the systems administration team in a model known in the IT industry as DevOps. The DevOps model enables a highly productive development environment that facilitates accomplishment of key goals within tight schedules[2]. The LROC SOC DevOps model incorporates industry best practices including prototyping, continuous integration, unit testing, code coverage analysis, version control, and utilizing existing open source software. Scientists and researchers at LROC often prototype algorithms and scripts in a high-level language such as MATLAB or IDL. After the prototype is functionally complete the solution is implemented as production ready software by the developers. Following this process ensures that all controls and requirements set by the LROC SOC DevOps team are met. The LROC SOC also strives to enhance the efficiency of the operations staff by way of weekly presentations and informal mentoring. Many small scripting tasks are assigned to the cognizant operations personnel (end users), allowing for the DevOps team to focus on more complex and mission critical tasks. In addition to leveraging open source software the LROC SOC has also contributed to the open source community by releasing Lunaserv [3]. Findings: The DevOps software model very efficiently provides smooth software releases and maintains team momentum. Scientists prototyping their work has proven to be very efficient as developers do not need to spend time iterating over small changes. Instead, these changes are realized in early prototypes and implemented before the task is seen by developers. The development practices followed by the LROC SOC DevOps team help facilitate a high level of software quality that is necessary for LROC SOC operations. Application to the Scientific Community: There is no replacement for having software developed by professional developers. While it is beneficial for scientists to write software, this activity should be seen as prototyping, which is then made production ready by professional developers. When constructed properly, even a small development team has the ability to increase the rate of software development for a research group while creating more efficient, reliable, and maintainable products. This strategy allows scientists to accomplish more, focusing on teamwork, rather than software development, which may not be their primary focus. 1. Robinson et al. (2010) Space Sci. Rev. 150, 81-124 2. DeGrandis. (2011) Cutter IT Journal. Vol 24, No. 8, 34-39 3. Estes, N.M.; Hanger, C.D.; Licht, A.A.; Bowman-Cisneros, E.; Lunaserv Web Map Service: History, Implementation Details, Development, and Uses, http://adsabs.harvard.edu/abs/2013LPICo1719.2609E.
26 CFR 514.8 - Release of excess tax withheld at source.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) REGULATIONS UNDER TAX CONVENTIONS FRANCE Withholding of Tax § 514.8 Release of excess tax withheld at source... in France, the withholding agent shall release and pay over to the person from whom the tax was... resident of France, or, in the case of a corporation, the owner was a French corporation; and (d) A...
Stolzenberg, Sebastian; Li, Zheng; Quick, Matthias; Malinauskaite, Lina; Nissen, Poul; Weinstein, Harel; Javitch, Jonathan A.; Shi, Lei
2017-01-01
Neurotransmitter:sodium symporters (NSSs) terminate neurotransmission by the reuptake of released neurotransmitters. This active accumulation of substrate against its concentration gradient is driven by the transmembrane Na+ gradient and requires that the transporter traverses several conformational states. LeuT, a prokaryotic NSS homolog, has been crystallized in outward-open, outward-occluded, and inward-open states. Two crystal structures of another prokaryotic NSS homolog, the multihydrophobic amino acid transporter (MhsT) from Bacillus halodurans, have been resolved in novel inward-occluded states, with the extracellular vestibule closed and the intracellular portion of transmembrane segment 5 (TM5i) in either an unwound or a helical conformation. We have investigated the potential involvement of TM5i in binding and unbinding of Na2, i.e. the Na+ bound in the Na2 site, by carrying out comparative molecular dynamics simulations of the models derived from the two MhsT structures. We find that the helical TM5i conformation is associated with a higher propensity for Na2 release, which leads to the repositioning of the N terminus and transition to an inward-open state. By using comparative interaction network analysis, we also identify allosteric pathways connecting TM5i and the Na2 binding site to the extracellular and intracellular regions. Based on our combined computational and mutagenesis studies of MhsT and LeuT, we propose that TM5i plays a key role in Na2 binding and release associated with the conformational transition toward the inward-open state, a role that is likely to be shared across the NSS family. PMID:28320858
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-14
... contracts before commercial sources in the open market. The proposed rule amends FAR 8.002 as follows: The... requirements for supplies and services from commercial sources in the open market. The proposed FAR 8.004 would... subpart 8.6). (b) Commercial sources (including educational and non-profit institutions) in the open...
EarthCollab, building geoscience-centric implementations of the VIVO semantic software suite
NASA Astrophysics Data System (ADS)
Rowan, L. R.; Gross, M. B.; Mayernik, M. S.; Daniels, M. D.; Krafft, D. B.; Kahn, H. J.; Allison, J.; Snyder, C. B.; Johns, E. M.; Stott, D.
2017-12-01
EarthCollab, an EarthCube Building Block project, is extending an existing open-source semantic web application, VIVO, to enable the exchange of information about scientific researchers and resources across institutions. EarthCollab is a collaboration between UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy, The Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory, and Cornell University. VIVO has been implemented by more than 100 universities and research institutions to highlight research and institutional achievements. This presentation will discuss benefits and drawbacks of working with and extending open source software. Some extensions include plotting georeferenced objects on a map, a mobile-friendly theme, integration of faceting via Elasticsearch, extending the VIVO ontology to capture geoscience-centric objects and relationships, and the ability to cross-link between VIVO instances. Most implementations of VIVO gather information about a single organization. The EarthCollab project created VIVO extensions to enable cross-linking of VIVO instances to reduce the amount of duplicate information about the same people and scientific resources and to enable dynamic linking of related information across VIVO installations. As the list of customizations grows, so does the effort required to maintain compatibility between the EarthCollab forks and the main VIVO code. For example, dozens of libraries and dependencies were updated prior to the VIVO v1.10 release, which introduced conflicts in the EarthCollab cross-linking code. The cross-linking code has been developed to enable sharing of data across different versions of VIVO, however, using a JSON output schema standardized across versions. We will outline lessons learned in working with VIVO and its open source dependencies, which include Jena, Solr, Freemarker, and jQuery and discuss future work by EarthCollab, which includes refining the cross-linking VIVO capabilities by continued integration of persistent and unique identifiers to enable automated lookup and matching across institutional VIVOs.
2016-03-25
peptide hormones such as corticotropin-releasing hormone (CRH) and adrenocorticotropic hormone (ACTH) by circulating glucocorticoids such as cortisol (CORT...As for many other hormones such as gonadotropin- releasing hormone (GnRH), insulin, and growth hormone (GH), the ultradian release pattern of...therapy; GH: growth hormone ; GnRH: gonadotropin-releasing hormone ; GR: glucocorticoid receptors; MDD: major depressive disorder; hnRNA: heterogeneous
SCoT: a Python toolbox for EEG source connectivity.
Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R
2014-01-01
Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT-a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivtiy estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.
SCoT: a Python toolbox for EEG source connectivity
Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R.
2014-01-01
Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT—a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivtiy estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT. PMID:24653694
Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach
NASA Astrophysics Data System (ADS)
Schumacher, Thomas; Straub, Daniel; Higgins, Christopher
2012-09-01
Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezo-electric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
Evans, Nicholas G; Selgelid, Michael J
2015-08-01
In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.
Using R to implement spatial analysis in open source environment
NASA Astrophysics Data System (ADS)
Shao, Yixi; Chen, Dong; Zhao, Bo
2007-06-01
R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics which provides a wide variety of statistical and graphical techniques, and is highly extensible. In the Open Source environment it plays an important role in doing spatial analysis. So, to implement spatial analysis in the Open Source environment which we called the Open Source geocomputation is using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the Open Source GIS environment and emphasizes the role R plays in the aspect of spatial analysis. Furthermore, one apt illustration of the functions of R is given in this paper through the project of constructing CZPGIS (Cheng Zhou Population GIS) supported by Changzhou Government, China. In this project we use R to implement the geostatistics in the Open Source GIS environment to evaluate the spatial correlation of land price and estimate it by Kriging Interpolation. We also use R integrated with MapServer and php to show how R and other Open Source software cooperate with each other in WebGIS environment, which represents the advantages of using R to implement spatial analysis in Open Source GIS environment. And in the end, we points out that the packages for spatial analysis in R is still scattered and the limited memory is still a bottleneck when large sum of clients connect at the same time. Therefore further work is to group the extensive packages in order or design normative packages and make R cooperate better with other commercial software such as ArcIMS. Also we look forward to developing packages for land price evaluation.
Laparoscopic versus open-component separation: a comparative analysis in a porcine model.
Rosen, Michael J; Williams, Christina; Jin, Judy; McGee, Michael F; Schomisch, Steve; Marks, Jeffrey; Ponsky, Jeffrey
2007-09-01
The ideal surgical treatment for complicated ventral hernias remains elusive. Traditional component separation provides local advancement of native tissue for tension-free closure without prosthetic materials. This technique requires an extensive subcutaneous dissection with division of perforating vessels predisposing to skin-flap necrosis and complicated wound infections. A minimally invasive component separation may decrease wound complication rates; however, the adequacy of the myofascial advancement has not been studied. Five 25-kg pigs underwent bilateral laparoscopic component separation. A 10-mm incision was made lateral to the rectus abdominus muscle. The external oblique fascia was incised, and a dissecting balloon was inflated between the internal and external oblique muscles. Two additional ports were placed in the intermuscular space. The external oblique was incised from the costal margin to the inguinal ligament. The maximal abdominal wall advancement was recorded. A formal open-component separation was performed and maximal advancement 5 cm superior and 5 cm inferior to the umbilicus was recorded for comparison. Groups were compared using standard statistical analysis. The laparoscopic component separation was completed successfully in all animals, with a mean of 22 min/side. Laparoscopic component separation yielded 3.9 cm (SD 1.1) of fascial advancement above the umbilicus, whereas 4.4 cm (1.2) was obtained after open release (P = .24). Below the umbilicus, laparoscopic release achieved 5.0 cm (1.0) of advancement, whereas 5.8 cm (1.2) was gained after open release (P = .13). The minimally invasive component separation achieved an average of 86% of the myofascial advancement compared with a formal open release. The laparoscopic approach does not require extensive subcutaneous dissection and might theoretically result in a decreased incidence or decreased complexity of postoperative wound infections or skin-flap necrosis. Based on our preliminary data in this porcine model, further comparative studies of laparoscopic versus open component separation in complex ventral hernia repair is warranted to evaluate postoperative morbidity and long-term hernia recurrence rates.
HerMES: point source catalogues from Herschel-SPIRE observations II
NASA Astrophysics Data System (ADS)
Wang, L.; Viero, M.; Clarke, C.; Bock, J.; Buat, V.; Conley, A.; Farrah, D.; Guo, K.; Heinis, S.; Magdis, G.; Marchetti, L.; Marsden, G.; Norberg, P.; Oliver, S. J.; Page, M. J.; Roehlly, Y.; Roseboom, I. G.; Schulz, B.; Smith, A. J.; Vaccari, M.; Zemcov, M.
2014-11-01
The Herschel Multi-tiered Extragalactic Survey (HerMES) is the largest Guaranteed Time Key Programme on the Herschel Space Observatory. With a wedding cake survey strategy, it consists of nested fields with varying depth and area totalling ˜380 deg2. In this paper, we present deep point source catalogues extracted from Herschel-Spectral and Photometric Imaging Receiver (SPIRE) observations of all HerMES fields, except for the later addition of the 270 deg2 HerMES Large-Mode Survey (HeLMS) field. These catalogues constitute the second Data Release (DR2) made in 2013 October. A sub-set of these catalogues, which consists of bright sources extracted from Herschel-SPIRE observations completed by 2010 May 1 (covering ˜74 deg2) were released earlier in the first extensive data release in 2012 March. Two different methods are used to generate the point source catalogues, the SUSSEXTRACTOR point source extractor used in two earlier data releases (EDR and EDR2) and a new source detection and photometry method. The latter combines an iterative source detection algorithm, STARFINDER, and a De-blended SPIRE Photometry algorithm. We use end-to-end Herschel-SPIRE simulations with realistic number counts and clustering properties to characterize basic properties of the point source catalogues, such as the completeness, reliability, photometric and positional accuracy. Over 500 000 catalogue entries in HerMES fields (except HeLMS) are released to the public through the HeDAM (Herschel Database in Marseille) website (http://hedam.lam.fr/HerMES).
Phase III Early Restoration Meeting | NOAA Gulf Spill Restoration
Louisiana Mississippi Texas Region-wide Open Ocean Data Media & News Publications Press Releases Story programmatic approach to early restoration planning for Phase III and future early restoration plans. Open
Ciobanu, O
2009-01-01
The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open source software were used to prepare digitized 2D images of tissue sections and to create 3D reconstruction from the segmented structures. Finally, 3D images were used in open source software in order to perform biomechanic simulations. This study demonstrates the applicability and feasibility of open source software developed in our days for the 3D reconstruction and biomechanic simulation. The use of open source software may improve the efficiency of investments in imaging technologies and in CAD/CAM technologies for implants and prosthesis fabrication which need expensive specialized software.
Survey of Non-Rigid Registration Tools in Medicine.
Keszei, András P; Berkels, Benjamin; Deserno, Thomas M
2017-02-01
We catalogue available software solutions for non-rigid image registration to support scientists in selecting suitable tools for specific medical registration purposes. Registration tools were identified using non-systematic search in Pubmed, Web of Science, IEEE Xplore® Digital Library, Google Scholar, and through references in identified sources (n = 22). Exclusions are due to unavailability or inappropriateness. The remaining (n = 18) tools were classified by (i) access and technology, (ii) interfaces and application, (iii) living community, (iv) supported file formats, and (v) types of registration methodologies emphasizing the similarity measures implemented. Out of the 18 tools, (i) 12 are open source, 8 are released under a permissive free license, which imposes the least restrictions on the use and further development of the tool, 8 provide graphical processing unit (GPU) support; (ii) 7 are built on software platforms, 5 were developed for brain image registration; (iii) 6 are under active development but only 3 have had their last update in 2015 or 2016; (iv) 16 support the Analyze format, while 7 file formats can be read with only one of the tools; and (v) 6 provide multiple registration methods and 6 provide landmark-based registration methods. Based on open source, licensing, GPU support, active community, several file formats, algorithms, and similarity measures, the tools Elastics and Plastimatch are chosen for the platform ITK and without platform requirements, respectively. Researchers in medical image analysis already have a large choice of registration tools freely available. However, the most recently published algorithms may not be included in the tools, yet.
NASA Astrophysics Data System (ADS)
Topping, David; Barley, Mark; Bane, Michael K.; Higham, Nicholas; Aumont, Bernard; Dingle, Nicholas; McFiggans, Gordon
2016-03-01
In this paper we describe the development and application of a new web-based facility, UManSysProp (http://umansysprop.seaes.manchester.ac.uk), for automating predictions of molecular and atmospheric aerosol properties. Current facilities include pure component vapour pressures, critical properties, and sub-cooled densities of organic molecules; activity coefficient predictions for mixed inorganic-organic liquid systems; hygroscopic growth factors and CCN (cloud condensation nuclei) activation potential of mixed inorganic-organic aerosol particles; and absorptive partitioning calculations with/without a treatment of non-ideality. The aim of this new facility is to provide a single point of reference for all properties relevant to atmospheric aerosol that have been checked for applicability to atmospheric compounds where possible. The group contribution approach allows users to upload molecular information in the form of SMILES (Simplified Molecular Input Line Entry System) strings and UManSysProp will automatically extract the relevant information for calculations. Built using open-source chemical informatics, and hosted at the University of Manchester, the facilities are provided via a browser and device-friendly web interface, or can be accessed using the user's own code via a JSON API (application program interface). We also provide the source code for all predictive techniques provided on the site, covered by the GNU GPL (General Public License) license to encourage development of a user community. We have released this via a Github repository (doi:10.5281/zenodo.45143). In this paper we demonstrate its use with specific examples that can be simulated using the web-browser interface.
46 CFR 308.532 - Release of surety bond, Form MA-312.
Code of Federal Regulations, 2013 CFR
2013-10-01
....532 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.532 Release of surety bond... American War Risk Agency or MARAD. ...
46 CFR 308.532 - Release of surety bond, Form MA-312.
Code of Federal Regulations, 2010 CFR
2010-10-01
....532 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.532 Release of surety bond... American War Risk Agency or MARAD. ...
46 CFR 308.532 - Release of surety bond, Form MA-312.
Code of Federal Regulations, 2012 CFR
2012-10-01
....532 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.532 Release of surety bond... American War Risk Agency or MARAD. ...
46 CFR 308.532 - Release of surety bond, Form MA-312.
Code of Federal Regulations, 2011 CFR
2011-10-01
....532 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.532 Release of surety bond... American War Risk Agency or MARAD. ...
Boulos, Maged N Kamel; Honda, Kiyoshi
2006-01-01
Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
Rapid development of medical imaging tools with open-source libraries.
Caban, Jesus J; Joshi, Alark; Nagy, Paul
2007-11-01
Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.
Open-Source RTOS Space Qualification: An RTEMS Case Study
NASA Technical Reports Server (NTRS)
Zemerick, Scott
2017-01-01
NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors notably (1) The diverse nature of RTOSs utilized across NASA, (2) No single NASA space-qualification criteria, lack of verification and validation (V&V) analysis, or test beds, and (3) different RTOS heritages, specifically open-source RTOSs and closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort of the open source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case-study, can be utilized as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for NASA-agency wide usage. NASA's involvement in space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.
ERIC Educational Resources Information Center
Armbruster, Chris
2008-01-01
Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge…
Volcanic and atmospheric controls on ash iron solubility: A review
NASA Astrophysics Data System (ADS)
Ayris, Paul; Delmelle, Pierre
2012-01-01
The ash material produced by volcanic eruptions carries important information about the underground magma eruptive conditions and subsequent modifications in the volcanic plume and during atmospheric transport. Volcanic ash is also studied because of its impacts on the environment and human health. In particular, there is a growing interest from a multidisciplinary scientific community to understand the role that ash deposition over open ocean regions may play as a source of bioavailable Fe for phytoplankton production. Similar to aeolian mineral dust, the processes that affect the mineralogy and speciation of Fe in ash may promote solubilisation of Fe in ash, and thus may increase the amount of volcanic Fe supplied to ocean surface waters. Our knowledge of these controls is still very limited, a situation which has hindered quantitative interpretation of experimental Fe release measurements. In this review, we identify the key volcanic and atmospheric controls that are likely to modulate ash Fe solubility. We also briefly discuss existing data on Fe release from ash and make some recommendations for future studies in this area.
NASA Astrophysics Data System (ADS)
Tamura, Yoshinobu; Yamada, Shigeru
OSS (open source software) systems which serve as key components of critical infrastructures in our social life are still ever-expanding now. Especially, embedded OSS systems have been gaining a lot of attention in the embedded system area, i.e., Android, BusyBox, TRON, etc. However, the poor handling of quality problem and customer support prohibit the progress of embedded OSS. Also, it is difficult for developers to assess the reliability and portability of embedded OSS on a single-board computer. In this paper, we propose a method of software reliability assessment based on flexible hazard rates for the embedded OSS. Also, we analyze actual data of software failure-occurrence time-intervals to show numerical examples of software reliability assessment for the embedded OSS. Moreover, we compare the proposed hazard rate model for the embedded OSS with the typical conventional hazard rate models by using the comparison criteria of goodness-of-fit. Furthermore, we discuss the optimal software release problem for the porting-phase based on the total expected software maintenance cost.
A conduit dilation model of methane venting from lake sediments
Scandella, B.P.; Varadharajan, C.; Hemond, Harold F.; Ruppel, C.; Juanes, R.
2011-01-01
Methane is a potent greenhouse gas, but its effects on Earth's climate remain poorly constrained, in part due to uncertainties in global methane fluxes to the atmosphere. An important source of atmospheric methane is the methane generated in organic-rich sediments underlying surface water bodies, including lakes, wetlands, and the ocean. The fraction of the methane that reaches the atmosphere depends critically on the mode and spatiotemporal characteristics of free-gas venting from the underlying sediments. Here we propose that methane transport in lake sediments is controlled by dynamic conduits, which dilate and release gas as the falling hydrostatic pressure reduces the effective stress below the tensile strength of the sediments. We test our model against a four-month record of hydrostatic load and methane flux in Upper Mystic Lake, Mass., USA, and show that it captures the complex episodicity of methane ebullition. Our quantitative conceptualization opens the door to integrated modeling of methane transport to constrain global methane release from lakes and other shallow-water, organic-rich sediment systems, and to assess its climate feedbacks.
Learning from hackers: open-source clinical trials.
Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico
2012-05-02
Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.
Barista: A Framework for Concurrent Speech Processing by USC-SAIL
Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G.; Narayanan, Shrikanth S.
2016-01-01
We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista lets demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0. PMID:27610047
UpSetR: an R package for the visualization of intersecting sets and their properties
Conway, Jake R.; Lex, Alexander; Gehlenborg, Nils
2017-01-01
Abstract Motivation: Venn and Euler diagrams are a popular yet inadequate solution for quantitative visualization of set intersections. A scalable alternative to Venn and Euler diagrams for visualizing intersecting sets and their properties is needed. Results: We developed UpSetR, an open source R package that employs a scalable matrix-based visualization to show intersections of sets, their size, and other properties. Availability and implementation: UpSetR is available at https://github.com/hms-dbmi/UpSetR/ and released under the MIT License. A Shiny app is available at https://gehlenborglab.shinyapps.io/upsetr/. Contact: nils@hms.harvard.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28645171
Jossinet, Fabrice; Ludwig, Thomas E; Westhof, Eric
2010-08-15
Assemble is an intuitive graphical interface to analyze, manipulate and build complex 3D RNA architectures. It provides several advanced and unique features within the framework of a semi-automated modeling process that can be performed by homology and ab initio with or without electron density maps. Those include the interactive editing of a secondary structure and a searchable, embedded library of annotated tertiary structures. Assemble helps users with performing recurrent and otherwise tedious tasks in structural RNA research. Assemble is released under an open-source license (MIT license) and is freely available at http://bioinformatics.org/assemble. It is implemented in the Java language and runs on MacOSX, Linux and Windows operating systems.
Barista: A Framework for Concurrent Speech Processing by USC-SAIL.
Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G; Narayanan, Shrikanth S
2014-05-01
We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista lets demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0.
NASA Astrophysics Data System (ADS)
Kopka, Piotr; Wawrzynczak, Anna; Borysiewicz, Mieczyslaw
2016-11-01
In this paper the Bayesian methodology, known as Approximate Bayesian Computation (ABC), is applied to the problem of the atmospheric contamination source identification. The algorithm input data are on-line arriving concentrations of the released substance registered by the distributed sensors network. This paper presents the Sequential ABC algorithm in detail and tests its efficiency in estimation of probabilistic distributions of atmospheric release parameters of a mobile contamination source. The developed algorithms are tested using the data from Over-Land Atmospheric Diffusion (OLAD) field tracer experiment. The paper demonstrates estimation of seven parameters characterizing the contamination source, i.e.: contamination source starting position (x,y), the direction of the motion of the source (d), its velocity (v), release rate (q), start time of release (ts) and its duration (td). The online-arriving new concentrations dynamically update the probability distributions of search parameters. The atmospheric dispersion Second-order Closure Integrated PUFF (SCIPUFF) Model is used as the forward model to predict the concentrations at the sensors locations.
Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures as evaluation basis were specified, and the systems were selected based a set of metric outcomes using Integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR software can provide better basis on ranking score records than other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
Benchmarking for Bayesian Reinforcement Learning
Ernst, Damien; Couëtoux, Adrien
2016-01-01
In the Bayesian Reinforcement Learning (BRL) setting, agents try to maximise the collected rewards while interacting with their environment while using some prior knowledge that is accessed beforehand. Many BRL algorithms have already been proposed, but the benchmarks used to compare them are only relevant for specific cases. The paper addresses this problem, and provides a new BRL comparison methodology along with the corresponding open source library. In this methodology, a comparison criterion that measures the performance of algorithms on large sets of Markov Decision Processes (MDPs) drawn from some probability distributions is defined. In order to enable the comparison of non-anytime algorithms, our methodology also includes a detailed analysis of the computation time requirement of each algorithm. Our library is released with all source code and documentation: it includes three test problems, each of which has two different prior distributions, and seven state-of-the-art RL algorithms. Finally, our library is illustrated by comparing all the available algorithms and the results are discussed. PMID:27304891
PolNet: A Tool to Quantify Network-Level Cell Polarity and Blood Flow in Vascular Remodeling.
Bernabeu, Miguel O; Jones, Martin L; Nash, Rupert W; Pezzarossa, Anna; Coveney, Peter V; Gerhardt, Holger; Franco, Claudio A
2018-05-08
In this article, we present PolNet, an open-source software tool for the study of blood flow and cell-level biological activity during vessel morphogenesis. We provide an image acquisition, segmentation, and analysis protocol to quantify endothelial cell polarity in entire in vivo vascular networks. In combination, we use computational fluid dynamics to characterize the hemodynamics of the vascular networks under study. The tool enables, to our knowledge for the first time, a network-level analysis of polarity and flow for individual endothelial cells. To date, PolNet has proven invaluable for the study of endothelial cell polarization and migration during vascular patterning, as demonstrated by two recent publications. Additionally, the tool can be easily extended to correlate blood flow with other experimental observations at the cellular/molecular level. We release the source code of our tool under the Lesser General Public License. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Benchmarking for Bayesian Reinforcement Learning.
Castronovo, Michael; Ernst, Damien; Couëtoux, Adrien; Fonteneau, Raphael
2016-01-01
In the Bayesian Reinforcement Learning (BRL) setting, agents try to maximise the collected rewards while interacting with their environment while using some prior knowledge that is accessed beforehand. Many BRL algorithms have already been proposed, but the benchmarks used to compare them are only relevant for specific cases. The paper addresses this problem, and provides a new BRL comparison methodology along with the corresponding open source library. In this methodology, a comparison criterion that measures the performance of algorithms on large sets of Markov Decision Processes (MDPs) drawn from some probability distributions is defined. In order to enable the comparison of non-anytime algorithms, our methodology also includes a detailed analysis of the computation time requirement of each algorithm. Our library is released with all source code and documentation: it includes three test problems, each of which has two different prior distributions, and seven state-of-the-art RL algorithms. Finally, our library is illustrated by comparing all the available algorithms and the results are discussed.
Cluster-lensing: A Python Package for Galaxy Clusters and Miscentering
NASA Astrophysics Data System (ADS)
Ford, Jes; VanderPlas, Jake
2016-12-01
We describe a new open source package for calculating properties of galaxy clusters, including Navarro, Frenk, and White halo profiles with and without the effects of cluster miscentering. This pure-Python package, cluster-lensing, provides well-documented and easy-to-use classes and functions for calculating cluster scaling relations, including mass-richness and mass-concentration relations from the literature, as well as the surface mass density {{Σ }}(R) and differential surface mass density {{Δ }}{{Σ }}(R) profiles, probed by weak lensing magnification and shear. Galaxy cluster miscentering is especially a concern for stacked weak lensing shear studies of galaxy clusters, where offsets between the assumed and the true underlying matter distribution can lead to a significant bias in the mass estimates if not accounted for. This software has been developed and released in a public GitHub repository, and is licensed under the permissive MIT license. The cluster-lensing package is archived on Zenodo. Full documentation, source code, and installation instructions are available at http://jesford.github.io/cluster-lensing/.
OMPC: an Open-Source MATLAB-to-Python Compiler.
Jurica, Peter; van Leeuwen, Cees
2009-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB((R)), the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB((R))-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB((R)) functions into Python programs. The imported MATLAB((R)) modules will run independently of MATLAB((R)), relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB((R)). OMPC is available at http://ompc.juricap.com.
Wang, Yang; Zhang, Xiao-jian; Chen, Chao; Pan, An-jun; Xu, Yang; Liao, Ping-an; Zhang, Su-xia; Gu, Jun-nong
2009-12-01
Red water phenomenon occurred in some communities of a city in China after water source switch in recent days. The origin of this red water problem and mechanism of iron release were investigated in the study. Water quality of local and new water sources was tested and tap water quality in suffered area had been monitored for 3 months since red water occurred. Interior corrosion scales on the pipe which was obtained from the suffered area were analyzed by XRD, SEM, and EDS. Corrosion rates of cast iron under the conditions of two source water were obtained by Annular Reactor. The influence of different source water on iron release was studied by pipe section reactor to simulate the distribution systems. The results indicated that large increase of sulfate concentration by water source shift was regarded as the cause of red water problem. The Larson ratio increased from about 0.4 to 1.7-1.9 and the red water problem happened in the taps of some urban communities just several days after the new water source was applied. The mechanism of iron release was concluded that the stable shell of scales in the pipes had been corrupted by this kind of high-sulfate-concentration source water and it was hard to recover soon spontaneously. The effect of sulfate on iron release of the old cast iron was more significant than its effect on enhancing iron corrosion. The rate of iron release increased with increasing Larson ratio, and the correlation of them was nonlinear on the old cast-iron. The problem remained quite a long time even if the water source re-shifted into the blended one with only small ratio of the new source and the Larson ratio reduced to about 0.6.
Xu, Junnan; Song, Dan; Bai, Qiufang; Zhou, Lijun; Cai, Liping; Hertz, Leif; Peng, Liang
2014-01-13
This study investigates the role of glycogenolysis in stimulated release of ATP as a transmitter from astrocytes. Within the last 20 years our understanding of brain glycogenolysis has changed from it being a relatively uninteresting process to being a driving force for essential brain functions like production of transmitter glutamate and homoeostasis of potassium ions (K+) after their release from excited neurons. Simultaneously, the importance of astrocytic handling of adenosine, its phosphorylation to ATP and release of some astrocytic ATP, located in vesicles, as an important transmitter has also become to be realized. Among the procedures stimulating Ca2+-dependent release of vesicular ATP are exposure to such transmitters as glutamate and adenosine, which raise intra-astrocytic Ca2+ concentration, or increase of extracellular K+ to a depolarizing level that opens astrocytic L-channels for Ca2+ and thereby also increase intra-astrocytic Ca2+ concentration, a prerequisite for glycogenolysis. The present study has confirmed and quantitated stimulated ATP release from well differentiated astrocyte cultures by glutamate, adenosine or elevated extracellular K+ concentrations, measured by a luciferin/luciferase reaction. It has also shown that this release is virtually abolished by an inhibitor of glycogenolysis as well as by inhibitors of transmitter-mediated signaling or of L-channel opening by elevated K+ concentrations.
Hybrid cloud and cluster computing paradigms for life science applications
2010-01-01
Background Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing especially for parallel data intensive applications. However they have limited applicability to some areas such as data mining because MapReduce has poor performance on problems with an iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open source Iterative MapReduce system Twister. Results Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability comparisons in several important non iterative cases. These are linked to MPI applications for final stages of the data analysis. Further we have released the open source Twister Iterative MapReduce and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. Conclusions The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment while Twister promises a uniform programming environment for many Life Sciences applications. Methods We used commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments. PMID:21210982
Hybrid cloud and cluster computing paradigms for life science applications.
Qiu, Judy; Ekanayake, Jaliya; Gunarathne, Thilina; Choi, Jong Youl; Bae, Seung-Hee; Li, Hui; Zhang, Bingjing; Wu, Tak-Lon; Ruan, Yang; Ekanayake, Saliya; Hughes, Adam; Fox, Geoffrey
2010-12-21
Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing especially for parallel data intensive applications. However they have limited applicability to some areas such as data mining because MapReduce has poor performance on problems with an iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open source Iterative MapReduce system Twister. Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability comparisons in several important non iterative cases. These are linked to MPI applications for final stages of the data analysis. Further we have released the open source Twister Iterative MapReduce and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment while Twister promises a uniform programming environment for many Life Sciences applications. We used commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments.
Rothman, Jason S.; Silver, R. Angus
2018-01-01
Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
NASA Astrophysics Data System (ADS)
Zhao, L.; Landi, E.; Lepri, S. T.; Kocher, M.; Zurbuchen, T. H.; Fisk, L. A.; Raines, J. M.
2017-01-01
In this paper, we study a subset of slow solar winds characterized by an anomalous charge state composition and ion temperatures compared to average solar wind distributions, and thus referred to as an “Outlier” wind. We find that although this wind is slower and denser than normal slow wind, it is accelerated from the same source regions (active regions and quiet-Sun regions) as the latter and its occurrence rate depends on the solar cycle. The defining property of the Outlier wind is that its charge state composition is the same as that of normal slow wind, with the only exception being a very large decrease in the abundance of fully charged species (He2+, C6+, N7+, O8+, Mg12+), resulting in a significant depletion of the He and C element abundances. Based on these observations, we suggest three possible scenarios for the origin of this wind: (1) local magnetic waves preferentially accelerating non-fully stripped ions over fully stripped ions from a loop opened by reconnection; (2) depleted fully stripped ions already contained in the corona magnetic loops before they are opened up by reconnection; or (3) fully stripped ions depleted by Coulomb collision after magnetic reconnection in the solar corona. If any one of these three scenarios is confirmed, the Outlier wind represents a direct signature of slow wind release through magnetic reconnection.
The validity of open-source data when assessing jail suicides.
Thomas, Amanda L; Scott, Jacqueline; Mellow, Jeff
2018-05-09
The Bureau of Justice Statistics' Deaths in Custody Reporting Program is the primary source for jail suicide research, though the data is restricted from general dissemination. This study is the first to examine whether jail suicide data obtained from publicly available sources can help inform our understanding of this serious public health problem. Of the 304 suicides that were reported through the DCRP in 2009, roughly 56 percent (N = 170) of those suicides were identified through the open-source search protocol. Each of the sources was assessed based on how much information was collected on the incident and the types of variables available. A descriptive analysis was then conducted on the variables that were present in both data sources. The four variables present in each data source were: (1) demographic characteristics of the victim, (2) the location of occurrence within the facility, (3) the location of occurrence by state, and (4) the size of the facility. Findings demonstrate that the prevalence and correlates of jail suicides are extremely similar in both open-source and official data. However, for almost every variable measured, open-source data captured as much information as official data did, if not more. Further, variables not found in official data were identified in the open-source database, thus allowing researchers to have a more nuanced understanding of the situational characteristics of the event. This research provides support for the argument in favor of including open-source data in jail suicide research as it illustrates how open-source data can be used to provide additional information not originally found in official data. In sum, this research is vital in terms of possible suicide prevention, which may be directly linked to being able to manipulate environmental factors.
Update to An Inventory of Sources and Environmental ...
In 2006, EPA published an inventory of sources and environmental releases of dioxin-like compounds in the United States. This draft report presents an update and revision to that dioxin source inventory. It also presents updated estimates of environmental releases of dioxin-like compounds to the air, water, land and products. The sources are grouped into five broad categories: combustion sources, metals smelting/refining, chemical manufacturing, natural sources, and environmental reservoirs. Estimates of annual releases to land, air, and water are presented for reference years 1987, 1995, and 2000. While the overall decreasing trend in emissions seen in the original report continues, the individual dioxin releases in this draft updated report are generally higher than the values reported in 2006. This is largely due to the inclusion (in all three years) of additional sources in the quantitative inventory that were not included in the 2006 report. The largest new source included in this draft updated inventory was forest fires. In the 2006 report, this was classified as preliminary and not included in the quantitative inventory. The top three air sources of dioxin emissions in 2000 were forest fires, backyard burning of trash, and medical waste incinerators. The Report Presents An Update To The Dioxin Source Inventory Published In 2006 (U.S. Epa, 2006). The Peer-Review Panel For The 2006 Document Provided Additional Comments After The Final Report Had
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kocbach, Anette; Herseth, Jan Inge; Lag, Marit
2008-10-15
The inflammatory potential of particles from wood smoke and traffic has not been well elucidated. In this study, a contact co-culture of monocytes and pneumocytes was exposed to 10-40 {mu}g/cm{sup 2} of particles from wood smoke and traffic for 12, 40 and 64 h to determine their influence on pro-inflammatory cytokine release (TNF-{alpha}, IL-1, IL-6, IL-8) and viability. To investigate the role of organic constituents in cytokine release the response to particles, their organic extracts and the washed particles were compared. Antagonists were used to investigate source-dependent differences in intercellular signalling (TNF-{alpha}, IL-1). The cytotoxicity was low after exposure tomore » particles from both sources. However, wood smoke, and to a lesser degree traffic-derived particles, induced a reduction in cell number, which was associated with the organic fraction. The release of pro-inflammatory cytokines was similar for both sources after 12 h, but traffic induced a greater release than wood smoke particles with increasing exposure time. The organic fraction accounted for the majority of the cytokine release induced by wood smoke, whereas the washed traffic particles induced a stronger response than the corresponding organic extract. TNF-{alpha} and IL-1 antagonists reduced the release of IL-8 induced by particles from both sources. In contrast, the IL-6 release was only reduced by the IL-1 antagonist during exposure to traffic-derived particles. In summary, particles from wood smoke and traffic induced differential pro-inflammatory response patterns with respect to cytokine release and cell number. Moreover, the influence of the organic particle fraction and intercellular signalling on the pro-inflammatory response seemed to be source-dependent.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lei; Yu, Cong, E-mail: muduri@shao.ac.cn, E-mail: cyu@ynao.ac.cn
2014-04-01
We propose a catastrophic eruption model for the enormous energy release of magnetars during giant flares, in which a toroidal and helically twisted flux rope is embedded within a force-free magnetosphere. The flux rope stays in stable equilibrium states initially and evolves quasi-statically. Upon the loss of equilibrium, the flux rope cannot sustain the stable equilibrium states and erupts catastrophically. During the process, the magnetic energy stored in the magnetosphere is rapidly released as the result of destabilization of global magnetic topology. The magnetospheric energy that could be accumulated is of vital importance for the outbursts of magnetars. We carefullymore » establish the fully open fields and partially open fields for various boundary conditions at the magnetar surface and study the relevant energy thresholds. By investigating the magnetic energy accumulated at the critical catastrophic point, we find that it is possible to drive fully open eruptions for dipole-dominated background fields. Nevertheless, it is hard to generate fully open magnetic eruptions for multipolar background fields. Given the observational importance of the multipolar magnetic fields in the vicinity of the magnetar surface, it would be worthwhile to explore the possibility of the alternative eruption approach in multipolar background fields. Fortunately, we find that flux ropes may give rise to partially open eruptions in the multipolar fields, which involve only partial opening of background fields. The energy release fractions are greater for cases with central-arcaded multipoles than those with central-caved multipoles that emerged in background fields. Eruptions would fail only when the centrally caved multipoles become extremely strong.« less
NASA Technical Reports Server (NTRS)
Sutter, B.; Archer, D.; McAdam, A.; Franz, H.; Ming, D. W.; Eigenbrode, J. L.; Glavin, D. P.; Mahaffy, P.; Stern, J.; Navarro-Gonzalez, R.
2013-01-01
The Sample Analysis at Mars (SAM) instrument detected four releases of carbon dioxide (CO2) that ranged from 100 to 700 C from the Rocknest eolian bedform material (Fig. 1). Candidate sources of CO2 include adsorbed CO2, carbonate(s), combusted organics that are either derived from terrestrial contamination and/or of martian origin, occluded or trapped CO2, and other sources that have yet to be determined. The Phoenix Lander s Thermal Evolved Gas Analyzer (TEGA) detected two CO2 releases (400-600, 700-840 C) [1,2]. The low temperature release was attributed to Fe- and/or Mg carbonates [1,2], per-chlorate interactions with carbonates [3], nanophase carbonates [4] and/or combusted organics [1]. The high temperature CO2 release was attributed to a calcium bearing carbonate [1,2]. No evidence of a high temperature CO2 release similar to the Phoenix material was detected in the Rocknest materials by SAM. The objectives of this work are to evaluate the temperature and total contribution of each Rocknest CO2 release and their possible sources. Four CO2 releases from the Rocknest material were detected by SAM. Potential sources of CO2 are adsorbed CO2, (peak 1) and Fe/Mg carbonates (peak 4). Only a fraction of peaks 2 and 3 (0.01 C wt.%) may be partially attributed to combustion of organic contamination. Meteoritic organics mixed in the Rocknest bedform could be present, but the peak 2 and 3 C concentration (approx.0.21 C wt. %) is likely too high to be attributed solely to meteoritic organic C. Other inorganic sources of C such as interactions of perchlorates and carbonates and sources yet to be identified will be evaluated to account for CO2 released from the thermal decomposition of Rocknest material.
Hazardous material releases due to unsecured openings and lining failures, volume 1.
DOT National Transportation Integrated Search
1990-12-01
In response to the large number of unintentional releases of hazardous : materials from railroad tank cars for which accidents were not the cause, the : Federal Railroad Administration (FRA) initiated this study to (1) recommend : procedures to ensur...
Open Source 2010: Reflections on 2007
ERIC Educational Resources Information Center
Wheeler, Brad
2007-01-01
Colleges and universities and commercial firms have demonstrated great progress in realizing the vision proffered for "Open Source 2007," and 2010 will mark even greater progress. Although much work remains in refining open source for higher education applications, the signals are now clear: the collaborative development of software can provide…
Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments
ERIC Educational Resources Information Center
Wang, Shuo; Wang, Jing; Gao, Yanjing
2017-01-01
An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…
Creating Open Source Conversation
ERIC Educational Resources Information Center
Sheehan, Kate
2009-01-01
Darien Library, where the author serves as head of knowledge and learning services, launched a new website on September 1, 2008. The website is built with Drupal, an open source content management system (CMS). In this article, the author describes how she and her colleagues overhauled the library's website to provide an open source content…
Integrating an Automatic Judge into an Open Source LMS
ERIC Educational Resources Information Center
Georgouli, Katerina; Guerreiro, Pedro
2011-01-01
This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. although it was originally designed for programming competitions, Mooshak has also…
76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...
The open-source movement: an introduction for forestry professionals
Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove
2005-01-01
In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....
Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.
ERIC Educational Resources Information Center
Newby, Gregory B.; Greenberg, Jane; Jones, Paul
2003-01-01
Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)
Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat.
Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart
2015-04-21
Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback is gaining attention in recent years, after proved more effective than conventional DBS in terms of pathological symptom control clinically. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation.
Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat
Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart
2015-01-01
Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback is gaining attention in recent years, after proved more effective than conventional DBS in terms of pathological symptom control clinically. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation. PMID:25897892
ERIC Educational Resources Information Center
Guhlin, Miguel
2007-01-01
A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…
Bhardwaj, Anshu; Scaria, Vinod; Raghava, Gajendra Pal Singh; Lynn, Andrew Michael; Chandra, Nagasuma; Banerjee, Sulagna; Raghunandanan, Muthukurussi V; Pandey, Vikas; Taneja, Bhupesh; Yadav, Jyoti; Dash, Debasis; Bhattacharya, Jaijit; Misra, Amit; Kumar, Anil; Ramachandran, Srinivasan; Thomas, Zakir; Brahmachari, Samir K
2011-09-01
It is being realized that the traditional closed-door and market driven approaches for drug discovery may not be the best suited model for the diseases of the developing world such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for patients suffering from these diseases, it is necessary to formulate an alternate paradigm of drug discovery process. The current model constrained by limitations for collaboration and for sharing of resources with confidentiality hampers the opportunities for bringing expertise from diverse fields. These limitations hinder the possibilities of lowering the cost of drug discovery. The Open Source Drug Discovery project initiated by Council of Scientific and Industrial Research, India has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open-sharing, taking up multi-faceted approaches and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives towards continued pursuit for new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trial in a non-exclusive manner by participation of multiple companies with majority funding from Open Source Drug Discovery. This will ensure availability of drugs through a lower cost community driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully what LINUX the World Wide Web have done for the information technology, Open Source Drug Discovery will do for drug discovery. Copyright © 2011 Elsevier Ltd. All rights reserved.
State-of-the-practice and lessons learned on implementing open data and open source policies.
DOT National Transportation Integrated Search
2012-05-01
This report describes the current government, academic, and private sector practices associated with open data and open source application development. These practices are identified; and the potential uses with the ITS Programs Data Capture and M...
OPM: The Open Porous Media Initiative
NASA Astrophysics Data System (ADS)
Flemisch, B.; Flornes, K. M.; Lie, K.; Rasmussen, A.
2011-12-01
The principal objective of the Open Porous Media (OPM) initiative is to develop a simulation suite that is capable of modeling industrially and scientifically relevant flow and transport processes in porous media and bridge the gap between the different application areas of porous media modeling, including reservoir mechanics, CO2 sequestration, biological systems, and product development of engineered media. The OPM initiative will provide a long-lasting, efficient, and well-maintained open-source software for flow and transport in porous media built on modern software principles. The suite is released under the GNU General Public License (GPL). Our motivation is to provide a means to unite industry and public research on simulation of flow and transport in porous media. For academic users, we seek to provide a software infrastructure that facilitates testing of new ideas on models with industry-standard complexity, while at the same time giving the researcher control over discretization and solvers. Similarly, we aim to accelerate the technology transfer from academic institutions to professional companies by making new research results available as free software of professional standard. The OPM initiative is currently supported by six research groups in Norway and Germany and funded by existing grants from public research agencies as well as from Statoil Petroleum and Total E&P Norge. However, a full-scale development of the OPM initiative requires substantially more funding and involvement of more research groups and potential end users. In this talk, we will provide an overview of the current activities in the OPM initiative. Special emphasis will be given to the demonstration of the synergies achieved by combining the strengths of individual open-source software components. In particular, a new fully implicit solver developed within the DUNE-based simulator DuMux could be enhanced by the ability to read industry-standard Eclipse input files and to run on grids given in corner-point format. Examples taken from the SPE comparative solution projects and CO2 sequestration benchmarks illustrate the current capabilities of the simulation suite.