ERIC Educational Resources Information Center
Guhlin, Miguel
2007-01-01
Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…
The openEHR Java reference implementation project.
Chen, Rong; Klein, Gunnar
2007-01-01
The openEHR foundation has developed an innovative design for interoperable and future-proof Electronic Health Record (EHR) systems based on a dual model approach with a stable reference information model complemented by archetypes for specific clinical purposes. A team from Sweden has implemented all the stable specifications in the Java programming language and donated the source code to the openEHR foundation. It was adopted as the openEHR Java Reference Implementation in March 2005 and released under open source licenses. This encourages early EHR implementation projects around the world and a number of groups have already started to use this code. The early Java implementation experience has also led to the publication of the openEHR Java Implementation Technology Specification. A number of design changes to the specifications and important minor corrections have been directly initiated by the implementation project over the last two years. The Java Implementation has been important for the validation and improvement of the openEHR design specifications and provides building blocks for future EHR systems.
The Commercial Open Source Business Model
NASA Astrophysics Data System (ADS)
Riehle, Dirk
Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.
Comparing the biocidal properties of non-thermal plasma sources by reference protocol
NASA Astrophysics Data System (ADS)
Khun, Josef; Jirešová, Jana; Kujalová, Lucie; Hozák, Pavel; Scholtz, Vladimír
2017-10-01
The previously proposed reference protocol, which enables easy comparison of the biocidal properties of different non-thermal plasma sources, has been followed and discussed. For the inactivation tests, the reference protocol uses spores of the Gram-positive bacterium Bacillus subtilis (ATCC 6633) deposited on a polycarbonate membrane as the reference sample. In this work, the biocidal properties of a negative glow corona, a positive streamer corona, a positive transient spark and cometary discharges are compared in both an open-air and a closed apparatus. Although the total number of bacteria surviving 1 h of exposure decreased by up to 7 orders of magnitude in the closed apparatus, only a weak bactericidal effect was observed in the open one.
Openness, Web 2.0 Technology, and Open Science
ERIC Educational Resources Information Center
Peters, Michael A.
2010-01-01
Open science is a term that is being used in the literature to designate a form of science based on open source models or that utilizes principles of open access, open archiving and open publishing to promote scientific communication. Open science increasingly also refers to open governance and more democratized engagement and control of science…
Open Source Software and the Intellectual Commons.
ERIC Educational Resources Information Center
Dorman, David
2002-01-01
Discusses the Open Source Software method of software development and its relationship to control over information content. Topics include digital library resources; reference services; preservation; the legal and economic status of information; technical standards; access to digital data; control of information use; and copyright and patent laws.…
Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.
Benson, Tim
2016-07-04
Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers and how it can work as a business model in health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.
OpenSHMEM-UCX : Evaluation of UCX for implementing OpenSHMEM Programming Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Matthew B; Gorentla Venkata, Manjunath; Aderholdt, William Ferrol
2016-01-01
The OpenSHMEM reference implementation was developed towards the goal of developing an open source and high-performing OpenSHMEM implementation. To achieve portability and performance across various networks, the OpenSHMEM reference implementation uses GASNet and UCCS for network operations. Recently, new network layers have emerged with the promise of providing high performance, scalability, and portability for HPC applications. In this paper, we implement the OpenSHMEM reference implementation to use the UCX framework for network operations. Then, we evaluate its performance and scalability on Cray XK systems to understand UCX's suitability for developing the OpenSHMEM programming model. Further, we develop a benchmark called SHOMS for evaluating the OpenSHMEM implementation. Our experimental results show that OpenSHMEM-UCX outperforms the vendor-supplied OpenSHMEM implementation in most cases on the Cray XK system by up to 40% with respect to message rate and up to 70% for the execution of application kernels.
Open-Source Low-Cost Wireless Potentiometric Instrument for pH Determination Experiments
ERIC Educational Resources Information Center
Jin, Hao; Qin, Yiheng; Pan, Si; Alam, Arif U.; Dong, Shurong; Ghosh, Raja; Deen, M. Jamal
2018-01-01
pH determination is an essential experiment in many chemistry laboratories. It requires a potentiometric instrument with extremely low input bias current to accurately measure the voltage between a pH sensing electrode and a reference electrode. In this technology report, we propose an open-source potentiometric instrument for pH determination…
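The voltage-to-pH conversion such an instrument performs follows the Nernst relation; a minimal sketch, where the isopotential offset `e0` and the ideal Nernstian slope are illustrative assumptions rather than values from the instrument described above:

```python
def voltage_to_ph(e_volts, e0=0.0, temp_c=25.0):
    """Ideal Nernstian response: E = E0 - slope * (pH - 7).

    slope = R*T/F * ln(10), which is about 0.05916 V per pH unit at 25 C.
    """
    slope = 8.314 * (temp_c + 273.15) / 96485.0 * 2.302585
    return 7.0 + (e0 - e_volts) / slope

print(round(voltage_to_ph(0.0), 2))       # 0 V at the isopotential point -> pH 7.0
print(round(voltage_to_ph(-0.05916), 2))  # one Nernstian slope below -> pH 8.0
```

The extremely low input bias current the report emphasizes matters because any current drawn through the high-impedance glass electrode would shift `e_volts` away from this ideal relation.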
40 CFR 49.129 - Rule for limiting emissions of sulfur dioxide.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine, nonroad vehicle, open burning, process source, reference method, refuse, residual fuel oil, solid fuel, stack, standard conditions...
40 CFR 49.129 - Rule for limiting emissions of sulfur dioxide.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine, nonroad vehicle, open burning, process source, reference method, refuse, residual fuel oil, solid fuel, stack, standard conditions...
Open Genetic Code: on open source in the life sciences.
Deibel, Eric
2014-01-01
The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach of genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life that is understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.
Cultural Geography Model Validation
2010-03-01
the Cultural Geography Model (CGM), a government owned, open source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of...referent determined either from theory or SME opinion. 4. CGM Overview The CGM is a government-owned, open source, data driven multi-agent social...HSCB, validation, social network analysis ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S
Rideout, Jai Ram; He, Yan; Navas-Molina, Jose A; Walters, William A; Ursell, Luke K; Gibbons, Sean M; Chase, John; McDonald, Daniel; Gonzalez, Antonio; Robbins-Pianka, Adam; Clemente, Jose C; Gilbert, Jack A; Huse, Susan M; Zhou, Hong-Wei; Knight, Rob; Caporaso, J Gregory
2014-01-01
We present a performance-optimized algorithm, subsampled open-reference OTU picking, for assigning marker gene (e.g., 16S rRNA) sequences generated on next-generation sequencing platforms to operational taxonomic units (OTUs) for microbial community analysis. This algorithm provides benefits over de novo OTU picking (clustering can be performed largely in parallel, reducing runtime) and closed-reference OTU picking (all reads are clustered, not only those that match a reference database sequence with high similarity). Because more of our algorithm can be run in parallel relative to "classic" open-reference OTU picking, it makes open-reference OTU picking tractable on massive amplicon sequence data sets (though on smaller data sets, "classic" open-reference OTU clustering is often faster). We illustrate this by applying it to the first 15,000 samples sequenced for the Earth Microbiome Project (1.3 billion V4 16S rRNA amplicons). To the best of our knowledge, this is the largest OTU picking run ever performed, and we estimate that our new algorithm runs in less than 1/5 of the time that would be required for "classic" open-reference OTU picking. We show that subsampled open-reference OTU picking yields results that are highly correlated with those generated by "classic" open-reference OTU picking through comparisons on three well-studied datasets. An implementation of this algorithm is provided in the popular QIIME software package, which uses uclust for read clustering. All analyses were performed using QIIME's uclust wrappers, though we provide details (aided by the open-source code in our GitHub repository) that will allow implementation of subsampled open-reference OTU picking independently of QIIME (e.g., in a compiled programming language, where runtimes should be further reduced). Our analyses should generalize to other implementations of these OTU picking algorithms.
Finally, we present a comparison of parameter settings in QIIME's OTU picking workflows and make recommendations on settings for these free parameters to optimize runtime without reducing the quality of the results. These optimized parameters can vastly decrease the runtime of uclust-based OTU picking in QIIME.
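The four-step control flow of subsampled open-reference OTU picking can be sketched with a toy stand-in for uclust; the `toy_match` prefix heuristic, the sequence data, and the singleton final clustering are all simplifications for illustration, not the algorithm's real ~97%-identity clustering:

```python
import random

def toy_match(seq, refs, k=8):
    """Stand-in for a uclust closed-reference search: a hit is a shared 8-mer prefix."""
    for name, ref in refs.items():
        if seq[:k] == ref[:k]:
            return name
    return None

def subsampled_open_reference(seqs, refs, subsample_frac=0.1, seed=0):
    """Illustrative control flow of subsampled open-reference OTU picking."""
    otus, fails = {}, []
    # Step 1: closed-reference picking against the full reference set (parallelizable).
    for sid, seq in seqs.items():
        hit = toy_match(seq, refs)
        if hit:
            otus.setdefault(hit, []).append(sid)
        else:
            fails.append(sid)
    # Step 2: de novo cluster a random subsample of the failures; the resulting
    # cluster seeds serve as a "new reference" set (here, the sampled reads themselves).
    random.seed(seed)
    n = max(1, int(len(fails) * subsample_frac)) if fails else 0
    new_refs = {f"new.{i}": seqs[sid] for i, sid in enumerate(random.sample(fails, n))}
    # Step 3: closed-reference picking of all failures against the new references.
    remaining = []
    for sid in fails:
        hit = toy_match(seqs[sid], new_refs)
        if hit:
            otus.setdefault(hit, []).append(sid)
        else:
            remaining.append(sid)
    # Step 4: final de novo clustering of anything still unassigned
    # (simplified here to singleton OTUs).
    for i, sid in enumerate(remaining):
        otus[f"denovo.{i}"] = [sid]
    return otus

refs = {"ref1": "AAAAAAAAGGGG"}
seqs = {"s1": "AAAAAAAATTTT", "s2": "CCCCCCCCTTTT", "s3": "CCCCCCCCAAAA"}
result = subsampled_open_reference(seqs, refs)
print(sorted(result))  # s1 joins ref1; s2 and s3 cluster into one new OTU
```

The parallelism gain comes from steps 1 and 3, where each read is matched independently; only the small subsample in step 2 and the leftovers in step 4 require serial de novo clustering.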
Mobile Technologies and Augmented Reality in Open Education
ERIC Educational Resources Information Center
Kurubacak, Gulsun, Ed.; Altinpulluk, Hakan, Ed.
2017-01-01
Novel trends and innovations have enhanced contemporary educational environments. When applied properly, these computing advances can create enriched learning opportunities for students. "Mobile Technologies and Augmented Reality in Open Education" is a pivotal reference source for the latest academic research on the integration of…
OpCost: an open-source system for estimating costs of stand-level forest operations
Conor K. Bell; Robert F. Keefe; Jeremy S. Fried
2017-01-01
This report describes and documents the OpCost forest operations cost model, a key component of the BioSum analysis framework. OpCost is available in two editions: as a callable module for use with BioSum, and in a stand-alone edition that can be run directly from R. OpCost model logic and assumptions for this open-source tool are explained, references to the...
Git, K-A; Fioravante, L A B; Fernandes, J L
2015-09-01
To assess whether an online open-source tool would provide accurate calculations of T2* values for iron concentrations in the liver and heart compared with standard reference software. An online open-source tool, written in pure HTML5/Javascript, was tested in 50 patients (age 26.0 ± 18.9 years, 46% males) who underwent T2* MRI of the liver and heart for iron overload assessment as part of their routine workup. Automated truncation correction was the default with optional manual adjustment provided if needed. The results were compared against a standard reference measurement using commercial software with manual truncation (cvi42® v. 5.1; Circle Cardiovascular Imaging; Calgary, AB). The mean liver T2* value calculated with the automated tool was 4.3 ms [95% confidence interval (CI) 3.1 to 5.5 ms] vs 4.26 ms using the reference software (95% CI 3.1 to 5.4 ms) without any significant differences (p = 0.71). In the liver, the mean difference was 0.036 ms (95% CI -0.1609 to 0.2329 ms) with a regression correlation coefficient of 0.97. For the heart, the automated T2* value was 26.0 ms (95% CI 22.9 to 29.0 ms) vs 25.3 ms (95% CI 22.3 to 28.3 ms), p = 0.28. The mean difference was 0.72 ms (95% CI 0.08191 to 1.3621 ms) with a correlation coefficient of 0.96. The automated online tool provides similar T2* values for the liver and myocardial iron concentrations as compared with standard reference software. The online program provides an open-source tool for the calculation of T2* values, incorporating an automated correction algorithm in a simple and easy-to-use interface.
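The underlying T2* estimation fits a monoexponential decay to the multi-echo signal, discarding late echoes near the noise floor; a simplified log-linear sketch on synthetic data, not the tool's exact truncation-correction algorithm:

```python
import numpy as np

def fit_t2star(te_ms, signal, noise_floor=0.0):
    """Log-linear fit of S = S0 * exp(-TE / T2*), truncating echoes at or
    below the noise floor. Returns T2* in the same units as te_ms."""
    te = np.asarray(te_ms, dtype=float)
    s = np.asarray(signal, dtype=float)
    keep = s > noise_floor                       # truncation step
    slope, _ = np.polyfit(te[keep], np.log(s[keep]), 1)
    return -1.0 / slope

te = np.arange(1, 13, 2.0)            # echo times in ms
sig = 100.0 * np.exp(-te / 4.3)       # synthetic liver-like decay, T2* = 4.3 ms
print(round(fit_t2star(te, sig), 2))  # -> 4.3
```

In severe iron overload the late echoes decay into noise, which is why truncation (automated in the tool above, manual in the reference software) is needed before the fit.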
Cox, B L; Ludwig, K D; Adamson, E B; Eliceiri, K W; Fain, S B
2018-03-01
In medical imaging, clinicians, researchers and technicians have begun to use 3D printing to create specialized phantoms to replace commercial ones due to their customizable and iterative nature. Presented here is the design of a 3D printed open source, reusable magnetic resonance imaging (MRI) phantom, capable of flood-filling, with removable samples for measurements of contrast agent solutions and reference standards, and for use in evaluating acquisition techniques and image reconstruction performance. The phantom was designed using SolidWorks, a computer-aided design software package. The phantom consists of custom and off-the-shelf parts and incorporates an air hole and Luer Lock system to aid in flood filling, a marker for orientation of samples in the filled mode and bolt and tube holes for assembly. The cost of construction for all materials is under $90. All design files are open-source and available for download. To demonstrate utility, B0 field mapping was performed using a series of gadolinium concentrations in both the unfilled and flood-filled mode. An excellent linear agreement (R2 > 0.998) was observed between measured relaxation rates (R1/R2) and gadolinium concentration. The phantom provides a reliable setup to test data acquisition and reconstruction methods and verify physical alignment in alternative nuclei MRI techniques (e.g. carbon-13 and fluorine-19 MRI). A cost-effective, open-source MRI phantom design for repeated quantitative measurement of contrast agents and reference standards in preclinical research is presented. Specifically, the work is an example of how the emerging technology of 3D printing improves flexibility and access for custom phantom design.
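The reported linear agreement between relaxation rate and gadolinium concentration reflects the standard relaxivity relation R1 = R1_0 + r1·[Gd]; a sketch with invented numbers (the concentrations, relaxivity, and baseline rate are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical relaxation-rate measurements vs gadolinium concentration.
conc_mM = np.array([0.0, 0.25, 0.5, 1.0, 2.0])   # [Gd] in mM
r1_per_s = 0.4 + 4.0 * conc_mM                    # R1 = R1_0 + r1*[Gd]

# Linear fit recovers the relaxivity (slope) and baseline rate (intercept).
r1_slope, r1_0 = np.polyfit(conc_mM, r1_per_s, 1)
pred = r1_0 + r1_slope * conc_mM
ss_res = np.sum((r1_per_s - pred) ** 2)
ss_tot = np.sum((r1_per_s - r1_per_s.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(round(r1_slope, 3), round(r_squared, 4))  # -> 4.0 1.0
```

An R² above 0.998, as the phantom study reports, indicates the removable samples produced relaxation rates that track concentration almost perfectly linearly.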
Brown, Alisa; Uneri, Ali; Silva, Tharindu De; Manbachi, Amir; Siewerdsen, Jeffrey H
2018-04-01
Dynamic reference frames (DRFs) are a common component of modern surgical tracking systems; however, the limited number of commercially available DRFs poses a constraint in developing systems, especially for research and education. This work presents the design and validation of a large, open-source library of DRFs compatible with passive, single-face tracking systems, such as Polaris stereoscopic infrared trackers (NDI, Waterloo, Ontario). An algorithm was developed to create new DRF designs consistent with intra- and intertool design constraints and convert to computer-aided design (CAD) files suitable for three-dimensional printing. A library of 10 such groups, each with 6 to 10 DRFs, was produced and tracking performance was validated in comparison to a standard commercially available reference, including pivot calibration, fiducial registration error (FRE), and target registration error (TRE). Pivot tests showed calibration error [Formula: see text], indistinguishable from the reference. FRE was [Formula: see text], and TRE in a CT head phantom was [Formula: see text], both equivalent to the reference. The library of DRFs offers a useful resource for surgical navigation research and could be extended to other tracking systems and alternative design constraints.
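The pivot calibration used in such validation can be posed as a linear least-squares problem over tracked poses; a hedged sketch on synthetic data (this stacked-system formulation is a standard approach, not necessarily the one used in this work):

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Least-squares pivot calibration: each pose satisfies
    R_i @ p_tip + t_i = p_pivot, stacked as [R_i | -I] x = -t_i
    with x = [p_tip; p_pivot]."""
    a_rows, b_rows = [], []
    for r, t in zip(rotations, translations):
        a_rows.append(np.hstack([r, -np.eye(3)]))
        b_rows.append(-t)
    x, *_ = np.linalg.lstsq(np.vstack(a_rows), np.hstack(b_rows), rcond=None)
    return x[:3], x[3:]  # p_tip (tool frame), p_pivot (tracker frame)

# Synthetic check: a known tip offset pivoting about a known fixed point.
rng = np.random.default_rng(1)
true_tip = np.array([0.0, 0.0, 150.0])
true_pivot = np.array([10.0, -20.0, 30.0])
rots, trans = [], []
for _ in range(20):
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthonormal matrix
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1                              # force a proper rotation
    rots.append(q)
    trans.append(true_pivot - q @ true_tip)        # consistent tracked translation
tip, pivot = pivot_calibration(rots, trans)
print(np.allclose(tip, true_tip), np.allclose(pivot, true_pivot))  # -> True True
```

With real tracker data the residual of this fit is the pivot calibration error reported above; here the synthetic poses are noiseless, so the recovery is exact.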
Kisand, Veljo; Lettieri, Teresa
2013-04-01
De novo genome sequencing of previously uncharacterized microorganisms has the potential to open up new frontiers in microbial genomics by providing insight into both functional capabilities and biodiversity. Until recently, Roche 454 pyrosequencing was the NGS method of choice for de novo assembly because it generates hundreds of thousands of long reads (<450 bp), which are presumed to aid in the analysis of uncharacterized genomes. The tools for processing NGS data are increasingly free and open source and are often adopted for both their high quality and role in promoting academic freedom. The error rate of pyrosequencing the Alcanivorax borkumensis genome was such that thousands of insertions and deletions were artificially introduced into the finished genome. Despite a high coverage (~30 fold), it did not allow the reference genome to be fully mapped. Reads from regions with errors had low quality, low coverage, or were missing. The main defect of the reference mapping was the introduction of artificial indels into contigs through lower than 100% consensus and distracting gene calling due to artificial stop codons. No assembler was able to perform de novo assembly comparable to reference mapping. Automated annotation tools performed similarly on reference mapped and de novo draft genomes, and annotated most CDSs in the de novo assembled draft genomes. Free and open source software (FOSS) tools for assembly and annotation of NGS data are being developed rapidly to provide accurate results with less computational effort. Usability is not a high priority and these tools currently do not allow the data to be processed without manual intervention. Despite this, genome assemblers now readily assemble medium short reads into long contigs (>97-98% genome coverage). A notable gap in pyrosequencing technology is the quality of base pair calling and conflicting base pairs between single reads at the same nucleotide position.
Regardless, using draft whole genomes that are not finished and remain fragmented into tens of contigs allows one to characterize unknown bacteria with modest effort.
Mythologies of the World: A Guide to Sources.
ERIC Educational Resources Information Center
Smith, Ron
This book surveys the important available books on mythologies of all parts of the globe and the cultural contexts from which the mythological traditions emerged. Written as a series of bibliographic essays, the guide opens with a description of major reference sources encompassing many cultures, as well as those tracing particular themes (such as…
JETSPIN: A specific-purpose open-source software for simulations of nanofiber electrospinning
NASA Astrophysics Data System (ADS)
Lauricella, Marco; Pontrelli, Giuseppe; Coluzza, Ivan; Pisignano, Dario; Succi, Sauro
2015-12-01
We present the open-source computer program JETSPIN, specifically designed to simulate the electrospinning process of nanofibers. Its capabilities are shown with proper reference to the underlying model, as well as a description of the relevant input variables and associated test-case simulations. The various interactions included in the electrospinning model implemented in JETSPIN are discussed in detail. The code is designed to exploit different computational architectures, from single to parallel processor workstations. This paper provides an overview of JETSPIN, focusing primarily on its structure, parallel implementations, functionality, performance, and availability.
Mashup Scheme Design of Map Tiles Using Lightweight Open Source Webgis Platform
NASA Astrophysics Data System (ADS)
Hu, T.; Fan, J.; He, H.; Qin, L.; Li, G.
2018-04-01
To address the difficulty of integrating multi-source image data with existing commercial Geographic Information System platforms, this research proposes the loading of multi-source local tile data based on CesiumJS and examines the tile data organization mechanisms and spatial reference differences of the CesiumJS platform, as well as various tile data sources, such as Google maps, Map World, and Bing maps. Two types of tile data loading schemes have been designed for the mashup of tiles: the single data source loading scheme and the multi-data source loading scheme. The digital map tile sources used in this paper cover two mainstream spatial references, the WGS84 coordinate system and the Web Mercator coordinate system. According to the experimental results, the single data source loading scheme and the multi-data source loading scheme with the same spatial coordinate system showed favorable visualization effects; however, the multi-data source loading scheme was prone to tile image deformation when loading multi-source tile data with different spatial references. The resulting method provides a low cost and highly flexible solution for small and medium-scale GIS programs and shows potential for practical application. The problem of deformation during the transition of different spatial references is an important topic for further research.
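The spatial-reference mismatch comes down to different tile indexing schemes; a sketch comparing Web Mercator indexing with an equirectangular WGS84 layout (the WGS84 layout details, two tiles wide at zoom 0, are an assumption here, as providers vary):

```python
import math

def mercator_tile(lat_deg, lon_deg, zoom):
    """Tile column/row in the Web Mercator scheme used by Google and Bing maps
    (one tile covering the world at zoom 0)."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y

def wgs84_tile(lat_deg, lon_deg, zoom):
    """Tile column/row in an equirectangular WGS84 scheme (two tiles wide at
    zoom 0), as used by services such as Map World."""
    cols, rows = 2 ** (zoom + 1), 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * cols)
    y = int((90.0 - lat_deg) / 180.0 * rows)
    return x, y

# The same point maps to different tile indices under the two references,
# which is why mixing them in one mashup distorts the imagery.
print(mercator_tile(48.86, 2.35, 10), wgs84_tile(48.86, 2.35, 10))
```

A loading scheme therefore has to reproject (or at least re-index) one tile pyramid before overlaying it on the other, which is where the deformation observed in the experiments originates.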
Datacube Services in Action, Using Open Source and Open Standards
NASA Astrophysics Data System (ADS)
Baumann, P.; Misev, D.
2016-12-01
Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source on www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use, with databases individually holding dozens of terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman is the WCS Core Reference Implementation. In our talk we present concepts, architecture, operational services, and standardization impact of open-source rasdaman, as well as experiences made.
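The flavor of these declarative array queries can be illustrated with a short rasql example; the collection name `LandsatScene` and the subset indices are invented for illustration:

```
select avg_cells( c[1000:1999, 2000:2999] )
from   LandsatScene as c
```

The server evaluates the spatial subset and the aggregate entirely on its side, so only the scalar result crosses the network, which is the "any query, anytime" point made above.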
Learning Opportunities With Creation of Open Source Textbooks (LOW COST) Act of 2009
Rep. Foster, Bill [D-IL-14
2009-03-12
House - 05/14/2009 Referred to the Subcommittee on Higher Education, Lifelong Learning, and Competitiveness. Status: Introduced.
Steady-state capabilities for hydroturbines with OpenFOAM
NASA Astrophysics Data System (ADS)
Page, M.; Beaudoin, M.; Giroux, A. M.
2010-08-01
The availability of a high quality Open Source CFD simulation platform like OpenFOAM offers new R&D opportunities by providing direct access to models and solver implementation details. Efforts have been made by Hydro-Québec to adapt OpenFOAM to hydroturbines for the development of steady-state capabilities. The paper describes the developments that have been made to implement new turbomachinery related capabilities: a Multiple Frame of Reference solver, domain coupling interfaces (GGI, cyclicGGI and mixing plane) and specialized boundary conditions. Practical use of the new turbomachinery capabilities is demonstrated for the analysis of a 195-MW Francis hydroturbine.
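In recent OpenFOAM releases, a Multiple Reference Frame zone is configured declaratively; a minimal sketch of a `constant/MRFProperties` entry, with the zone name, patch list, and rotation rate as placeholder values (the 2010-era developments described above predate this exact dictionary format):

```
MRF1
{
    cellZone    rotor;               // cells attached to the rotating frame
    active      yes;
    nonRotatingPatches (shaft);      // patches excluded from the rotation
    origin      (0 0 0);             // a point on the rotation axis
    axis        (0 0 1);             // rotation axis direction
    omega       constant 12.57;      // rotation rate in rad/s
}
```

The MRF approach adds the frame's Coriolis and centrifugal terms in the rotor zone instead of moving the mesh, which is what makes steady-state turbomachinery analysis tractable.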
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.
VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves in four primary roles: •A reference platform for researchers to quickly develop control applications for transactive energy. •A reference platform with flexible data store support for energy analytics applications either in academia or in commercial enterprise. •A platform from which commercial enterprise can develop products without license issues and easily integrate into their product line. •An accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy’s Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.
2013-12-01
...explosive device; ISR: intelligence, surveillance and reconnaissance; J2: joint staff intelligence section; NATO: North Atlantic Treaty Organization; OSINT: open...commonly referring to the technical means of collection to the exclusion of HUMINT, open source intelligence (OSINT), and other information coming...is using to gain popular support or to delegitimize the supported government. The 10th Mountain Division determined OSINT was an important component
A Stigmergy Approach for Open Source Software Developer Community Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Xiaohui; Beaver, Justin M; Potok, Thomas E
2009-01-01
The stigmergy collaboration approach provides a hypothesized explanation of how online groups work together. In this research, we present a stigmergy approach for building an agent-based open source software (OSS) developer community collaboration simulation. We use a group of actors who collaborate on OSS projects as our frame of reference and investigate how the choices actors make in contributing their work to the projects determine the global status of the whole OSS projects. In our simulation, the forum posts and project codes serve as the digital pheromone, and the modified Pierre-Paul Grasse pheromone model is used for computing developer agent behavior selection probability.
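The pheromone-weighted choice at the heart of such a simulation can be sketched generically; this is a standard stigmergy selection rule, not the paper's exact Grasse-model formulation:

```python
def selection_probabilities(pheromone, alpha=1.0):
    """Probability that a developer agent picks each project, weighted by its
    digital pheromone level (e.g., accumulated forum posts and code commits).
    alpha controls how strongly agents favor high-pheromone projects."""
    weights = [p ** alpha for p in pheromone]
    total = sum(weights)
    return [w / total for w in weights]

# A project with 8x the pheromone of its rivals attracts 80% of new contributions.
probs = selection_probabilities([8.0, 1.0, 1.0], alpha=1.0)
print([round(p, 2) for p in probs])  # -> [0.8, 0.1, 0.1]
```

Because contributions deposit more pheromone, this rule produces the positive feedback characteristic of stigmergy: active OSS projects attract further activity while quiet ones decay.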
Trujillo, Logan T.; Stanfield, Candice T.; Vela, Ruben D.
2017-01-01
Converging evidence suggests that human cognition and behavior emerge from functional brain networks interacting on local and global scales. We investigated two information-theoretic measures of functional brain segregation and integration—interaction complexity CI(X), and integration I(X)—as applied to electroencephalographic (EEG) signals and how these measures are affected by choice of EEG reference. CI(X) is a statistical measure of the system entropy accounted for by interactions among its elements, whereas I(X) indexes the overall deviation from statistical independence of the individual elements of a system. We recorded 72 channels of scalp EEG from human participants who sat in a wakeful resting state (interleaved counterbalanced eyes-open and eyes-closed blocks). CI(X) and I(X) of the EEG signals were computed using four different EEG references: linked-mastoids (LM) reference, average (AVG) reference, a Laplacian (LAP) “reference-free” transformation, and an infinity (INF) reference estimated via the Reference Electrode Standardization Technique (REST). Fourier-based power spectral density (PSD), a standard measure of resting state activity, was computed for comparison and as a check of data integrity and quality. We also performed dipole source modeling in order to assess the accuracy of neural source CI(X) and I(X) estimates obtained from scalp-level EEG signals. CI(X) was largest for the LAP transformation, smallest for the LM reference, and at intermediate values for the AVG and INF references. I(X) was smallest for the LAP transformation, largest for the LM reference, and at intermediate values for the AVG and INF references. Furthermore, across all references, CI(X) and I(X) reliably distinguished between resting-state conditions (larger values for eyes-open vs. eyes-closed). These findings occurred in the context of the overall expected pattern of resting state PSD. 
Dipole modeling showed that simulated scalp EEG-level CI(X) and I(X) reflected changes in underlying neural source dependencies, but only for higher levels of integration and with highest accuracy for the LAP transformation. Our observations suggest that the Laplacian transformation should be preferred for the computation of scalp-level CI(X) and I(X) due to its positive impact on EEG signal quality and statistics, reduction of volume conduction, and the higher accuracy this provides when estimating scalp-level EEG complexity and integration. PMID:28790884
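Under a Gaussian approximation, the integration measure I(X) = Σᵢ H(Xᵢ) − H(X) reduces to a closed form in the channel covariance; a minimal sketch on synthetic signals (a generic estimator, not the authors' exact pipeline):

```python
import numpy as np

def integration_gaussian(data):
    """Integration I(X) = sum_i H(X_i) - H(X) for Gaussian signals:
    I(X) = 0.5 * (sum_i ln(var_i) - ln det(cov)). Rows are channels.
    Zero iff the channels are statistically independent."""
    cov = np.cov(data)
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (np.sum(np.log(np.diag(cov))) - logdet)

rng = np.random.default_rng(0)
independent = rng.normal(size=(4, 5000))           # independent channels -> I(X) ~ 0
shared = independent + rng.normal(size=(1, 5000))  # common source couples the channels
print(integration_gaussian(independent) < integration_gaussian(shared))  # -> True
```

The coupling injected by the shared source mimics what volume conduction or a common reference does to scalp EEG, which is why the choice of reference shifts I(X) in the results above.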
NASA Astrophysics Data System (ADS)
Edwards, Brian E.; Nitkowski, Arthur; Lawrence, Ryan; Horton, Kasey; Higgs, Charles
2004-10-01
Atmospheric turbulence and laser-induced thermal blooming effects can degrade the beam quality of a high-energy laser (HEL) weapon, and ultimately limit the amount of energy deliverable to a target. Lincoln Laboratory has built a thermal blooming laboratory capable of emulating atmospheric thermal blooming and turbulence effects for tactical HEL systems. The HEL weapon emulation hardware includes an adaptive optics beam delivery system, which utilizes a Shack-Hartmann wavefront sensor and a 349-actuator deformable mirror. For this experiment, the laboratory was configured to emulate an engagement scenario consisting of a sea-skimming target approaching directly toward the HEL weapon at a range of 10km. The weapon utilizes a 1.5m aperture and radiates at a 1.62 micron wavelength. An adaptive optics reference beam was provided as either a point source located at the target (cooperative) or a projected point source reflected from the target (uncooperative). Performance of the adaptive optics system was then compared between reference sources. Results show that, for operating conditions with a thermal blooming distortion number of 75 and weak turbulence (Rytov of 0.02 and D/r0 of 3), cooperative beacon AO correction experiences Phase Compensation Instability, resulting in lower performance than a simple, open-loop condition. The uncooperative beacon resulted in slightly better performance than the open-loop condition.
Open source EMR software: profiling, insights and hands-on analysis.
Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A
2014-11-01
The use of open source software in health informatics is increasingly advocated by authors in the literature. Although there is no clear evidence of the superiority of the current open source applications in the healthcare field, the number of available open source applications online is growing and they are gaining greater prominence. This repertoire of open source options is of great value for any future planner interested in adopting an electronic medical/health record system, whether selecting an existing application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects. The objective of this study is to provide more comprehensive guidance from an implementer perspective toward the available alternatives of open source healthcare software, particularly in the field of electronic medical/health records. The design of this study is twofold. In the first part, we profile the published literature on a sample of existing and active open source software in the healthcare area. The purpose of this part is to provide a summary of the available guides and studies relative to the sampled systems, and to identify any gaps in the published literature with respect to our research questions. In the second part, we investigate those alternative systems relative to a set of metrics, by actually installing the software and reporting a hands-on experience of the installation process, usability, as well as other factors. The literature covers many aspects of open source software implementation and utilization in healthcare practice.
Roughly, those aspects can be distilled into a basic taxonomy, making the literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit potential implementers of electronic health/medical record systems. The number of examined systems and the measures by which to compare them vary across studies, yet rewarding insights are starting to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field are still far behind the highly acknowledged open source products in other domains, e.g., in operating system market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Inventory of U.S. 2012 dioxin emissions to atmosphere.
Dwyer, Henri; Themelis, Nickolas J
2015-12-01
In 2006, the U.S. EPA published an inventory of dioxin emissions for the U.S. covering the period from 1987-2000. This paper is an updated inventory of all U.S. dioxin emissions to the atmosphere in the year 2012. The sources of emissions of polychlorinated dibenzodioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs), collectively referred to in this paper as "dioxins", were separated into two classes: controlled industrial and open burning sources. Controlled source emissions decreased 95.5% from 14.0 kg TEQ in 1987 to 0.6 kg in 2012. Open burning source emissions increased from 2.3 kg TEQ in 1987 to 2.9 kg in 2012. The 2012 dioxin emissions from 53 U.S. waste-to-energy (WTE) power plants were compiled on the basis of detailed data obtained from the two major U.S. WTE companies, representing 84% of the total MSW combusted (27.4 million metric tons). The dioxin emissions of all U.S. WTE plants in 2012 were 3.4 g TEQ and represented 0.54% of the controlled industrial dioxin emissions, and 0.09% of all dioxin emissions from controlled and open burning sources. Copyright © 2015. Published by Elsevier Ltd.
The open literature, Federal publications, industrial reports, and other sources published between 1975 and 1980 were reviewed for information relevant to personal air samplers potentially useful in sampling organic compounds at ambient levels (50-200 ppt). Seventy one references...
Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy.
Görlitz, Frederik; Kelly, Douglas J; Warren, Sean C; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J; Stuhmeier, Frank; Neil, Mark A A; Tate, Edward W; Dunsby, Christopher; French, Paul M W
2017-01-18
We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set.
Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy
Warren, Sean C.; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A.; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J.; Stuhmeier, Frank; Neil, Mark A. A.; Tate, Edward W.; Dunsby, Christopher; French, Paul M. W.
2017-01-01
We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set. PMID:28190060
Integrating an Awareness of Selfhood and Society into Virtual Learning
ERIC Educational Resources Information Center
Stricker, Andrew, Ed.; Calongne, Cynthia, Ed.; Truman, Barbara, Ed.; Arenas, Fil, Ed.
2017-01-01
Recent technological advances have opened new platforms for learning and teaching. By utilizing virtual spaces, more educational opportunities are created for students who cannot attend a physical classroom environment. "Integrating an Awareness of Selfhood and Society into Virtual Learning" is a pivotal reference source that discusses…
Journal of Open Source Software (JOSS): design and first-year review
NASA Astrophysics Data System (ADS)
Smith, Arfon M.
2018-01-01
JOSS is a free and open-access journal that publishes articles describing research software across all disciplines. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. JOSS published more than 100 articles in its first year, many from the scientific Python ecosystem (including a number of articles related to astronomy and astrophysics). JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative. In this presentation, I describe the motivation, design, and progress of the Journal of Open Source Software (JOSS) and how it compares to other avenues for publishing research software in astronomy.
Reference software implementation for GIFTS ground data processing
NASA Astrophysics Data System (ADS)
Garcia, R. K.; Howell, H. B.; Knuteson, R. O.; Martin, G. D.; Olson, E. R.; Smuga-Otto, M. J.
2006-08-01
Future satellite weather instruments such as high spectral resolution imaging interferometers pose a challenge to the atmospheric science and software development communities due to the immense data volumes they will generate. An open-source, scalable reference software implementation demonstrating the calibration of radiance products from an imaging interferometer, the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS), is presented. This paper covers essential design principles laid out in summary system diagrams, lessons learned during implementation, and preliminary test results from the GIFTS Information Processing System (GIPS) prototype.
Moody, George B; Mark, Roger G; Goldberger, Ary L
2011-01-01
PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
NASA Technical Reports Server (NTRS)
Jones, R. A. (Inventor)
1974-01-01
The square root of the product of the thermophysical properties ρ, c and k, where ρ is density, c is specific heat and k is thermal conductivity, is determined directly on a test specimen such as a wind tunnel model. The test specimen and a reference specimen of known specific heat are positioned at a given distance from a heat source. The specimens are provided with a coating, such as a phase change coating, to visually indicate that a given temperature was reached. A shutter interposed between the heat source and the specimens is opened and a motion picture camera is actuated to provide a time record of the heating step. The temperature of the reference specimen is recorded as a function of time. The heat rate to which both the test and reference specimens were subjected is determined from the temperature time response of the reference specimen by the conventional thin-skin calorimeter equation.
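The abstract does not state the equations; as a hedged reconstruction from standard aeroheating practice, the thin-skin calorimeter relation and the semi-infinite-slab phase-change relation that together yield √(ρck) take the following form (symbols as commonly defined, not taken from the patent text):

```latex
% Thin-skin calorimeter: heat rate from the reference specimen's
% temperature-time slope, with b the skin thickness.
\dot{q} = \rho\, c\, b\, \frac{dT}{dt}
% Semi-infinite-slab solution at constant heat flux: with \dot{q} known
% and t_{pc} the time for the coating to reach its phase-change
% temperature T_{pc} from initial temperature T_i,
\sqrt{\rho c k} = \frac{2\,\dot{q}\,\sqrt{t_{pc}}}{\sqrt{\pi}\,\bigl(T_{pc}-T_i\bigr)}
```

The second relation follows from the surface-temperature solution for a semi-infinite solid under constant heat flux, which is why timing the phase-change front isolates √(ρck) directly.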
PACS for Bhutan: a cost effective open source architecture for emerging countries.
Ratib, Osman; Roduit, Nicolas; Nidup, Dechen; De Geer, Gerard; Rosset, Antoine; Geissbuhler, Antoine
2016-10-01
This paper reports the design and implementation of an innovative and cost-effective imaging management infrastructure suitable for radiology centres in emerging countries. It was implemented in the main referral hospital of Bhutan, equipped with a CT, an MRI, digital radiology, and a suite of several ultrasound units. They lacked the necessary informatics infrastructure for image archiving and interpretation and needed a system for distribution of images to clinical wards. The solution developed for this project combines several open source software platforms in a robust and versatile archiving and communication system connected to analysis workstations equipped with an FDA-certified version of a highly popular open-source software package. The whole system was implemented on standard off-the-shelf hardware. The system was installed in three days, and training of the radiologists as well as the technical and IT staff was provided onsite to ensure full ownership of the system by the local team. Radiologists were rapidly capable of reading and interpreting studies on the diagnostic workstations, which had a significant benefit on their workflow and ability to perform diagnostic tasks more efficiently. Furthermore, images were also made available to several clinical units on standard desktop computers through a web-based viewer. • Open source imaging informatics platforms can provide cost-effective alternatives for PACS • Robust and cost-effective open architecture can provide adequate solutions for emerging countries • Imaging informatics is often lacking in hospitals equipped with digital modalities.
Processor Emulator with Benchmark Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lloyd, G. Scott; Pearce, Roger; Gokhale, Maya
2015-11-13
A processor emulator and a suite of benchmark applications have been developed to assist in characterizing the performance of data-centric workloads on current and future computer architectures. Some of the applications have been collected from other open source projects. For more details on the emulator and an example of its usage, see reference [1].
Introducing Text Analytics as a Graduate Business School Course
ERIC Educational Resources Information Center
Edgington, Theresa M.
2011-01-01
Text analytics refers to the process of analyzing unstructured data from documented sources, including open-ended surveys, blogs, and other types of web dialog. Text analytics has enveloped the concept of text mining, an analysis approach influenced heavily from data mining. While text mining has been covered extensively in various computer…
Expand Reference Resources: Research the Holocaust through the Internet.
ERIC Educational Resources Information Center
Anderson, Judy
1998-01-01
The Internet opens a wide range of possibilities for accessing materials on the Holocaust from both traditional sources and more volatile areas (personal homepages, e-mail, and discussion groups archives). Excerpts from accounts by one Hungarian and one Norwegian political prisoner are included as illustrations of material which may not have been…
Application of Open Source Software by the Lunar Mapping and Modeling Project
NASA Astrophysics Data System (ADS)
Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.
2011-12-01
The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen was dependent on the targeted consumer of a given interface. We will discuss our varying experience using open source products; namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata.
This decision was in part based on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on top of XML. Apache Solr, an open source search engine, was used to drive our search interface and as a way to store references to metadata and data exposed via REST endpoints. As was the case with Apache OODT, there was team experience with this component that helped drive this choice. Lastly, OpenSSO, an open source single sign-on service, was used to secure and provide access constraints to our REST-based services. For this product there was little past experience, but given our service-based approach it seemed a natural fit. Given our exposure to open source we will discuss the tradeoffs and benefits received by the choices made. Moreover, we will dive into the context of how the software packages were used and the impact their design and extensibility had on the construction of the infrastructure. Finally, we will compare our experiences across the open source solutions and the attributes that can shape the impression each one leaves. This comprehensive account of our endeavor should aid others in their assessment and use of open source.
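The "thin REST service layer" idea described above can be sketched in a few lines: interfaces are fixed first, and each open source backend is hidden behind a small dispatcher. All names below are hypothetical illustrations, not LMMP code.

```python
# Sketch of a thin REST routing layer over pluggable backends
# (hypothetical names; the real LMMP services wrap Apache OODT,
# Berkeley DB XML, Solr, etc. behind interfaces like this).

def make_router(backends):
    """Map 'GET /catalog/<id>'-style paths onto backend calls."""
    def route(path):
        parts = path.strip("/").split("/")
        resource, args = parts[0], parts[1:]
        handler = backends.get(resource)
        if handler is None:
            return {"status": 404, "body": "unknown resource"}
        return {"status": 200, "body": handler(*args)}
    return route

# A stand-in "catalog" backend, playing the role a product catalog
# (e.g., one fed by Apache OODT ingestion) would fill.
catalog = {"LRO-001": {"name": "LRO DEM tile", "format": "GeoTIFF"}}
router = make_router({"catalog": lambda pid: catalog.get(pid, {})})

print(router("/catalog/LRO-001")["body"]["name"])  # LRO DEM tile
```

Because consumers only see the route shapes, the storage or search technology behind each resource can be swapped without breaking clients, which is the portability benefit the abstract describes.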
OOSTethys - Open Source Software for the Global Earth Observing Systems of Systems
NASA Astrophysics Data System (ADS)
Bridger, E.; Bermudez, L. E.; Maskey, M.; Rueda, C.; Babin, B. L.; Blair, R.
2009-12-01
An open source software project is much more than just picking the right license, hosting modular code and providing effective documentation. Success in advancing in an open collaborative way requires that the process match the expected code functionality to the developer's personal expertise and organizational needs, as well as having an enthusiastic and responsive core lead group. We will present the lessons learned from OOSTethys, which is a community of software developers and marine scientists who develop open source tools, in multiple languages, to integrate ocean observing systems into an Integrated Ocean Observing System (IOOS). OOSTethys' goal is to dramatically reduce the time it takes to install, adopt and update standards-compliant web services. OOSTethys has developed servers, clients and a registry. Open source Perl, Python, Java and ASP tool kits and reference implementations are helping the marine community publish near real-time observation data in interoperable standard formats. In some cases publishing an Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) from NetCDF files or a database or even CSV text files could take only minutes, depending on the skills of the developer. OOSTethys is also developing an OGC standard registry, Catalog Service for the Web (CSW). This open source CSW registry was implemented to easily register and discover SOSs using ISO 19139 service metadata. A web interface layer over the CSW registry simplifies the registration process by harvesting metadata describing the observations and sensors from the “GetCapabilities” response of the SOS. OpenIOOS is the web client, developed in Perl to visualize the sensors in the SOS services.
While the number of OOSTethys software developers is small, currently about 10 around the world, the number of OOSTethys toolkit implementers is larger and growing and the ease of use has played a large role in spreading the use of interoperable standards compliant web services widely in the marine community.
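The claim that even CSV text files can be published as observations "in minutes" comes down to a small parse-and-serialize step. The following is a minimal sketch of that idea; the element names are simplified illustrations, not the actual OGC SOS/O&M schema.

```python
# Sketch: turn a CSV of observations into a simplified XML observation
# document. Real OOSTethys toolkits emit schema-valid OGC SOS responses;
# the tags here are illustrative placeholders only.
import csv
import io
import xml.etree.ElementTree as ET

def csv_to_observations(text):
    """Parse CSV rows (time,property,value) into an XML collection."""
    root = ET.Element("ObservationCollection")
    for row in csv.DictReader(io.StringIO(text)):
        obs = ET.SubElement(root, "Observation")
        ET.SubElement(obs, "time").text = row["time"]
        ET.SubElement(obs, "property").text = row["property"]
        ET.SubElement(obs, "value").text = row["value"]
    return ET.tostring(root, encoding="unicode")

sample = "time,property,value\n2009-12-01T00:00Z,sea_water_temperature,12.7\n"
xml_doc = csv_to_observations(sample)
```

The remaining effort in a real deployment is mapping such output onto the standard schema and wiring it behind the service's GetObservation operation, which is exactly the step the OOSTethys toolkits automate.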
MetaboLights: An Open-Access Database Repository for Metabolomics Data.
Kale, Namrata S; Haug, Kenneth; Conesa, Pablo; Jayseelan, Kalaivani; Moreno, Pablo; Rocca-Serra, Philippe; Nainala, Venkata Chandrasekhar; Spicer, Rachel A; Williams, Mark; Li, Xuefei; Salek, Reza M; Griffin, Julian L; Steinbeck, Christoph
2016-03-24
MetaboLights is the first general purpose, open-access database repository for cross-platform and cross-species metabolomics research at the European Bioinformatics Institute (EMBL-EBI). Based upon the open-source ISA framework, MetaboLights provides Metabolomics Standard Initiative (MSI) compliant metadata and raw experimental data associated with metabolomics experiments. Users can upload their study datasets into the MetaboLights Repository. These studies are then automatically assigned a stable and unique identifier (e.g., MTBLS1) that can be used for publication reference. The MetaboLights Reference Layer associates metabolites with metabolomics studies in the archive and is extensively annotated with data fields such as structural and chemical information, NMR and MS spectra, target species, metabolic pathways, and reactions. The database is manually curated with no specific release schedules. MetaboLights is also recommended by journals for metabolomics data deposition. This unit provides a guide to using MetaboLights, downloading experimental data, and depositing metabolomics datasets using user-friendly submission tools. Copyright © 2016 John Wiley & Sons, Inc.
Mobile service for open data visualization on geo-based images
NASA Astrophysics Data System (ADS)
Lee, Kiwon; Kim, Kwangseob; Kang, Sanggoo
2015-12-01
Since the early 2010s, governments in most countries have adopted and promoted open data policies and open data platforms. Korea is in the same situation: government and public organizations have operated publicly accessible open data portal systems since 2011, and the number of open datasets and data types has been increasing every year. These trends are even more expandable or extensible in mobile environments. The purpose of this study is to design and implement a mobile application service to visualize variously typed or formatted public open data together with geo-based images on the mobile web. Open data here covers downloadable datasets and openly accessible data application programming interfaces (APIs). Geo-based images are multi-sensor satellite images that are georeferenced and matched with digital map sets. System components for the mobile service are fully based on open sources and open development environments, without any commercial tools: PostgreSQL for the database management system, OTB for remote sensing image processing, GDAL for data conversion, GeoServer for the application server, OpenLayers for mobile web mapping, R for data analysis, and D3.js for web-based data graphic processing. The mobile application on the client side was implemented using HTML5 for cross-browser and cross-platform support. The result shows many advantages, such as linking open data and geo-based data, integrating open data and open source, and demonstrating mobile applications with open data. It is expected that this approach is a cost-effective and process-efficient implementation strategy for intelligent earth observing data.
OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Seyong; Vetter, Jeffrey S
2014-01-01
Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-source OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-12-09
PV_LIB comprises a library of Matlab code for modeling photovoltaic (PV) systems. Included are functions to compute solar position and to estimate irradiance in the PV system's plane of array, cell temperature, PV module electrical output, and conversion from DC to AC power. Also included are functions that aid in determining parameters for module performance models from module characterization testing. PV_LIB is open source code primarily intended for research and academic purposes. All algorithms are documented in openly available literature with the appropriate references included in comments within the code.
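PV_LIB itself is a Matlab library, but the flavor of its solar-position step can be sketched in a few lines of Python. The sketch below uses Cooper's simple declination approximation; it is an assumption for illustration only, not PV_LIB's documented (and more accurate) solar position algorithms.

```python
# Solar declination via Cooper's approximation (1969):
#   delta = 23.45 * sin( 360/365 * (284 + n) degrees )
# where n is the day of the year. Illustrative only; PV_LIB's
# solar-position functions implement more accurate algorithms.
import math

def declination_deg(day_of_year):
    """Approximate solar declination in degrees for day n of the year."""
    angle = math.radians(360.0 / 365.0 * (284 + day_of_year))
    return 23.45 * math.sin(angle)

print(round(declination_deg(172), 1))  # 23.4 (near the June solstice maximum)
```

Declination is one input to the sun elevation/azimuth geometry that, combined with a transposition model, yields the plane-of-array irradiance estimate the abstract mentions.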
Deterministic Design Optimization of Structures in OpenMDAO Framework
NASA Technical Reports Server (NTRS)
Coroneos, Rula M.; Pai, Shantaram S.
2012-01-01
Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components by utilizing the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a NASA GRC developed code. The reliability and efficiency of the OpenMDAO framework are compared and reported.
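The component/driver split described above can be illustrated with a deliberately tiny sketch: a component evaluates the model, and a driver iterates on the design variable until a constraint becomes active. The class and function names are hypothetical, and this is not OpenMDAO's actual API or any of its optimizers.

```python
# Toy component/driver pattern (hypothetical names, not OpenMDAO's API):
# the component evaluates mass and stress for a thickness; the driver
# shrinks thickness until the stress constraint becomes active.

class BeamMass:
    """Component: evaluates a toy structural model."""
    def compute(self, thickness):
        stress = 100.0 / thickness   # toy stress model
        mass = 5.0 * thickness       # toy mass model
        return mass, stress

def driver(component, t, stress_limit=50.0, step=0.01):
    """Driver: reduce thickness while the stress constraint holds."""
    while True:
        _, stress = component.compute(t - step)
        if stress > stress_limit or t - step <= 0:
            return t
        t -= step

t_opt = driver(BeamMass(), t=5.0)  # converges near t = 2.0 (stress = 50)
```

Real OpenMDAO drivers are gradient-based or genetic optimizers rather than this naive line search, but the division of labor, model evaluation in components and iteration strategy in drivers, is the same.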
An innovative use of instant messaging technology to support a library's single-service point.
Horne, Andrea S; Ragon, Bart; Wilson, Daniel T
2012-01-01
A library service model that provides reference and instructional services by summoning reference librarians from a single service point is described. The system utilizes Libraryh3lp, an open-source, multioperator instant messaging system. The selection and refinement of this solution and technical challenges encountered are explored, as is the design of public services around this technology, usage of the system, and best practices. This service model, while a major cultural and procedural change at first, is now a routine aspect of customer service for this library.
Cargo Movement Operations System (CMOS) Software Test Plan. Final
1990-07-26
NO [ ] COMMENT DISPOSITION: COMMENT STATUS: OPEN [ ] CLOSED [ ] ORIGINATOR CONTROL NUMBER: STP-0002 PROGRAM OFFICE CONTROL NUMBER: DATA ITEM DISCREPANCY WORKSHEET CDRL NUMBER: A007-03 DATE: 07/26/90 ORIGINATOR NAME: John J. Brassil OFFICE SYMBOL: SAIC TELEPHONE NUMBER: 272-2999 SUBSTANTIVE: X EDITORIAL: PAGE NUMBER: 63 PARA NUMBER: Table 4.2.1.2 COMMENT OR RECOMMENDED CHANGE: Replace the reference to the Source and Destination STP paragraphs with a reference to the paragraph of the STP which tests the interface itself. RATIONALE: Each internal
RADIOMICS.io | Informatics Technology for Cancer Research (ITCR)
RADIOMICS.io is an open source platform for informatics developments for radiographic phenotyping using automated algorithms, such as engineered features or deep learning technologies. With this platform, we aim to establish a reference standard for radiomic analyses, provide a tested and maintained resource, and grow the community of radiomic developers addressing critical needs in cancer research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George
This paper describes one such reference process that can be deployed to provide continuous, automated, condition-based maintenance management for buildings that have BIM, a building automation system (BAS), and computerized maintenance management system (CMMS) software. The process can be deployed using an open source transactional network platform, VOLTTRON™, designed for distributed sensing and controls, which supports both energy efficiency and grid services.
An Unexpected Ally: Using Microsoft's SharePoint to Create a Departmental Intranet
ERIC Educational Resources Information Center
Dahl, David
2010-01-01
In September 2008, the Albert S. Cook Library at Towson University implemented an intranet to support the various functions of the library's Reference Department. This intranet is called the RefPortal. After exploring open source options and other Web 2.0 tools, the department (under the guidance of the library technology coordinator) chose…
Next-Generation Bibliographic Manager: An Interview with Trevor Owens
ERIC Educational Resources Information Center
Morrison, James L.; Owens, Trevor
2008-01-01
James Morrison's interview with Trevor Owens explores Zotero, a free, open-source bibliographic tool that works as a Firefox plug-in. Previous bibliographic software, such as EndNote or Refworks, worked either online or offline to collect references and citations. Zotero leverages the power of the browser to allow users to work either online or…
TACOM LCMC IB and DMSMS Mitigation
2011-09-26
Laajala, Teemu D; Murtojärvi, Mika; Virkki, Arho; Aittokallio, Tero
2018-06-15
Prognostic models are widely used in clinical decision-making, such as risk stratification and tailoring treatment strategies, with the aim of improving patient outcomes while reducing overall healthcare costs. While prognostic models have been adopted into clinical use, benchmarking their performance has been difficult due to a lack of open clinical datasets. The recent DREAM 9.5 Prostate Cancer Challenge carried out an extensive benchmarking of prognostic models for metastatic Castration-Resistant Prostate Cancer (mCRPC), based on multiple cohorts of open clinical trial data. We make available an open-source implementation of the top-performing model, ePCR, along with an extended toolbox for its further re-use and development, and demonstrate how to best apply the implemented model to real-world data cohorts of advanced prostate cancer patients. The open-source R package ePCR and its reference documentation are available from the Comprehensive R Archive Network (CRAN): https://CRAN.R-project.org/package=ePCR. The R vignette provides step-by-step examples of ePCR usage. Supplementary data are available at Bioinformatics online.
OPAL: An Open-Source MPI-IO Library over Cray XT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Weikuan; Vetter, Jeffrey S; Canon, Richard Shane
Parallel IO over Cray XT is supported by a vendor-supplied MPI-IO package. This package contains a proprietary ADIO implementation built on top of the sysio library. While it is reasonable to maintain a stable code base for application scientists' convenience, it is also very important for system developers and researchers to analyze and assess the effectiveness of parallel IO software and, accordingly, tune and optimize the MPI-IO implementation. A proprietary parallel IO code base relinquishes such flexibility. On the other hand, a generic UFS-based MPI-IO implementation is typically used on many Linux-based platforms. We have developed an open-source MPI-IO package over Lustre, referred to as OPAL (OPportunistic and Adaptive MPI-IO Library over Lustre). OPAL provides a single source-code base for MPI-IO over Lustre on Cray XT and Linux platforms. Compared to the Cray implementation, OPAL provides a number of useful features, including arbitrary specification of striping patterns and Lustre-stripe-aligned file domain partitioning. This paper presents performance comparisons between OPAL and Cray's proprietary implementation. Our evaluation demonstrates that OPAL achieves performance comparable to the Cray implementation. We also exemplify the benefits of an open-source package in revealing the underpinnings of parallel IO performance.
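As a rough illustration of the stripe-aligned file-domain partitioning the OPAL abstract mentions, the sketch below divides a file among I/O aggregators so that every domain boundary falls on a Lustre stripe boundary. This is a minimal Python model of the idea, not OPAL's actual C implementation; the function name and interface are hypothetical.

```python
def stripe_aligned_domains(file_size, stripe_size, n_aggregators):
    """Partition [0, file_size) into contiguous per-aggregator domains
    whose internal boundaries fall on Lustre stripe boundaries, so no
    two aggregators write into the same stripe."""
    n_stripes = -(-file_size // stripe_size)  # ceil division
    base, extra = divmod(n_stripes, n_aggregators)
    domains, offset = [], 0
    for rank in range(n_aggregators):
        stripes = base + (1 if rank < extra else 0)
        end = min(offset + stripes * stripe_size, file_size)
        domains.append((offset, end))
        offset = end
    return domains
```

Aligning domains to stripes avoids lock contention between Lustre object storage targets, which is one reason such partitioning helps collective-write performance.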
Steiner, Andreas; Hella, Jerry; Grüninger, Servan; Mhalu, Grace; Mhimbira, Francis; Cercamondi, Colin I; Doulla, Basra; Maire, Nicolas; Fenner, Lukas
2016-09-01
A software tool is developed to facilitate data entry and to monitor research projects in under-resourced countries in real-time. The eManagement tool "odk_planner" is written in the scripting languages PHP and Python. The odk_planner is lightweight and uses minimal internet resources. It was designed to be used with the open source software Open Data Kit (ODK). The users can easily configure odk_planner to meet their needs, and the online interface displays data collected from ODK forms in a graphically informative way. The odk_planner also allows users to upload pictures and laboratory results and sends text messages automatically. User-defined access rights protect data and privacy. We present examples from four field applications in Tanzania successfully using the eManagement tool: 1) clinical trial; 2) longitudinal Tuberculosis (TB) Cohort Study with a complex visit schedule, where it was used to graphically display missing case report forms, upload digitalized X-rays, and send text message reminders to patients; 3) intervention study to improve TB case detection, carried out at pharmacies: a tablet-based electronic referral system monitored referred patients, and sent automated messages to remind pharmacy clients to visit a TB Clinic; and 4) TB retreatment case monitoring designed to improve drug resistance surveillance: clinicians at four public TB clinics and lab technicians at the TB reference laboratory used a smartphone-based application that tracked sputum samples, and collected clinical and laboratory data. The user friendly, open source odk_planner is a simple, but multi-functional, Web-based eManagement tool with add-ons that helps researchers conduct studies in under-resourced countries. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
43 CFR 2.4 - How do I obtain information routinely available to the public?
Code of Federal Regulations, 2011 CFR
2011-10-01
... 20240 (see appendix A to this part). The Library is open to the public for on-site reference use from 7...) Another source of information is DOI's Library, which contains over one million holdings dealing with a broad range of matters pertaining to the Department's mission. You may wish to visit the Library, which...
43 CFR 2.4 - How do I obtain information routinely available to the public?
Code of Federal Regulations, 2010 CFR
2010-10-01
... 20240 (see appendix A to this part). The Library is open to the public for on-site reference use from 7...) Another source of information is DOI's Library, which contains over one million holdings dealing with a broad range of matters pertaining to the Department's mission. You may wish to visit the Library, which...
ERIC Educational Resources Information Center
Igado, Manuel Fandos
2010-01-01
This work offers some considerations that address the scarcity of research on e-learning as it applies specifically to secondary education. Distance training programmes (both open source and not) are becoming increasingly popular, especially in higher education. However, there are very few cases of…
NASA Technical Reports Server (NTRS)
Ramachandran, Ganesh K.; Akopian, David; Heckler, Gregory W.; Winternitz, Luke B.
2011-01-01
Location technologies have many applications in wireless communications, military and space missions, etc. The US Global Positioning System (GPS) and other existing and emerging Global Navigation Satellite Systems (GNSS) are expected to provide accurate location information to enable such applications. While GNSS systems perform very well in strong-signal conditions, their operation in many urban, indoor, and space applications is not robust, or is even impossible, due to weak signals and strong distortions. The search for less costly, faster and more sensitive receivers is still in progress. As the research community addresses more and more complicated phenomena, there is a demand for flexible multimode reference receivers, associated SDKs, and development platforms that may accelerate and facilitate the research. One such concept is the software GPS/GNSS receiver (GPS SDR), which permits easy access to algorithmic libraries and the possibility of integrating more advanced algorithms without hardware changes or essential software updates. The GNU-SDR and GPS-SDR open-source receiver platforms are popular examples. This paper evaluates the performance of recently proposed block-correlator techniques for acquisition and tracking of GPS signals using the open-source GPS-SDR platform.
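The acquisition stage such receivers perform can be sketched as an FFT-based circular correlation: the received samples are correlated against a local code replica at all code phases at once, and the peak gives the code-phase estimate. The toy below uses a random ±1 sequence in place of a real C/A code and omits the Doppler search a real receiver must also perform; it illustrates the principle, not the paper's block-correlator variant.

```python
import numpy as np

def acquire_code_phase(rx, code):
    """FFT-based circular cross-correlation of the received samples
    against a local code replica; returns the sample offset of the
    strongest peak. rx and code must have the same length."""
    corr = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(code)))
    return int(np.argmax(np.abs(corr)))
```

The FFT approach tests all N code phases in O(N log N) rather than the O(N^2) of a serial search, which is why it is the standard software-receiver acquisition method.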
Application of OpenStreetMap (OSM) to Support the Mapping Village in Indonesia
NASA Astrophysics Data System (ADS)
Swasti Kanthi, Nurin; Hery Purwanto, Taufik
2016-11-01
Geospatial Information is a important thing in this era, because the need for location information is needed to know the condition of a region. In 2015 the Indonesian government release detailed mapping in village level and their Parent maps Indonesian state regulatory standards set forth in Rule form Norm Standards, Procedures and Criteria for Mapping Village (NSPK). Over time Web and Mobile GIS was developed with a wide range of applications. The merger between detailed mapping and Web GIS is still rarely performed and not used optimally. OpenStreetMap (OSM) is a WebGIS which can be utilized as Mobile GIS providing sufficient information to the representative levels of the building and can be used for mapping the village.Mapping Village using OSM was conducted using remote sensing approach and Geographical Information Systems (GIS), which's to interpret remote sensing imagery from OSM. The study was conducted to analyzed how far the role of OSM to support the mapping of the village, it's done by entering the house number data, administrative boundaries, public facilities and land use into OSM with reference data and data image Village Plan. The results of the mapping portion villages in OSM as a reference map-making village and analyzed in accordance with NSPK for detailed mapping Rukun Warga (RW) is part of the village mapping. The use of OSM greatly assists the process of mapping the details of the region with data sources in the form of images and can be accessed for Open Source. But still need their care and updating the data source to maintain the validity of the data.
SCARF: maximizing next-generation EST assemblies for evolutionary and population genomic analyses.
Barker, Michael S; Dlugosch, Katrina M; Reddy, A Chaitanya C; Amyotte, Sarah N; Rieseberg, Loren H
2009-02-15
Scaffolded and Corrected Assembly of Roche 454 (SCARF) is a next-generation sequence assembly tool for evolutionary genomics, designed especially for assembling 454 EST sequences against high-quality reference sequences from related species. The program was created to knit together 454 contigs that do not assemble during traditional de novo assembly, using a reference sequence library to orient the 454 sequences. SCARF is freely available at http://msbarker.com/software.htm and is released under the open source GPLv3 license (http://www.opensource.org/licenses/gpl-3.0.html).
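The reference-guided orientation step described above can be sketched with a toy k-mer comparison: a contig is kept as-is or reverse-complemented, whichever strand shares more k-mers with the reference. SCARF itself uses proper alignment; this simplified stand-in, with hypothetical function names, only illustrates the orientation idea.

```python
def revcomp(s):
    """Reverse complement of a DNA string."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(s))

def kmers(s, k=8):
    """Set of all k-mers of s."""
    return {s[i:i + k] for i in range(len(s) - k + 1)}

def orient_contig(contig, reference, k=8):
    """Return the contig in whichever orientation shares more k-mers
    with the reference sequence (toy stand-in for alignment)."""
    ref = kmers(reference, k)
    fwd = len(kmers(contig, k) & ref)
    rev = len(kmers(revcomp(contig), k) & ref)
    return contig if fwd >= rev else revcomp(contig)
```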
Performance of the High Sensitivity Open Source Multi-GNSS Assisted GNSS Reference Server.
NASA Astrophysics Data System (ADS)
Sarwar, Ali; Rizos, Chris; Glennon, Eamonn
2015-06-01
The Open Source GNSS Reference Server (OSGRS) exploits the GNSS Reference Interface Protocol (GRIP) to provide assistance data to GPS receivers. Assistance can be in terms of signal acquisition and in the processing of the measurement data. The data transfer protocol is based on Extensible Mark-up Language (XML) schema. The first version of the OSGRS required a direct hardware connection to a GPS device to acquire the data necessary to generate the appropriate assistance. Scenarios of interest for the OSGRS users are weak signal strength indoors, obstructed outdoors or heavy multipath environments. This paper describes an improved version of OSGRS that provides alternative assistance support from a number of Global Navigation Satellite Systems (GNSS). The underlying protocol to transfer GNSS assistance data from global casters is the Networked Transport of RTCM (Radio Technical Commission for Maritime Services) over Internet Protocol (NTRIP), and/or the RINEX (Receiver Independent Exchange) format. This expands the assistance and support model of the OSGRS to globally available GNSS data servers connected via internet casters. A variety of formats and versions of RINEX and RTCM streams become available, which strengthens the assistance provisioning capability of the OSGRS platform. The prime motivation for this work was to enhance the system architecture of the OSGRS to take advantage of globally available GNSS data sources. Open source software architectures and assistance models provide acquisition and data processing assistance for GNSS receivers operating in weak signal environments. This paper describes test scenarios to benchmark the OSGRSv2 performance against other Assisted-GNSS solutions. Benchmarking devices include the SPOT satellite messenger, MS-Based & MS-Assisted GNSS, HSGNSS (SiRFstar-III) and Wireless Sensor Networks Assisted-GNSS. 
Benchmarked parameters include the number of tracked satellites, the Time To First Fix (TTFF), navigation availability and accuracy. Three different configurations of multi-GNSS assistance servers were used, namely Cloud-Client-Server, Demilitarized Zone (DMZ) Client-Server and PC-Client-Server, with respect to the connectivity location of client and server. The impact on performance of server and/or client initiation, hardware capability, network latency, processing delay and computation times, together with storage, scalability, processing and load-sharing capabilities, was analysed. The performance of the OSGRS is compared against commercial GNSS, Assisted-GNSS and WSN-enabled GNSS devices. The OSGRS system demonstrated lower TTFF and higher availability.
Sykes, Melissa L.; Jones, Amy J.; Shelper, Todd B.; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E.
2017-01-01
Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection comprises compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. PMID:28674055
Duffy, Sandra; Sykes, Melissa L; Jones, Amy J; Shelper, Todd B; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E; Avery, Vicky M
2017-09-01
Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection comprises compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. Copyright © 2017 Duffy et al.
A method for determining the conversion efficiency of multiple-cell photovoltaic devices
NASA Astrophysics Data System (ADS)
Glatfelter, Troy; Burdick, Joseph
A method for accurately determining the conversion efficiency of any multiple-cell photovoltaic device under any arbitrary reference spectrum is presented. This method makes it possible to obtain not only the short-circuit current, but also the fill factor, the open-circuit voltage, and hence the conversion efficiency of a multiple-cell device under any reference spectrum. Results are presented which allow a comparison of the I-V parameters of two-terminal, two- and three-cell tandem devices measured under a multiple-source simulator with the same parameters measured under different reference spectra. It is determined that the uncertainty in the conversion efficiency of a multiple-cell photovoltaic device obtained with this method is less than +/-3 percent.
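The paper's result can be stated as simple arithmetic: once the short-circuit current, open-circuit voltage and fill factor are known under a given reference spectrum, the conversion efficiency follows as their product divided by the incident power. The helper below just encodes that textbook relation; the function name and example numbers are illustrative, not taken from the paper.

```python
def conversion_efficiency(i_sc, v_oc, fill_factor, p_in):
    """Photovoltaic conversion efficiency:
    eta = (I_sc * V_oc * FF) / P_in,
    with I_sc in amps, V_oc in volts and P_in the incident power in
    watts on the device area under the chosen reference spectrum."""
    return i_sc * v_oc * fill_factor / p_in
```

For example, a tandem device with I_sc = 35 mA, V_oc = 2.3 V and FF = 0.72 under 0.4 W of incident light would have an efficiency of about 14.5%.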
2014-01-01
Background Providing scalable clinical decision support (CDS) across institutions that use different electronic health record (EHR) systems has been a challenge for medical informatics researchers. The lack of commonly shared EHR models and terminology bindings has been recognised as a major barrier to sharing CDS content among different organisations. The openEHR Guideline Definition Language (GDL) expresses CDS content based on openEHR archetypes and can support any clinical terminologies or natural languages. Our aim was to explore in an experimental setting the practicability of GDL and its underlying archetype formalism. A further aim was to report on the artefacts produced by this new technological approach in this particular experiment. We modelled and automatically executed compliance checking rules from clinical practice guidelines for acute stroke care. Methods We extracted rules from the European clinical practice guidelines as well as from treatment contraindications for acute stroke care and represented them using GDL. Then we executed the rules retrospectively on 49 mock patient cases to check the cases’ compliance with the guidelines, and manually validated the execution results. We used openEHR archetypes, GDL rules, the openEHR reference information model, reference terminologies and the Data Archetype Definition Language. We utilised the open-sourced GDL Editor for authoring GDL rules, the international archetype repository for reusing archetypes, the open-sourced Ocean Archetype Editor for authoring or modifying archetypes and the CDS Workbench for executing GDL rules on patient data. Results We successfully represented clinical rules about 14 out of 19 contraindications for thrombolysis and other aspects of acute stroke care with 80 GDL rules. These rules are based on 14 reused international archetypes (one of which was modified), 2 newly created archetypes and 51 terminology bindings (to three terminologies). 
Our manual compliance checks for 49 mock patients were a complete match versus the automated compliance results. Conclusions Shareable guideline knowledge for use in automated retrospective checking of guideline compliance may be achievable using GDL. Whether the same GDL rules can be used for at-the-point-of-care CDS remains unknown. PMID:24886468
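The retrospective compliance checking described above amounts to evaluating rules against patient records. The sketch below shows the shape of such a rule engine in plain Python; the rule names and thresholds are hypothetical stand-ins, not the study's actual GDL rules, which are expressed against openEHR archetypes rather than dictionaries.

```python
# Hypothetical contraindication rules, loosely inspired by thrombolysis
# screening criteria; thresholds here are illustrative only.
RULES = [
    ("age_over_80", lambda p: p["age"] > 80),
    ("recent_major_surgery", lambda p: p["days_since_surgery"] < 14),
    ("glucose_out_of_range",
     lambda p: not 2.7 <= p["glucose_mmol_l"] <= 22.2),
]

def contraindications(patient):
    """Return the names of all rules triggered by a patient record."""
    return [name for name, rule in RULES if rule(patient)]
```

In the GDL setting the same logic is declarative: each rule binds its inputs to archetype paths and terminology codes, which is what makes the rules shareable across EHR systems.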
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, L.R.
Much research has been devoted to measurement of total blood volume (TBV) and cardiac output (CO) in humans, but not enough effort has been devoted to the collection and reduction of results for the purpose of deriving typical or "reference" values. Identification of normal values for TBV and CO is needed not only for clinical evaluations but also for the development of biokinetic models for ultra-short-lived radionuclides used in nuclear medicine (Leggett and Williams 1989). The purpose of this report is to offer reference values for TBV and CO, along with estimates of the associated uncertainties that arise from intra- and inter-subject variation, errors in measurement techniques, and other sources. Reference values are derived for basal supine CO and TBV in reference adult humans, and differences associated with age, sex, body size, body position, exercise, and other circumstances are discussed.
DasPy – Open Source Multivariate Land Data Assimilation Framework with High Performance Computing
NASA Astrophysics Data System (ADS)
Han, Xujun; Li, Xin; Montzka, Carsten; Kollet, Stefan; Vereecken, Harry; Hendricks Franssen, Harrie-Jan
2015-04-01
Data assimilation has become a popular method to integrate observations from multiple sources with land surface models to improve predictions of the water and energy cycles of the soil-vegetation-atmosphere continuum. In recent years, several land data assimilation systems have been developed in different research agencies. Because of the software availability or adaptability, these systems are not easy to apply for the purpose of multivariate land data assimilation research. Multivariate data assimilation refers to the simultaneous assimilation of observation data for multiple model state variables into a simulation model. Our main motivation was to develop an open source multivariate land data assimilation framework (DasPy) which is implemented using the Python script language mixed with C++ and Fortran language. This system has been evaluated in several soil moisture, L-band brightness temperature and land surface temperature assimilation studies. The implementation allows also parameter estimation (soil properties and/or leaf area index) on the basis of the joint state and parameter estimation approach. LETKF (Local Ensemble Transform Kalman Filter) is implemented as the main data assimilation algorithm, and uncertainties in the data assimilation can be represented by perturbed atmospheric forcings, perturbed soil and vegetation properties and model initial conditions. The CLM4.5 (Community Land Model) was integrated as the model operator. The CMEM (Community Microwave Emission Modelling Platform), COSMIC (COsmic-ray Soil Moisture Interaction Code) and the two source formulation were integrated as observation operators for assimilation of L-band passive microwave, cosmic-ray soil moisture probe and land surface temperature measurements, respectively. DasPy is parallelized using the hybrid MPI (Message Passing Interface) and OpenMP (Open Multi-Processing) techniques. 
All input and output data flow is organized efficiently using the commonly used NetCDF file format. Online 1D and 2D visualization of data assimilation results is also implemented to facilitate post-simulation analysis. In summary, DasPy is a ready-to-use, open-source, parallel multivariate land data assimilation framework.
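The core analysis step of ensemble data assimilation can be sketched compactly. The example below implements a stochastic ensemble Kalman filter update, a simpler cousin of the LETKF that DasPy actually uses (the LETKF adds localization and a deterministic transform); function name and interface are illustrative.

```python
import numpy as np

def enkf_update(ens, obs, obs_err_var, H):
    """One stochastic EnKF analysis step.
    ens: (n_state, n_members) ensemble of model states,
    obs: (n_obs,) observation vector, H: (n_obs, n_state) operator."""
    rng = np.random.default_rng(0)
    n = ens.shape[1]
    Hx = H @ ens                                   # ensemble in obs space
    X = ens - ens.mean(axis=1, keepdims=True)      # state anomalies
    HX = Hx - Hx.mean(axis=1, keepdims=True)       # obs-space anomalies
    P_hh = HX @ HX.T / (n - 1) + obs_err_var * np.eye(len(obs))
    K = (X @ HX.T / (n - 1)) @ np.linalg.inv(P_hh)  # Kalman gain
    perturbed = obs[:, None] + rng.normal(0, obs_err_var**0.5,
                                          (len(obs), n))
    return ens + K @ (perturbed - Hx)              # analysis ensemble
```

With an accurate observation and a spread-out prior ensemble, the analysis mean is pulled strongly toward the observation, which is the behaviour the test checks.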
Quality Analysis of Open Street Map Data
NASA Astrophysics Data System (ADS)
Wang, M.; Li, Q.; Hu, Q.; Zhou, M.
2013-05-01
Crowd-sourced geographic data are open geographic data contributed by many non-professionals and provided to the public. Typical crowd-sourced geographic data include GPS track data such as OpenStreetMap, collaborative map data such as Wikimapia, social websites such as Twitter and Facebook, and POIs tagged by Jiepang users. After processing, these data provide canonical geographic information for the public. Compared with conventional geographic data collection and update methods, crowd-sourced geographic data from non-professionals offer large data volume, high currency, rich information and low cost, and have become a research hotspot in international geographic information science in recent years. Large-volume, highly current crowd-sourced geographic data provide a new solution for geospatial database updating, but the quality of data obtained from non-professionals must first be addressed. In this paper, a quality analysis model for OpenStreetMap crowd-sourced geographic data is proposed. First, a quality analysis framework is designed based on an analysis of the characteristics of OSM data. Second, a quality assessment model for OSM data is presented using three quality elements: completeness, thematic accuracy and positional accuracy. Finally, taking the OSM data of Wuhan as an example, the paper analyses and assesses the quality of OSM data against a 2011 navigation map used as reference. The results show that the high-level roads and urban traffic network in the OSM data have high positional accuracy and completeness, so these OSM data can be used for updating urban road network databases.
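Two of the paper's three quality elements reduce to simple aggregate formulas, sketched below: length-based completeness (OSM road length as a fraction of the reference map's) and positional accuracy as an RMS of per-feature offsets. The function names and the exact metric definitions are illustrative assumptions; the paper may define its measures differently.

```python
def completeness(osm_lengths_m, ref_lengths_m):
    """Length-based completeness: total OSM road length divided by
    total reference (navigation-map) road length."""
    return sum(osm_lengths_m) / sum(ref_lengths_m)

def positional_accuracy(offsets_m):
    """Root-mean-square of per-feature positional offsets, in metres
    (lower is better)."""
    return (sum(d * d for d in offsets_m) / len(offsets_m)) ** 0.5
```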
On the seamless, harmonized use of ISO/IEEE11073 and openEHR.
Trigo, Jesús D; Kohl, Christian D; Eguzkiza, Aitor; Martínez-Espronceda, Miguel; Alesanco, Álvaro; Serrano, Luis; García, José; Knaup, Petra
2014-05-01
Standardized exchange of clinical information is a key factor in the provision of high quality health care systems. In this context, the openEHR specification facilitates the management of health data in electronic health records (EHRs), while the ISO/IEEE11073 (also referred to as X73PHD) family of standards provides a reference framework for medical device interoperability. Hospitals and health care providers using openEHR require flawless integration of data coming from external sources, such as X73PHD. Hence, a harmonization process is crucial for achieving a seamless, coherent use of those specifications in real scenarios. Such harmonization is the aim of this paper. Thus, the classes and attributes of a representative number of X73PHD specializations for medical devices--weight, temperature, blood pressure, pulse and heart rate, oximetry, and electrocardiograph--along with the X73PHD core document--ISO/IEEE11073-20601--have been analyzed and mapped to openEHR archetypes. The proposed methodology reuses the existing archetypes when possible and suggests new ones--or appropriate modifications--otherwise. As a result, this paper analyzes the inconsistencies found and the implications thereof in the coordinated use of these two standards. The procedure has also shown how existing standards are able to influence the archetype development process, enhancing the existing archetype corpus.
Tichit, Paul-Henri; Burokur, Shah Nawaz; Qiu, Cheng-Wei; de Lustrac, André
2013-09-27
It has long been conjectured that isotropic radiation by a simple coherent source is impossible due to changes in polarization. Though hypothetical, the isotropic source is usually taken as the reference for determining a radiator's gain and directivity. Here, we demonstrate both theoretically and experimentally that an isotropic radiator can be made of a simple and finite source surrounded by electric-field-driven LC resonator metamaterials designed by space manipulation. As a proof-of-concept demonstration, we show the first isotropic source with omnidirectional radiation from a dipole source (applicable to all distributed sources), which can open up several possibilities in axion electrodynamics, optical illusion, novel transformation-optic devices, wireless communication, and antenna engineering. Owing to the electric-field-driven LC resonator realization scheme, this principle can be readily applied to higher frequency regimes where magnetism is usually not present.
Reference-based phasing using the Haplotype Reference Consortium panel.
Loh, Po-Ru; Danecek, Petr; Palamara, Pier Francesco; Fuchsberger, Christian; A Reshef, Yakir; K Finucane, Hilary; Schoenherr, Sebastian; Forer, Lukas; McCarthy, Shane; Abecasis, Goncalo R; Durbin, Richard; L Price, Alkes
2016-11-01
Haplotype phasing is a fundamental problem in medical and population genetics. Phasing is generally performed via statistical phasing in a genotyped cohort, an approach that can yield high accuracy in very large cohorts but attains lower accuracy in smaller cohorts. Here we instead explore the paradigm of reference-based phasing. We introduce a new phasing algorithm, Eagle2, that attains high accuracy across a broad range of cohort sizes by efficiently leveraging information from large external reference panels (such as the Haplotype Reference Consortium; HRC) using a new data structure based on the positional Burrows-Wheeler transform. We demonstrate that Eagle2 attains a ∼20× speedup and ∼10% increase in accuracy compared to reference-based phasing using SHAPEIT2. On European-ancestry samples, Eagle2 with the HRC panel achieves >2× the accuracy of 1000 Genomes-based phasing. Eagle2 is open source and freely available for HRC-based phasing via the Sanger Imputation Service and the Michigan Imputation Server.
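The positional Burrows-Wheeler transform (PBWT) underlying Eagle2 can be sketched in a few lines: at each site, the haplotype ordering is refined by a stable partition on the current allele, so that haplotypes sharing long prefixes (and hence long matches) end up adjacent. This toy version keeps only the orderings and omits the divergence arrays and all of Eagle2's phasing machinery; names are illustrative.

```python
def pbwt_orders(haplotypes):
    """For each site k, return the haplotype indices sorted by their
    prefixes up to and including site k (stable partition per site).
    Adjacency in this ordering is what reference-based phasing exploits
    to find long haplotype matches quickly."""
    n_sites = len(haplotypes[0])
    order = list(range(len(haplotypes)))
    orders = []
    for k in range(n_sites):
        zeros = [i for i in order if haplotypes[i][k] == 0]
        ones = [i for i in order if haplotypes[i][k] == 1]
        order = zeros + ones  # stable partition on allele at site k
        orders.append(list(order))
    return orders
```

Because each site's update is a linear-time stable partition, the whole transform runs in O(sites × haplotypes), which is what makes panels with tens of thousands of reference haplotypes (such as the HRC) tractable.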
Open-tube diffusion techniques for InP/InGaAs heterojunction bipolar transistors
NASA Astrophysics Data System (ADS)
Schuitemaker, P.; Houston, P. A.
1986-11-01
Open-tube diffusion techniques used between 450 and 600 °C are described which involve the supply of diffusant from a vapour source (via a solution) and a solid evaporated metal source. Investigations of Zn diffusion into InP and InGaAs(P) have been undertaken using both sources. SIMS profile analyses show that in the case of the vapour source the profiles indicate a concentration-dependent diffusion coefficient, while the solid-source diffusions can be well described by a Gaussian-type profile. The usefulness of the vapour source method has been demonstrated in the fabrication of bipolar transistors which exhibit good d.c. characteristics. The solid source method is limited by the slow diffusion velocity and more gradual profile. The InGaAs(P)/InP materials system has important applications in optical communications and future high-speed microwave and switching devices. Useful technologies, allied to the introduction of impurities into Si by diffusion, have gradually been emerging for use in the III-V semiconductor family. Closed-tube systems [1] have been used in order to contain the volatile group V species and prevent surface erosion. In addition, simpler open-tube systems [2,3] have been developed that maintain a sufficient overpressure of the group V element. Zn and Cd p-dopants have been studied extensively because of their volatility and relatively large diffusion rates in III-V semiconductors. Open-tube diffusion into both InP and InGaAs [2-6] has been studied, but little detail has appeared concerning InGaAs and InGaAsP. In this paper we describe a comprehensive study of the diffusion of Zn into InP and InGaAs(P) using both an open-tube vapour source and a Au/Zn/Au evaporated solid source, with SiNx acting both as a mask and as an encapsulant to prevent loss of Zn and decomposition of the substrate material. The techniques have been successfully applied to the fabrication of InP/InGaAs heterojunction bipolar transistors which show good dc characteristics.
Reference to InGaAs in the text implies the InP lattice-matched composition In0.53Ga0.47As.
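The Gaussian-type profile reported for the limited (evaporated solid) source is the standard limited-source diffusion solution, C(x, t) = C0 exp(-x^2 / 4Dt). The helper below evaluates it; the numeric values in the example are illustrative, not measurements from the paper.

```python
import math

def gaussian_profile(x_cm, c0, d_cm2_s, t_s):
    """Limited-source (Gaussian) diffusion profile:
    C(x, t) = C0 * exp(-x^2 / (4 D t)),
    with depth x in cm, surface concentration C0, diffusion
    coefficient D in cm^2/s and time t in seconds."""
    return c0 * math.exp(-x_cm ** 2 / (4.0 * d_cm2_s * t_s))
```

At the characteristic depth x = sqrt(4Dt) the concentration has fallen to C0/e, a convenient check on any implementation.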
Young, Matthew M; Dubeau, Chad; Corazza, Ornella
2015-01-01
Objective: To determine the feasibility and utility of using media reports and other open-source information collected by the Global Public Health Intelligence Network (GPHIN), an event-based surveillance system operated by the Public Health Agency of Canada, to rapidly detect clusters of adverse drug events associated with ‘novel psychoactive substances’ (NPS) at the international level. Methods and Results: Researchers searched English media reports collected by the GPHIN between 1997 and 2013 for references to synthetic cannabinoids. They screened the resulting reports for relevance and content (i.e., reports of morbidity and arrest), then plotted the results and compared them with other available indicators (e.g., US poison control center exposures). The pattern of results from the analysis of GPHIN reports resembled the pattern seen in the other indicators. Conclusions: The results of this study indicate that using media and other open-source information can help monitor the presence, usage, local policy, law enforcement responses, and spread of NPS in a rapid, effective way. Further, modifying GPHIN to actively track NPS would be relatively inexpensive to implement and would be highly complementary to current national and international monitoring efforts. © 2015 The Authors. Human Psychopharmacology: Clinical and Experimental published by John Wiley & Sons, Ltd. PMID:26216568
NASA Astrophysics Data System (ADS)
Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, William; Bauer, Greg; Bates, Brian; Williamson, Cathleen
2017-04-01
Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter, commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2m posting elevation model of the Arctic and an 8m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data, and the management strategies the team needed to solve in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.
Firefox add-ons for medical reference.
Hoy, Matthew B
2010-07-01
Firefox is a Web browser created by the Mozilla project, an open-source software group. Features of the browser include automated updates, advanced security and standards compliance, and the ability to add functionality through add-ons and extensions. First introduced in 2004, Firefox now accounts for roughly 30% of the browser market. This article will focus primarily on add-ons and extensions available for the browser that are useful to medical researchers.
Smith, Daniel G A; Burns, Lori A; Sirianni, Dominic A; Nascimento, Daniel R; Kumar, Ashutosh; James, Andrew M; Schriber, Jeffrey B; Zhang, Tianyuan; Zhang, Boyi; Abbott, Adam S; Berquist, Eric J; Lechner, Marvin H; Cunha, Leonardo A; Heide, Alexander G; Waldrop, Jonathan M; Takeshita, Tyler Y; Alenaizan, Asem; Neuhauser, Daniel; King, Rollin A; Simmonett, Andrew C; Turney, Justin M; Schaefer, Henry F; Evangelista, Francesco A; DePrince, A Eugene; Crawford, T Daniel; Patkowski, Konrad; Sherrill, C David
2018-06-11
Psi4NumPy demonstrates the use of efficient computational kernels from the open-source Psi4 program through the popular NumPy library for linear algebra in Python to facilitate the rapid development of clear, understandable Python computer code for new quantum chemical methods, while maintaining a relatively low execution time. Using these tools, reference implementations have been created for a number of methods, including self-consistent field (SCF), SCF response, many-body perturbation theory, coupled-cluster theory, configuration interaction, and symmetry-adapted perturbation theory. Furthermore, several reference codes have been integrated into Jupyter notebooks, allowing background, underlying theory, and formula information to be associated with the implementation. Psi4NumPy tools and associated reference implementations can lower the barrier for future development of quantum chemistry methods. These implementations also demonstrate the power of the hybrid C++/Python programming approach employed by the Psi4 program.
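The style of reference implementation described above can be illustrated with the core step of the SCF method: solving the generalized eigenproblem FC = SCε with plain NumPy. The Fock and overlap matrices below are toy placeholders, not real integrals from Psi4, and the code is a sketch of the approach rather than actual Psi4NumPy code.

```python
import numpy as np

# Toy symmetric "Fock" and overlap matrices (placeholders, not real integrals).
F = np.array([[-1.0, 0.2], [0.2, -0.5]])
S = np.array([[1.0, 0.3], [0.3, 1.0]])
n_occ = 1  # doubly-occupied orbitals in this toy closed-shell example

# Symmetric (Loewdin) orthogonalization: X = S^(-1/2)
s_vals, s_vecs = np.linalg.eigh(S)
X = s_vecs @ np.diag(s_vals ** -0.5) @ s_vecs.T

# Solve FC = SCe by diagonalizing the transformed Fock matrix
eps, C_prime = np.linalg.eigh(X.T @ F @ X)
C = X @ C_prime

# Closed-shell density matrix from the occupied orbitals
C_occ = C[:, :n_occ]
D = 2.0 * C_occ @ C_occ.T

# Sanity check: Tr(D S) equals the number of electrons (2 * n_occ)
n_electrons = np.trace(D @ S)
```

In a full SCF loop, F would be rebuilt from D at each iteration until self-consistency; the point here is only that the linear-algebra kernel reads almost like the textbook equations.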
Integrated genome browser: visual analytics platform for genomics.
Freese, Nowlan H; Norris, David C; Loraine, Ann E
2016-07-15
Genome browsers that support fast navigation through vast datasets and provide interactive visual analytics functions can help scientists achieve deeper insight into biological systems. Toward this end, we developed Integrated Genome Browser (IGB), a highly configurable, interactive and fast open source desktop genome browser. Here we describe multiple updates to IGB, including all-new capabilities to display and interact with data from high-throughput sequencing experiments. To demonstrate, we describe example visualizations and analyses of datasets from RNA-Seq, ChIP-Seq and bisulfite sequencing experiments. Understanding results from genome-scale experiments requires viewing the data in the context of reference genome annotations and other related datasets. To facilitate this, we enhanced IGB's ability to consume data from diverse sources, including Galaxy, Distributed Annotation and IGB-specific Quickload servers. To support future visualization needs as new genome-scale assays enter wide use, we transformed the IGB codebase into a modular, extensible platform for developers to create and deploy all-new visualizations of genomic data. IGB is open source and is freely available from http://bioviz.org/igb. Contact: aloraine@uncc.edu. © The Author 2016. Published by Oxford University Press.
PLUME-FEATHER, Referencing and Finding Software for Research and Education
NASA Astrophysics Data System (ADS)
Bénassy, O.; Caron, C.; Ferret-Canape, C.; Cheylus, A.; Courcelle, E.; Dantec, C.; Dayre, P.; Dostes, T.; Durand, A.; Facq, A.; Gambini, G.; Geahchan, E.; Helft, C.; Hoffmann, D.; Ingarao, M.; Joly, P.; Kieffer, J.; Larré, J.-M.; Libes, M.; Morris, F.; Parmentier, H.; Pérochon, L.; Porte, O.; Romier, G.; Rousse, D.; Tournoy, R.; Valeins, H.
2014-06-01
PLUME-FEATHER is a non-profit project created to Promote economicaL, Useful and Maintained softwarE For the Higher Education And THE Research communities. The site references software, mainly Free/Libre Open Source Software (FLOSS) from French universities and national research organisations (CNRS, INRA...), laboratories or departments, as well as other FLOSS software used and evaluated by users within these institutions. Each software package is represented by a reference card, which describes origin, aim, installation, cost (if applicable) and user experience, from the point of view of an academic user for academic users. Presently over 1000 programs are referenced on PLUME by more than 900 contributors. Although the server is maintained by a French institution, it is open to international contributions in the academic domain. All contained and validated contents are visible to the anonymous public, whereas registered users (presently more than 2000) can contribute, from comments on single software reference cards up to helping with the organisation and presentation of the referenced software products. The project was presented to the HEP community for the first time in 2012 [1]. This is an update of the status and a call for (further) contributions.
Genetically improved BarraCUDA.
Langdon, W B; Lam, Brian Yee Hong
2017-01-01
BarraCUDA is an open source C program which uses the BWA algorithm in parallel with nVidia CUDA to align short next generation DNA sequences against a reference genome. Recently its source code was optimised using "Genetic Improvement". The genetically improved (GI) code is up to three times faster on short paired end reads from The 1000 Genomes Project and 60% more accurate on a short BioPlanet.com GCAT alignment benchmark. GPGPU BarraCUDA running on a single K80 Tesla GPU can align short paired end nextGen sequences up to ten times faster than bwa on a 12 core server. The speed up was such that the GI version was adopted and has been regularly downloaded from SourceForge for more than 12 months.
Integrated Robust Open-Set Speaker Identification System (IROSIS)
2012-05-01
…scenarios are referred to as VB-YB, VL-YL, VB-YL and VL-YB respectively (Table 1 details the NIST data used for training and testing). …M is the UBM supervector, and the difference between L(m) and Q(M, m) is the Kullback-Leibler divergence between the "alignment" of the…
Battlespace Awareness: Heterogeneous Sensor Maps of Large Scale, Complex Environments
2017-06-13
…reference frames enable a system designer to describe the position of any sensor or platform at any point in time. This section introduces the… …analysis to evaluate the quality of reconstructions created by our algorithms. CloudCompare is an open-source tool designed for this purpose [65]. In… …structure of the data. The data term seeks to keep the proposed solution (u) similar to the originally observed values (f). A system designer must…
OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.
Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A
2016-01-01
High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.
OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid
Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.
2016-01-01
High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617
NASA Astrophysics Data System (ADS)
Penasa, Luca; Franceschi, Marco; Preto, Nereo; Girardeau-Montaut, Daniel
2015-04-01
Three-dimensional Virtual Outcrop Models (VOMs), often produced by terrestrial laser scanning or photogrammetry, have become popular in the Geosciences. The main feature of a VOM is that it allows quantification of the 3D geometry and/or distribution of geologic features, ranging from rock properties to structural elements. This largely explains the interest in VOMs shown by the oil and gas industry. The potential of VOMs in stratigraphy, however, does not yet seem fully realized. Outcrops are the primary sources of data for a number of stratigraphic studies (e.g. palaeontology, sedimentology, cyclostratigraphy, geochemistry). All observations are typically reported on stratigraphic logs, which constitute an idealized representation of the stratigraphic series, drawn by the researcher on the basis of the features to be highlighted. The observations are localized by means of manual measurements, and a certain amount of subjectivity in log drawing is involved. These facts can prevent the log from being properly pinned to the real outcrop. Moreover, the integration of stratigraphic logs made by different researchers studying the same outcrop may be difficult. The exposure conditions of outcrops can change through time, to the point that they can become inaccessible or even be destroyed. In such a case, linking the stratigraphic log to its physical counterpart becomes impossible. This can be particularly relevant when a classical outcrop or even a GSSP is considered. A VOM may prove useful for tackling these issues, by providing a more objective stratigraphic reference for measurements and by preserving an outcrop through time as a visual representation, thus permitting reference and accurate comparison between observations made through time. Finally, a VOM itself may contain relevant stratigraphic information (e.g.
scalar fields associated with the point cloud, such as intensity, RGB data or hyperspectral information from passive remote sensing devices). This information needs to be merged with geological data collected in the field in a consistent and reproducible way. We present Vombat, an open-source proof-of-concept software package illustrating some of the possibilities for storage, visualization and exploitation of outcrop stratigraphic information. Our solution integrates with CloudCompare, a software package for visualizing and editing point clouds. A dedicated algorithm estimates stratigraphic attitudes from point cloud data, without the need for exposed planar bedding surfaces. These attitudes can be used to define a virtual stratigraphic section. Composite sections can then be realized by defining stratigraphic constraints between different reference frames. Any observation can be displayed in a stratigraphic framework that is directly generated from a VOM. The virtual outcrop, the samples and the stratigraphic reference frames can be saved to an XML file. In the future, the adoption of a standard format (e.g. GeoSciML) will permit easier exchange of stratigraphic data among researchers. The software constitutes a first step towards the full exploitation of VOMs in stratigraphy, is hosted at http://github.com/luca-penasa/vombat and is open source. Comments and suggestions are most welcome and will help focus and refine the software and its tools.
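A common building block for estimating bedding attitudes from point-cloud data (not necessarily the exact algorithm Vombat uses) is a least-squares plane fit, whose normal vector yields dip and dip direction. A minimal sketch with synthetic data:

```python
import numpy as np

def fit_plane_normal(points):
    """Least-squares plane fit: the normal is the right singular vector of the
    centered point cloud associated with the smallest singular value."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    # Orient the normal upward so dip / dip-direction are well defined
    if normal[2] < 0:
        normal = -normal
    return normal / np.linalg.norm(normal)

# Synthetic bedding plane z = 0.2*x - 0.1*y with small measurement noise
rng = np.random.default_rng(0)
x, y = rng.uniform(-5, 5, 200), rng.uniform(-5, 5, 200)
z = 0.2 * x - 0.1 * y + rng.normal(0, 0.01, 200)
n = fit_plane_normal(np.column_stack([x, y, z]))
```

Because the fit uses the full scatter of points rather than a single picked surface, it works even when no planar bedding surface is exposed, which matches the capability claimed above.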
Code of Federal Regulations, 2010 CFR
2010-07-01
... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and... CATEGORIES National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and...
The Calibration Reference Data System
NASA Astrophysics Data System (ADS)
Greenfield, P.; Miller, T.
2016-07-01
We describe a software architecture and implementation for using rules to determine which calibration files are appropriate for calibrating a given observation. This new system, the Calibration Reference Data System (CRDS), replaces what had been previously used for the Hubble Space Telescope (HST) calibration pipelines, the Calibration Database System (CDBS). CRDS will be used for the James Webb Space Telescope (JWST) calibration pipelines, and is currently being used for HST calibration pipelines. CRDS can be easily generalized for use in similar applications that need a rules-based system for selecting the appropriate item for a given dataset; we give some examples of such generalizations that will likely be used for JWST. The core functionality of the Calibration Reference Data System is available under an Open Source license. CRDS is briefly contrasted with a sampling of other similar systems used at other observatories.
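The core idea of such a rules-based system can be sketched as follows. The matcher below is hypothetical (the rule keys, file names and "most specific match wins" policy are invented for illustration, not taken from CRDS): each rule maps metadata keys to required values, "*" matches anything, and the rule matching the most non-wildcard keys is selected.

```python
def select_reference(rules, metadata):
    """Return the reference file of the most specific matching rule.
    Each rule maps metadata keys to required values; '*' matches anything.
    Specificity = number of non-wildcard keys matched."""
    best, best_score = None, -1
    for criteria, ref_file in rules:
        score = 0
        for key, required in criteria.items():
            if required == "*":
                continue  # wildcard: matches any value
            if metadata.get(key) != required:
                break     # this rule does not apply
            score += 1
        else:  # all criteria matched
            if score > best_score:
                best, best_score = ref_file, score
    return best

# Hypothetical rules table (illustrative names only)
rules = [
    ({"instrument": "WFC3", "detector": "*"}, "wfc3_generic_flat.fits"),
    ({"instrument": "WFC3", "detector": "UVIS"}, "wfc3_uvis_flat.fits"),
]
ref = select_reference(rules, {"instrument": "WFC3", "detector": "UVIS"})
```

Here the detector-specific rule outscores the wildcard rule, so the UVIS flat is chosen; a dataset with any other detector falls back to the generic file.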
Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de
2012-10-01
Medical imaging was standardized in 1993 through the DICOM (Digital Imaging and Communications in Medicine) standard. Many examinations use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, such applications are usually neither free nor open-source, which hinders their adaptation to diverse interests. Our objective was to develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 examinations randomly selected from a database. We carried out 600 evaluations divided between two observers using ImageLab and another software application sold with Philips Brilliance computed tomography scanners, assessing coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreement, we used simple agreement and kappa statistics. The agreement observed between the software applications was generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography examinations, especially in patients without lesions, with lesions < 50% in the LMCA and < 70% in the ADA. The agreement for lesions > 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
An open-source textbook for teaching climate-related risk analysis using the R computing environment
NASA Astrophysics Data System (ADS)
Applegate, P. J.; Keller, K.
2015-12-01
Greenhouse gas emissions lead to increased surface air temperatures and sea level rise. In turn, sea level rise increases the risks of flooding for people living near the world's coastlines. Our own research on assessing sea level rise-related risks emphasizes both Earth science and statistics. At the same time, the free, open-source computing environment R is growing in popularity among statisticians and scientists due to its flexibility and graphics capabilities, as well as its large library of existing functions. We have developed a set of laboratory exercises that introduce students to the Earth science and statistical concepts needed for assessing the risks presented by climate change, particularly sea-level rise. These exercises will be published as a free, open-source textbook on the Web. Each exercise begins with a description of the Earth science and/or statistical concepts that the exercise teaches, with references to key journal articles where appropriate. Next, students are asked to examine in detail a piece of existing R code, and the exercise text provides a clear explanation of how the code works. Finally, students are asked to modify the existing code to produce a well-defined outcome. We discuss our experiences in developing the exercises over two separate semesters at Penn State, plus using R Markdown to interweave explanatory text with sample code and figures in the textbook.
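Although the textbook's exercises are written in R, the flavor of such an exercise (fit a trend to sea-level data, attach an uncertainty, extrapolate, and discuss the limits of the extrapolation) can be sketched here in Python. The 3 mm/yr rate and noise level below are invented for illustration, not real observations.

```python
import numpy as np

# Synthetic annual sea-level anomalies: 3 mm/yr trend plus noise (illustrative)
rng = np.random.default_rng(1)
years = np.arange(1900, 2021)
sea_level_mm = 3.0 * (years - 1900) + rng.normal(0, 5.0, years.size)

# Least-squares linear trend
slope, intercept = np.polyfit(years, sea_level_mm, 1)

# Extrapolate to 2100; a deliberately naive projection, since real projections
# must account for accelerating ice-sheet contributions
projection_2100 = slope * 2100 + intercept

# Residual bootstrap to attach a rough uncertainty to the trend estimate
fitted = slope * years + intercept
residuals = sea_level_mm - fitted
slopes = []
for _ in range(500):
    resampled = fitted + rng.choice(residuals, residuals.size)
    slopes.append(np.polyfit(years, resampled, 1)[0])
slope_sd = np.std(slopes)
```

The pedagogical pattern matches the one described above: students first read working code like this, then modify it, e.g. to add an acceleration term or a different resampling scheme.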
NASA Astrophysics Data System (ADS)
Fulker, D. W.; Gallagher, J. H. R.
2015-12-01
OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with the solutions listed in the (PA001) session description (federation, rigid standards, and brokering/mediation), the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation.
In essence, provision of engineering expertise, via contracts and grants, is the economic engine. Hence sustainability, as needed to address global grand challenges in geoscience, depends on agencies' and others' abilities and willingness to offer grants and let contracts for continually upgrading open-source software from OPeNDAP and others.
NASA Astrophysics Data System (ADS)
De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel
2015-04-01
Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore important to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented towards developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models such as the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number: 312972) (RASOR: www.rasor-project.eu, grant number: 606888)
Gamma-sky.net: Portal to the gamma-ray sky
NASA Astrophysics Data System (ADS)
Voruganti, Arjun; Deil, Christoph; Donath, Axel; King, Johannes
2017-01-01
http://gamma-sky.net is a novel interactive website designed for exploring the gamma-ray sky. The Map View portion of the site is powered by the Aladin Lite sky atlas, providing a scalable survey image tessellated onto a three-dimensional sphere. The map allows for interactive pan and zoom navigation as well as search queries by sky position or object name. The default image overlay shows the gamma-ray sky observed by the Fermi-LAT gamma-ray space telescope. Other survey images (e.g. Planck microwave images in low/high frequency bands, the ROSAT X-ray image) are available for comparison with the gamma-ray data. Sources from major gamma-ray source catalogs of interest (Fermi-LAT 2FHL, 3FGL and a TeV source catalog) are overlaid on the sky map as markers. Clicking on a given source shows basic information in a popup, and detailed pages for every source are available via the Catalog View component of the website, including information such as source classification, spectrum and light-curve plots, and literature references. We intend gamma-sky.net to be useful both to professional astronomers and to the general public. The website was started in early June 2016 and is being developed as an open-source, open-data project on GitHub (https://github.com/gammapy/gamma-sky). We plan to extend it to display more gamma-ray and multi-wavelength data. Feedback and contributions are very welcome!
Open Data and Open Science for better Research in the Geo and Space Domain
NASA Astrophysics Data System (ADS)
Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.
2015-12-01
Main open data principles were worked out in the run-up to, and finally adopted in, the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland, in June 2013. Important principles also apply to science data, such as Open Data by Default; Quality and Quantity; Usable by All; Releasing Data for Improved Governance; and Releasing Data for Innovation. There is also an explicit relationship to high-value areas such as earth observation, education and geospatial data. The European Union implementation plan of the Open Data Charter identifies, among other things, objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and, where appropriate, reconciliation across different data sources, implementing software solutions allowing easy management, publication or visualization of datasets, and simplifying clearance of intellectual property rights. Open Science is not just a list of long-established principles but stands for a range of initiatives and projects around better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and collection of data, availability and reuse of scientific data, public accessibility of scientific communication, and the use of social media to facilitate scientific collaboration. Some projects concentrate on open sharing of free and open source software, and even hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and about an open peer review process are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects will be presented here.
The semantic Web based approach for the mashup focuses on the design and implementation of a common but still distributed data catalog based on semantic interoperability, including transparent access to data in relational databases. References: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/207772/Open_Data_Charter.pdf ; http://www.openscience.org/blog/wp-content/uploads/2013/06/OpenSciencePoster.pdf
Optical 3-Way Handshake (O3WHS) Protocol Simulation in OMNeT++
2017-06-01
Responsible person: Vinod K Mishra. …popular program called OMNeT++ for that purpose. It is an open-source discrete event simulator tool written in the C++ language. It has been chiefly… References: 1. Von Lehmen A, Doverspike R, Clapp G, Freimuth DM, Gannett J, Kolarov A, Kobrinski H, Makaya C, Mavrogiorgis E, Pastor J, Rauch M
Hall, Stephen H.
1996-01-01
The present invention is a reference half-cell electrode wherein intermingling of test fluid with reference fluid does not affect the performance of the reference half-cell over a long time. This intermingling reference half-cell may be used as a single or double junction submersible or surface reference electrode. The intermingling reference half-cell relies on a capillary tube having a first end open to reference fluid and a second end open to test fluid wherein the small diameter of the capillary tube limits free motion of fluid within the capillary to diffusion. The electrode is placed near the first end of the capillary in contact with the reference fluid. The method of operation of the present invention begins with filling the capillary tube with a reference solution. After closing the first end of the capillary, the capillary tube may be fully submerged or partially submerged with the second open end inserted into test fluid. Since the electrode is placed near the first end of the capillary, and since the test fluid may intermingle with the reference fluid through the second open end only by diffusion, this intermingling capillary reference half-cell provides a stable voltage potential for long time periods.
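The half-cell works because transport along the capillary is restricted to diffusion, which is slow over centimeter scales. A back-of-envelope estimate of the intermingling timescale uses the characteristic diffusion time t ~ L²/(2D); the capillary length and the aqueous ion diffusivity below are generic assumptions for illustration, not values from the patent.

```python
# Rough diffusion timescale along a capillary: t ~ L^2 / (2 * D).
# Both numbers are assumptions, not values from the patent:
L = 0.05          # capillary length in m (5 cm, illustrative)
D = 2.0e-9        # m^2/s, typical diffusivity of small ions in water

t_seconds = L ** 2 / (2.0 * D)
t_days = t_seconds / 86400.0
# On this estimate, significant intermingling over a 5 cm capillary takes on
# the order of a week, consistent with the claim that the reference potential
# stays stable for long periods.
```

The quadratic dependence on L also shows why even a modestly longer or narrower capillary extends the stable lifetime considerably.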
Hall, S.H.
1996-02-13
The present invention is a reference half-cell electrode wherein intermingling of test fluid with reference fluid does not affect the performance of the reference half-cell over a long time. This intermingling reference half-cell may be used as a single or double junction submersible or surface reference electrode. The intermingling reference half-cell relies on a capillary tube having a first end open to reference fluid and a second end open to test fluid wherein the small diameter of the capillary tube limits free motion of fluid within the capillary to diffusion. The electrode is placed near the first end of the capillary in contact with the reference fluid. The method of operation of the present invention begins with filling the capillary tube with a reference solution. After closing the first end of the capillary, the capillary tube may be fully submerged or partially submerged with the second open end inserted into test fluid. Since the electrode is placed near the first end of the capillary, and since the test fluid may intermingle with the reference fluid through the second open end only by diffusion, this intermingling capillary reference half-cell provides a stable voltage potential for long time periods. 11 figs.
Fisher Matrix Preloaded — FISHER4CAST
NASA Astrophysics Data System (ADS)
Bassett, Bruce A.; Fantaye, Yabebal; Hlozek, Renée; Kotze, Jacques
The Fisher Matrix is the backbone of modern cosmological forecasting. We describe the Fisher4Cast software: a general-purpose, easy-to-use Fisher Matrix framework. It is open source, rigorously designed and tested, and includes a Graphical User Interface (GUI) with automated LaTeX file creation capability and point-and-click Fisher ellipse generation. Fisher4Cast was designed for ease of extension and, although written in Matlab, is easily portable to open-source alternatives such as Octave and Scilab. Here we use Fisher4Cast to present new 3D and 4D visualizations of the forecasting landscape and to investigate the effects of growth and curvature on future cosmological surveys. Early releases have been available since mid-2008. The current release of the code is Version 2.2, which is described here. For ease of reference, a Quick Start guide and the code used to produce the figures in this paper are included, in the hope that they will be useful to the cosmology and wider scientific communities.
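The mechanics behind the point-and-click ellipse generation mentioned above are standard: the forecast parameter covariance is the inverse of the Fisher matrix, and the 1-sigma error ellipse follows from its eigendecomposition. A minimal sketch (the matrix values are invented, not from any real survey):

```python
import numpy as np

# Illustrative 2-parameter Fisher matrix (not from any real survey)
F = np.array([[40.0, 10.0],
              [10.0, 20.0]])

# Forecast parameter covariance is the inverse of the Fisher matrix
cov = np.linalg.inv(F)

# 1-sigma ellipse: semi-axes are the square roots of the covariance
# eigenvalues; the orientation comes from the corresponding eigenvectors
eigvals, eigvecs = np.linalg.eigh(cov)
semi_axes = np.sqrt(eigvals)
angle_rad = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])  # major-axis angle

# Marginalized 1-sigma errors on each parameter: sqrt of the diagonal
sigma_1, sigma_2 = np.sqrt(np.diag(cov))
```

Scaling the semi-axes by the appropriate chi-square factor then gives the familiar 68% or 95% confidence contours drawn in forecast papers.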
A Stigmergy Collaboration Approach in the Open Source Software Developer Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Xiaohui; Pullum, Laura L; Treadwell, Jim N
2009-01-01
The communication model of some self-organized online communities is significantly different from that of traditional social-network-based communities. Because these communities lack peer-to-peer connections, it is problematic to use social network analysis to analyze their collaboration structure and emergent behaviors. Stigmergy theory provides an explanation of the collaboration model of these communities. In this research, we present a stigmergy approach for building an agent-based simulation of the collaboration model in the open source software (OSS) developer community. We used a group of actors who collaborate on OSS projects through forums as our frame of reference and investigated how the choices actors make in contributing their work on the projects determine the global status of the whole OSS project. In our simulation, forum posts serve as the digital pheromone, and a modified Pierre-Paul Grasse pheromone model is used to compute each developer agent's behavior-selection probability.
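The pheromone mechanism described above can be sketched generically: each project's forum posts act as a pheromone trail, agents pick a project with probability increasing in trail strength, and trails evaporate over time. The response exponent, deposit amount, and evaporation rate below are assumptions for illustration; the paper's actual modified Grasse model is not reproduced here.

```python
import random

def selection_probabilities(pheromone, beta=2.0):
    """P(choose project i) proportional to pheromone_i**beta (assumed response rule)."""
    weights = [p ** beta for p in pheromone]
    total = sum(weights)
    return [w / total for w in weights]

def step(pheromone, deposit=1.0, evaporation=0.05):
    """One agent picks a project and posts (deposits pheromone); all trails evaporate."""
    probs = selection_probabilities(pheromone)
    choice = random.choices(range(len(pheromone)), weights=probs)[0]
    pheromone = [p * (1 - evaporation) for p in pheromone]
    pheromone[choice] += deposit
    return pheromone, choice

posts = [5.0, 1.0, 1.0]  # forum-post counts for three OSS projects (illustrative)
for _ in range(100):
    posts, _ = step(posts)
```

The positive-feedback loop is visible even in this toy: projects with more posts attract contributions with higher probability, which deposits more pheromone, concentrating activity without any peer-to-peer coordination.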
Development of a Framework to Characterise the Openness of Mathematical Tasks
ERIC Educational Resources Information Center
Yeo, Joseph B. W.
2017-01-01
Educators usually mean different constructs when they speak of open tasks: some may refer to pure-mathematics investigative tasks while others may have authentic real-life tasks in mind; some may think of the answer being open while others may refer to an open method. On the other hand, some educators use different terms, e.g. open and open-ended,…
Noise Radiation Of A Strongly Pulsating Tailpipe Exhaust
NASA Astrophysics Data System (ADS)
Peizi, Li; Genhua, Dai; Zhichi, Zhu
1993-11-01
The method of characteristics is used to solve the problem of the propagation of a strongly pulsating flow in an exhaust system tailpipe. For a strongly pulsating exhaust, the flow may shock at the pipe's open end at points in the pulsation where the flow pressure exceeds its critical value. The method fails if one insists on setting the flow pressure equal to the atmospheric pressure as the pipe-end boundary condition. To solve the problem, we set the Mach number equal to 1 as the boundary condition when the flow pressure exceeds its critical value. For a strongly pulsating flow, the fluctuations of the flow variables may be much larger than their respective time averages, so the acoustic radiation method would fail in the computation of the noise radiated from the pipe's open end. We instead model the exhaust flow out of the open end as a simple sound source, an approach successfully applied in reference [1]. The simple-source strength is proportional to the volume acceleration of the exhaust gas. Also computed is the noise radiated by the turbulence of the exhaust flow, as was done in reference [1]. Noise from a reciprocating valve simulator is treated in detail. The radiation efficiency is very low for the pressure range considered, about 10^-5, and the radiation efficiency coefficient increases with the square of the frequency. Computing the pipe-length dependence of the noise radiation and mass flux allows us to design a suitable length for an aerodynamic noise generator or a reciprocating internal combustion engine: for the former, powerful noise radiation is preferable; for the latter, maximum mass flux is desired because a freer exhaust is preferable.
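The simple-source relation invoked above, far-field pressure proportional to the volume acceleration of the exhaust gas, is the classical acoustic monopole. A minimal sketch of that relation follows; the air density, observer distance, and the pulse waveform are illustrative assumptions (not values from the paper), and retarded time is ignored.

```python
import numpy as np

rho0 = 1.2    # air density, kg/m^3 (assumed)
r = 1.0       # observer distance, m (assumed)
fs = 10000.0  # sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)

# Illustrative half-rectified volume-velocity pulse train from the open end, m^3/s.
Q = 1e-3 * np.maximum(np.sin(2 * np.pi * 100 * t), 0.0)

Qdot = np.gradient(Q, 1 / fs)      # volume acceleration of the exhaust gas, m^3/s^2
p = rho0 * Qdot / (4 * np.pi * r)  # monopole far-field pressure, Pa (retardation ignored)
print(p.shape)
```

The time derivative is why sharper pulses (stronger volume acceleration) radiate more: doubling the pulsation frequency of the same volume-velocity waveform doubles Qdot, consistent with radiation efficiency growing with frequency.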
A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...
2016-01-01
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists also rely on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge-learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft² for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
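The core idea of fusing an expert prior with survey observations while retaining uncertainty can be sketched with a conjugate Normal update on mean occupancy. The numbers and the Normal-Normal model choice are illustrative assumptions, not PDT internals.

```python
import math

# Expert prior on mean ambient occupancy for one building type (assumed values).
prior_mean, prior_var = 2.0, 1.0  # people per 1000 ft^2

# Survey observations for that building type (assumed), with known obs. variance.
obs = [2.8, 3.1, 2.5, 2.9]
obs_var = 0.5

n = len(obs)
xbar = sum(obs) / n

# Conjugate Normal-Normal posterior: precision-weighted combination of
# the expert prior and the survey data.
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + n * xbar / obs_var)
print(round(post_mean, 2), round(math.sqrt(post_var), 2))
```

Note the posterior variance is smaller than either source alone provides, and the posterior mean sits between the expert prior and the survey average, weighted by their precisions, which is the "retains uncertainty" behavior the abstract describes.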
The Emergence of Open-Source Software in China
ERIC Educational Resources Information Center
Pan, Guohua; Bonk, Curtis J.
2007-01-01
The open-source software movement is gaining increasing momentum in China. Of the limited number of open-source software packages in China, "Red Flag Linux" stands out most strikingly, commanding a 30 percent share of the Chinese software market. Unlike the spontaneity of the open-source movement in North America, open-source software development in…
A Study of Clinically Related Open Source Software Projects
Hogarth, Michael A.; Turner, Stuart
2005-01-01
Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056
MolProbity: More and better reference data for improved all-atom structure validation.
Williams, Christopher J; Headd, Jeffrey J; Moriarty, Nigel W; Prisant, Michael G; Videau, Lizbeth L; Deis, Lindsay N; Verma, Vishal; Keedy, Daniel A; Hintze, Bradley J; Chen, Vincent B; Jain, Swati; Lewis, Steven M; Arendall, W Bryan; Snoeyink, Jack; Adams, Paul D; Lovell, Simon C; Richardson, Jane S; Richardson, David C
2018-01-01
This paper describes the current update on macromolecular model validation services that are provided at the MolProbity website, emphasizing changes and additions since the previous review in 2010. There have been many infrastructure improvements, including rewrite of previous Java utilities to now use existing or newly written Python utilities in the open-source CCTBX portion of the Phenix software system. This improves long-term maintainability and enhances the thorough integration of MolProbity-style validation within Phenix. There is now a complete MolProbity mirror site at http://molprobity.manchester.ac.uk. GitHub serves our open-source code, reference datasets, and the resulting multi-dimensional distributions that define most validation criteria. Coordinate output after Asn/Gln/His "flip" correction is now more idealized, since the post-refinement step has apparently often been skipped in the past. Two distinct sets of heavy-atom-to-hydrogen distances and accompanying van der Waals radii have been researched and improved in accuracy, one for the electron-cloud-center positions suitable for X-ray crystallography and one for nuclear positions. New validations include messages at input about problem-causing format irregularities, updates of Ramachandran and rotamer criteria from the million quality-filtered residues in a new reference dataset, the CaBLAM Cα-CO virtual-angle analysis of backbone and secondary structure for cryoEM or low-resolution X-ray, and flagging of the very rare cis-nonProline and twisted peptides which have recently been greatly overused. Due to wide application of MolProbity validation and corrections by the research community, in Phenix, and at the worldwide Protein Data Bank, newly deposited structures have continued to improve greatly as measured by MolProbity's unique all-atom clashscore. © 2017 The Protein Society.
ERIC Educational Resources Information Center
Krishnamurthy, M.
2008-01-01
Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…
Tao, Lei; Sun, Kang; Khan, M Amir; Miller, David J; Zondlo, Mark A
2012-12-17
A compact and portable open-path sensor for simultaneous detection of atmospheric N2O and CO has been developed with a 4.5 μm quantum cascade laser (QCL). An in-line acetylene (C2H2) gas reference cell allows for continuous monitoring of the sensor drift and calibration in rapidly changing field environments and thereby allows for open-path detection at high precision and stability. Wavelength modulation spectroscopy (WMS) is used to detect simultaneously both the second and fourth harmonic absorption spectra with an optimized dual modulation amplitude scheme. Multi-harmonic spectra containing atmospheric N2O, CO, and the reference C2H2 signals are fit in real time (10 Hz) by combining a software-based lock-in amplifier with a computationally fast numerical model for WMS. The sensor consumes ~50 W of power and has a mass of ~15 kg. Precision of 0.15 ppbv N2O and 0.36 ppbv CO at 10 Hz under laboratory conditions was demonstrated. The sensor has been deployed for extended periods in the field. Simultaneous N2O and CO measurements distinguished between natural and fossil fuel combustion sources of N2O, an important greenhouse gas with poorly quantified emissions in space and time.
NASA Astrophysics Data System (ADS)
Lachat, E.; Landes, T.; Grussenmeyer, P.
2018-05-01
Terrestrial and airborne laser scanning, photogrammetry and, more generally, 3D recording techniques are used in a wide range of applications. After recording several individual 3D datasets in local coordinate systems, one of the first crucial processing steps is the registration of these data into a common reference frame. To perform such a 3D transformation, commercial and open source software as well as programs from the academic community are available. Because these solutions lack some transparency in their computations and quality assessment, it was decided to develop the open source algorithm presented in this paper. It is dedicated to the simultaneous registration of multiple point clouds as well as their georeferencing. The idea is to use this algorithm as a starting point for further implementations, involving the possibility of combining 3D data from different sources. Parallel to the presentation of the global registration methodology employed, the aim of this paper is to compare the results achieved this way with the above-mentioned existing solutions. For this purpose, first results obtained with the proposed algorithm for the global registration of ten laser scanning point clouds are presented. An analysis of the quality criteria delivered by the two software packages selected for this study, and a reflection on these criteria, completes the comparison of the obtained results. The final aim of this paper is to validate the current efficiency of the proposed method through these comparisons.
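The multi-cloud registration described above generalizes the pairwise rigid-alignment step. For two point sets with known correspondences, the least-squares rotation and translation have the closed-form Kabsch/SVD solution sketched below; this is the textbook building block, not the authors' simultaneous multi-cloud algorithm.

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping points P onto points Q."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)   # center both clouds
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)     # SVD of the 3x3 cross-covariance
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

# Illustrative check: rotate and translate a random cloud, then recover the transform.
rng = np.random.default_rng(0)
P = rng.normal(size=(50, 3))
angle = 0.4
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([1.0, -2.0, 0.5])
R, t = kabsch(P, Q)
print(np.allclose(P @ R.T + t, Q, atol=1e-8))
```

Real registration pipelines wrap a step like this in correspondence search (e.g. ICP) and, as in this paper, in a global adjustment over many clouds at once.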
NASA Astrophysics Data System (ADS)
Giebel, B. M.; Riemer, D. D.; Swart, P. K.
2008-12-01
Determining δ13C values for reduced hydrocarbons in atmospheric samples is emerging as an important area of interest in isotopic analytical chemistry. The importance of stable isotopic data stems from its usefulness in differentiating between multiple sources and in assessing changing source structure and source strength in a constantly changing environment. Though much stable isotopic work is available on CH4 and other VOCs, particularly NMHCs, few studies have focused on oxygenated volatile organic compounds (OVOCs) such as methanol, ethanol, acetone, and propanal. Both anthropogenic and biogenic sources exist for these OVOCs, and their role in atmospheric chemistry is important. The OVOCs of interest here are found in very low concentrations in ambient air (low ppbv to high pptv) and thus present unique challenges for analysis by GC-C-IRMS. To address the challenges of measuring OVOCs, a Hewlett Packard 6890 gas chromatograph interfaced with a Europa Scientific Geo 20-20 IRMS was modified to accept ambient atmospheric samples. To sharpen peak shape, all dead volume within the system was minimized, starting with the addition of a fused silica combustion tube (0.25 mm i.d.) containing Cu, Pt, or Ni wires (0.1 mm dia.). To assist water removal from the sample stream before delivery to the IRMS, a small-volume Nafion dryer (0.20 mm i.d.) and a water trap submerged in a dry-ice/acetone slurry were tested individually. Deactivated fused silica (0.1 mm i.d.) joins the custom-designed open split to the ion source and effectively decreases dead volume while maintaining chromatographic separation and the desired source pressure. To decrease the variability of the instrumentation, and to increase the total amount of carbon at the ion source, the total carrier gas flow is reduced to 0.7 mL/min.
Reference gas addition is manually facilitated by a six port rotary valve upstream of the open split and delivers diluted CO2 reference gas (0.1% CO2 in He) directly to the ion source while maintaining continuous flow conditions from the gas chromatograph. Experimental results of initial biogenic source sampling will be presented and future directions will be discussed.
New Open-Source Version of FLORIS Released | News | NREL
January 26, 2018: National Renewable Energy Laboratory (NREL) researchers recently released an updated open-source version of FLORIS that has been simplified and documented. Because of the living, open-source nature of the newly updated utility, NREL…
EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.
Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice
2015-01-01
The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent time and space resolution are required. In this context, Magneto/Electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at the scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data has been missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) solution of the inverse problem to localize/reconstruct the cortical sources, iii) computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources, and iv) computation of network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
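The graph-theory measures of step iv operate on a connectivity matrix. Two standard examples, node strength and binary global efficiency, are sketched below in NumPy; the toy 4-node matrix is invented for illustration, and this is not EEGNET's MATLAB code.

```python
import numpy as np

def node_strength(W):
    """Sum of connection weights attached to each node."""
    return W.sum(axis=1)

def global_efficiency(A):
    """Mean inverse shortest-path length of a binary undirected graph."""
    n = len(A)
    D = np.where(A > 0, 1.0, np.inf)  # 1-step distances
    np.fill_diagonal(D, 0.0)
    for k in range(n):                # Floyd-Warshall all-pairs shortest paths
        D = np.minimum(D, D[:, [k]] + D[[k], :])
    inv = 1.0 / D[~np.eye(n, dtype=bool)]
    return inv[np.isfinite(inv)].sum() / (n * (n - 1))

# Toy symmetric functional-connectivity matrix for 4 channels (illustrative).
W = np.array([[0.0, 0.8, 0.5, 0.0],
              [0.8, 0.0, 0.4, 0.0],
              [0.5, 0.4, 0.0, 0.9],
              [0.0, 0.0, 0.9, 0.0]])
A = (W > 0).astype(int)
print(node_strength(W), round(global_efficiency(A), 3))
```

In practice W would be the M/EEG functional connectivity matrix from step iii, thresholded before binary measures are applied.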
NASA Astrophysics Data System (ADS)
Cancio, A.; Colazo, M.; García, B.
2017-10-01
In December 2012, the European Space Agency opened its third Deep Space Station in Malargüe, province of Mendoza, Argentina. Due to the nature of its operations, the antenna has requirements for the stability of reference signals and low-phase-noise equipment that make it a candidate for use in radio astronomy applications. The present work evaluates the first experience of observing astronomical sources with this antenna.
ASTRONAUTICS INFORMATION. OPEN LITERATURE SURVEY, VOLUME III, NO. 2 (ENTRIES 30,202-30,404)
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1961-02-01
15:014925. An annotated list of references on temperature control of satellite and space vehicles is presented. Methods and systems for maintaining vehicles within tolerable temperature bounds while operating outside planetary atmospheres are outlined. Discussions of the temperature environment in space and how it might affect vehicle operation are given. Re-entry heating problems are not included. Among the sources used were: Engineering Index, Applied Science and Technology Index, Astronautics Abstracts, PAL uniterm index, ASTIA, and LMSD card catalog. (auth)
2010-08-01
…use of these submodels was reached. The details of the four agent-specific meetings, including the dates, locations, and participating SMEs are… [table residue: protection factors for Protected Armored Personnel Carrier – Closed (0%, 0%); Armored Personnel Carrier – Moving (50%, 0%); Armored Personnel Carrier – Open (100%, 0%); Earth Shelter (75%…)] …limited number of radioisotopes, typically emitting gamma and/or beta radiation. RDDs include at least two types of radiation sources: 1) point…
The successes and challenges of open-source biopharmaceutical innovation.
Allarakhia, Minna
2014-05-01
Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human capacity development challenges associated with skill development around big data usage; and the data-material access challenge associated with data and material access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. It is the author's opinion that assessing when and how value creation will occur through open-source biopharmaceutical innovation is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of now big-data-based open-source initiatives. Continued focus on early-stage value creation alone is not advisable; it would be more advisable to adopt an approach where stakeholders transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.
Performance comparison of AV1, HEVC, and JVET video codecs on 360 (spherical) video
NASA Astrophysics Data System (ADS)
Topiwala, Pankaj; Dai, Wei; Krishnan, Madhu; Abbas, Adeel; Doshi, Sandeep; Newman, David
2017-09-01
This paper compares the coding efficiency performance on 360 videos of three software codecs: (a) the AV1 video codec from the Alliance for Open Media (AOM); (b) the HEVC Reference Software HM; and (c) the JVET JEM Reference SW. Note that 360 video is especially challenging content, in that one codes at full resolution globally but typically views locally (in a viewport), which magnifies errors. These are tested in two different projection formats, ERP and RSP, to check consistency. Performance is tabulated for 1-pass encoding on two fronts: (1) objective performance based on end-to-end (E2E) metrics such as SPSNR-NN and WS-PSNR, currently developed in the JVET committee; and (2) informal subjective assessment of static viewports. Constant quality encoding is performed with all three codecs for an unbiased comparison of the core coding tools. Our general conclusion is that under constant quality coding, AV1 underperforms HEVC, which underperforms JVET. We also test with rate control, where AV1 currently underperforms the open source x265 HEVC codec. Objective and visual evidence is provided.
ERIC Educational Resources Information Center
Voyles, Bennett
2007-01-01
People know about the Sakai Project (open source course management system); they may even know about Kuali (open source financials). So, what is the next wave in open source software? This article discusses business intelligence (BI) systems. Though open source BI may still be only a rumor in most campus IT departments, some brave early adopters…
The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.
Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin
2007-11-01
This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications, and it is intended for fast prototyping and development of such applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process philosophically similar to agile methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining users' and developers' mailing lists, providing documentation (an application programming interface reference document and a book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
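The state-machine discipline described here, where a component is always in a valid state and only meaningful transitions are allowed, can be sketched as a table-driven state machine. This is an illustration of the pattern in Python, not actual IGSTK C++ code; the states and events are hypothetical.

```python
class TrackerComponent:
    """Toy component whose behavior is governed by an explicit transition table."""

    # (current state, event) -> next state; any pair absent from the table
    # is an invalid transition and is rejected.
    TRANSITIONS = {
        ("idle",        "initialize"): "initialized",
        ("initialized", "start"):      "tracking",
        ("tracking",    "stop"):       "initialized",
        ("initialized", "shutdown"):   "idle",
    }

    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            # Invalid requests never corrupt state: the component stays put.
            raise ValueError(f"event {event!r} not allowed in state {self.state!r}")
        self.state = self.TRANSITIONS[key]
        return self.state

tracker = TrackerComponent()
tracker.handle("initialize")
tracker.handle("start")
print(tracker.state)
```

Because every reachable state and transition is enumerated up front, the component's behavior can be audited exhaustively, which is the safety argument the IGSTK team makes for this design.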
Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets.
Clark, Alex M; Dole, Krishna; Coulon-Spektor, Anna; McNutt, Andrew; Grass, George; Freundlich, Joel S; Reynolds, Robert C; Ekins, Sean
2015-06-22
On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operating characteristic (ROC) curve values comparable to those generated previously in prior publications using alternative tools. We have now described how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user's own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting these models in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery.
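The "Bayesian models" over binary fingerprint descriptors described here are classically implemented as Laplace-smoothed Bernoulli naive Bayes. This pure-Python sketch shows that idea on toy binary vectors; the data are invented, and FCFP6 fingerprinting itself (and the CDK module's exact formulation) is not reproduced.

```python
import math

def fit(X, y, alpha=1.0):
    """Per-class bit probabilities with Laplace smoothing alpha, plus log priors."""
    n_bits = len(X[0])
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        probs = [(sum(r[j] for r in rows) + alpha) / (len(rows) + 2 * alpha)
                 for j in range(n_bits)]
        model[c] = (math.log(len(rows) / len(y)), probs)
    return model

def predict(model, x):
    """Return the class with the highest posterior log-probability."""
    def score(c):
        log_prior, probs = model[c]
        return log_prior + sum(
            math.log(p if bit else 1.0 - p) for bit, p in zip(x, probs))
    return max(model, key=score)

# Toy binary "fingerprints": actives (label 1) tend to set the first two bits.
X = [[1, 1, 0, 0], [1, 1, 1, 0], [1, 0, 0, 0],
     [0, 0, 1, 1], [0, 1, 1, 1], [0, 0, 0, 1]]
y = [1, 1, 1, 0, 0, 0]
model = fit(X, y)
print(predict(model, [1, 1, 0, 1]))
```

The appeal for open model exchange is that a fitted model is just these per-bit probabilities and priors, a compact table that any implementation (CDK, CDD Vault, or a mobile app) can evaluate identically.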
Hansen, A.D.
1988-01-25
An improved aethalometer having a single light source, a single light detector, and two light paths from the light source to the light detector. A quartz fiber filter is inserted in the device, the filter having a collection area in one light path and a reference area in the other light path. A gas flow path through the aethalometer housing allows ambient air to flow through the collection area of the filter so that aerosol particles can be collected on the filter. A rotating disk with an opening therethrough allows light from the light source to pass alternately through the two light paths. The voltage output of the detector is applied to a VCO, and the VCO pulses for light transmitted separately through the two light paths are counted and compared to determine the absorption coefficient of the collected aerosol particles. 5 figs.
NASA Astrophysics Data System (ADS)
Udell, C.; Selker, J. S.
2017-12-01
The increasing availability and functionality of Open-Source software and hardware, along with 3D printing, low-cost electronics, and the proliferation of open-access resources for learning rapid prototyping, are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research in which solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. These include requisite technical skill sets, knowledge of equipment capabilities, difficulty identifying inexpensive sources of materials, and limited money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to many of these obstacles. This presentation investigates the unique capabilities that the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, at Oregon State University and internationally, and the unique functions these types of initiatives serve at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.
QSAR DataBank - an approach for the digital organization and archiving of QSAR model information
2014-01-01
Background Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships or Quantitative Structure-Property Relationships produce around one thousand scientific publications annually. All the materials and results are mainly communicated using printed media. The printed media in its present form has obvious limitations when it comes to effectively representing mathematical models (including complex and non-linear ones) and large bodies of associated numerical chemical data. It does not support secondary information extraction or reuse efforts, while in silico studies pose additional requirements for accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results The article describes the QsarDB data schema, which formalizes QSAR concepts (objects and relationships between them), and the QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive modeling problems, with examples in the field of predictive toxicology, and can be applied to a wide variety of other endpoints. The work is accompanied by an open source reference implementation and tools. Conclusions The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716
Open-source software: not quite endsville.
Stahl, Matthew T
2005-02-01
Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.
Developing an Open Source Option for NASA Software
NASA Technical Reports Server (NTRS)
Moran, Patrick J.; Parks, John W. (Technical Monitor)
2003-01-01
We present arguments in favor of developing an Open Source option for NASA software; in particular we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.
Schober, Daniel; Jacob, Daniel; Wilson, Michael; Cruz, Joseph A; Marcu, Ana; Grant, Jason R; Moing, Annick; Deborde, Catherine; de Figueiredo, Luis F; Haug, Kenneth; Rocca-Serra, Philippe; Easton, John; Ebbels, Timothy M D; Hao, Jie; Ludwig, Christian; Günther, Ulrich L; Rosato, Antonio; Klein, Matthias S; Lewis, Ian A; Luchinat, Claudio; Jones, Andrew R; Grauslys, Arturas; Larralde, Martin; Yokochi, Masashi; Kobayashi, Naohiro; Porzel, Andrea; Griffin, Julian L; Viant, Mark R; Wishart, David S; Steinbeck, Christoph; Salek, Reza M; Neumann, Steffen
2018-01-02
NMR is a widely used analytical technique with a growing number of repositories available. As a result, demands have emerged for a vendor-agnostic, open data format for long-term archiving of NMR data, with the aim of easing and encouraging the sharing, comparison, and reuse of NMR data. Here we present nmrML, an open XML-based exchange and storage format for NMR spectral data. The nmrML format is intended to be fully compatible with existing NMR data for chemical, biochemical, and metabolomics experiments. nmrML can capture raw NMR data, spectral data acquisition parameters, and, where available, spectral metadata such as chemical structures associated with spectral assignments. The nmrML format is compatible with pure-compound NMR data for reference spectral libraries as well as NMR data from complex biomixtures, i.e., metabolomics experiments. To facilitate format conversions, we provide nmrML converters for the Bruker, JEOL and Agilent/Varian vendor formats. In addition, easy-to-use Web-based spectral viewing, processing, and spectral assignment tools that read and write nmrML have been developed. Software libraries and Web services for data validation are available for tool developers and end-users. The nmrML format has already been adopted for capturing and disseminating NMR data for small molecules by several open source data processing tools and metabolomics reference spectral libraries, e.g., serving as the storage format for the MetaboLights data repository. The nmrML open access data standard has been endorsed by the Metabolomics Standards Initiative (MSI), and we encourage user participation and feedback to increase usability and make it a successful standard.
Open-Source Data and the Study of Homicide.
Parkin, William S; Gruenewald, Jeff
2015-07-20
To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to be used as a valid and reliable data source in testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open sources could recreate the population of homicides and variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data. Moreover, for every variable measured, the open sources captured as much, or more, of the information presented in the official data. Variables not available in official data, but potentially useful for testing theory, were also identified in open sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.
Nurturing reliable and robust open-source scientific software
NASA Astrophysics Data System (ADS)
Uieda, L.; Wessel, P.
2017-12-01
Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article.
A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo (zenodo.org). However, citations to these sources are not always recognized when computing citation metrics. In summary, the widespread development of reliable and robust open-source software relies on the creation of formal training programs in software development best practices and the recognition of software as a valid form of scholarship.
Rosner, David; Markowitz, Gerald; Chowkwanyun, Merlin
2018-02-01
As a result of a legal mechanism called discovery, the authors accumulated millions of internal corporate and trade association documents related to the introduction of new products and chemicals into workplaces and commerce. What did these private entities discuss among themselves and with their experts? The plethora of documents, both a blessing and a curse, opened new sources and interesting questions about corporate and regulatory histories. But they also posed an almost insurmountable challenge to historians. Thus emerged ToxicDocs, possible only with a technological innovation known as "Big Data." That refers to the sheer volume of new digital data and to the computational power to analyze them. Users will be able to identify what firms knew (or did not know) about the dangers of toxic substances in their products-and when. The database opens many areas to inquiry including environmental studies, business history, government regulation, and public policy. ToxicDocs will remain a resource free and open to all, anywhere in the world.
NASA Astrophysics Data System (ADS)
Baudin, Veronique; Gomez-Diaz, Teresa
2013-04-01
The first goal of the PLUME open platform (https://www.projet-plume.org) is to share competences and promote the knowledge of software experts within the French higher education and research communities. The platform provides open access to more than 380 index cards describing software that is useful and economical for this community. The second goal of PLUME is to improve the visibility of software produced by research laboratories within the higher education and research communities. The "development-ESR" index cards briefly describe the main features of such software, including references to associated research publications. The platform counts more than 300 cards describing research software, of which 89 have an English version. In this talk we describe the theme classification and taxonomy of the index cards and their evolution as new themes are added to the project. We will also focus on the organisation of PLUME as an open project and its role in promoting free/open source software from and for research, contributing to the creation of a community of shared knowledge.
NASA Astrophysics Data System (ADS)
Han, X.; Li, X.; He, G.; Kumbhar, P.; Montzka, C.; Kollet, S.; Miyoshi, T.; Rosolem, R.; Zhang, Y.; Vereecken, H.; Franssen, H.-J. H.
2015-08-01
Data assimilation has become a popular method to integrate observations from multiple sources with land surface models to improve predictions of the water and energy cycles of the soil-vegetation-atmosphere continuum. Multivariate data assimilation refers to the simultaneous assimilation of observation data for multiple model state variables into a simulation model. In recent years, several land data assimilation systems have been developed at different research agencies. Because of limited software availability or adaptability, these systems are not easy to apply for the purpose of multivariate land data assimilation research. We developed an open source multivariate land data assimilation framework (DasPy), implemented in the Python scripting language mixed with the C++ and Fortran programming languages. The Local Ensemble Transform Kalman Filter (LETKF) is implemented as the main data assimilation algorithm, and uncertainties in the data assimilation are represented by perturbed atmospheric forcing data, perturbed soil and vegetation parameters, and perturbed model initial conditions. The Community Land Model (CLM) was integrated as the model operator. The implementation also allows parameter estimation (soil properties and/or leaf area index) on the basis of the joint state and parameter estimation approach. The Community Microwave Emission Modelling platform (CMEM), the COsmic-ray Soil Moisture Interaction Code (COSMIC) and the Two-Source Formulation (TSF) were integrated as observation operators for the assimilation of L-band passive microwave, cosmic-ray soil moisture probe and land surface temperature measurements, respectively. DasPy has been evaluated in several assimilation studies of neutron count intensity (soil moisture), L-band brightness temperature and land surface temperature. DasPy is parallelized using hybrid Message Passing Interface and Open Multi-Processing techniques.
All the input and output data flows are organized efficiently using the commonly used NetCDF file format. Online 1-D and 2-D visualization of data assimilation results is also implemented to facilitate post-simulation analysis. In summary, DasPy is a ready-to-use open source parallel multivariate land data assimilation framework.
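The ensemble analysis step at the heart of such a framework can be sketched compactly. The following is a generic stochastic EnKF update in Python with NumPy, not the LETKF variant DasPy actually implements; the function name and shapes are illustrative:

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """One stochastic ensemble Kalman filter analysis step.

    X : (n_state, n_ens) forecast ensemble
    y : (n_obs,) observations at the update time
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs) observation error covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                      # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    # Perturb the observations so the analysis ensemble keeps consistent spread
    Y = y[:, None] + rng.multivariate_normal(np.zeros(y.size), R, n_ens).T
    return X + K @ (Y - H @ X)
```

A localized, transform-based variant (as in the LETKF) applies the same ensemble logic per grid cell to keep the update computationally tractable for large land surface states.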
Using Open Space Technology for School Improvement.
ERIC Educational Resources Information Center
Cox, David
2002-01-01
Describes a theory referred to as Open Space Technology (OST), which holds that the most productive learning in conference settings takes place in the open space between formally scheduled conference sessions. Argues that OST can be applied to staff development days and other educational development programs. (Contains 10 references.) (NB)
ERIC Educational Resources Information Center
Kapor, Mitchell
2005-01-01
Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…
Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne
2011-09-28
Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents.
Longo, F; Finotti, L; Bellini, L; Zavan, B; Busetto, R; Isola, M
2016-05-01
A 15-year-old female huacaya alpaca (Vicugna pacos) was referred because of a non-weight-bearing lameness (4/4) in the left pelvic limb caused by a grade three open metatarsal fracture. The referring veterinarian treated the fracture with conservative management using bandages, but it progressively evolved to a non-union. Clinical examination revealed external wounds on the medial and lateral surfaces of the metatarsus. Radiographs confirmed an open, nonarticular, displaced, diaphyseal fracture of the left metatarsus. Cancellous bone was sourced from bovine proximal and distal femur epiphyses, followed by a thermal shock procedure to achieve decellularisation, to produce a xenograft. Open reduction and internal fixation of the fracture using locking plates was performed. Alignment of the fracture fragments was corrected and the xenograft was placed at the debrided fracture site to stimulate and harness osteogenesis in situ. Clinical and radiographic follow-up was performed up to 40 weeks postoperatively. Clinical evaluations revealed that the alpaca gradually increased weight bearing following bandage removal 10 days after surgery. Serial radiographs showed correct alignment of the left metatarsus, progressive bone modelling, and complete bone union at 12 weeks. Ten months postoperatively the alpaca showed no signs of lameness and resumed normal activity. For management of a metatarsal non-union, a combination of bovine xenograft application and angular stable internal fixation progressed toward an excellent long-term recovery.
Weather forecasting with open source software
NASA Astrophysics Data System (ADS)
Rautenhaus, Marc; Dörnbrack, Andreas
2013-04-01
To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
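A Web Map Service client such as the one described boils down to issuing GetMap requests against a forecast server. A minimal sketch of building such a request URL with the Python standard library (the endpoint, layer name, and parameter values are invented for illustration):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   time=None, crs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL. A real layer name and endpoint
    would come from the server's GetCapabilities response."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    if time is not None:
        params["TIME"] = time          # forecast valid time (ISO 8601)
    return base_url + "?" + urlencode(params)
```

Serving forecast fields through this standard interface is what lets any WMS-capable viewer display them alongside flight-planning overlays.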
Open Source Software Development
2011-01-01
Medical Data GRIDs as approach towards secure cross enterprise document sharing (based on IHE XDS).
Wozak, Florian; Ammenwerth, Elske; Breu, Micheal; Penz, Robert; Schabetsberger, Thomas; Vogl, Raimund; Wurz, Manfred
2006-01-01
Quality and efficiency of health care services are expected to be improved by the electronic processing and trans-institutional availability of medical data. A prototype architecture based on the IHE-XDS profile is currently being developed. Due to legal and organizational requirements, specific adaptations to the IHE-XDS profile have been made. In this work the services of the health@net reference architecture are described in detail; these have been developed with a focus on compliance with both the IHE-XDS profile and the legal situation in Austria. We expect to gain knowledge about the development of a shared electronic health record using Medical Data Grids as an Open Source reference implementation, and about how proprietary hospital information systems can be integrated into this environment.
Firewall Traversal for CORBA Applications Using an Implementation of Bidirectional IIOP in MICO
NASA Technical Reports Server (NTRS)
Griffin, Robert I.; Lopez, Isaac (Technical Monitor)
2002-01-01
The Object Management Group (OMG) has added specifications to the General Inter-ORB Protocol (GIOP 1.2), specifically the Internet Inter-ORB Protocol (IIOP 1.2), that allow servers and clients on opposing sides of a firewall to reverse roles and still communicate freely. This addition to the GIOP specifications is referred to as Bidirectional GIOP. The implementation of these specifications as applied to communication over TCP/IP connections is referred to as 'Bidirectional Internet Inter-ORB Protocol' or BiDirIIOP. This paper details the implementation and testing of the BiDirIIOP Specification in an open source ORB, MICO, that did not previously support Bidirectional GIOP. It also provides simple contextual information and a description of the OMG GIOP/IIOP messaging protocols.
Lalanne, Jennifer; Rozenberg, Johanna; Grolleau, Pauline; Piolino, Pascale
2013-12-01
The Self-reference effect (SRE) on long-term episodic memory and autonoetic consciousness has been investigated in young adults, scarcely in older adults, but never in Alzheimer's patients. Is the functional influence of Self-reference still present when the individual's memory and identity are impaired? We investigated this issue in 60 young subjects, 41 elderly subjects, and 28 patients with Alzheimer's disease, by using 1) an incidental learning task of personality traits in three encoding conditions, inducing variable degrees of depth of processing and personal involvement, 2) a 2-minute retention interval free recall task, and 3) a 20-minute delayed recognition task, combined with a remember-know paradigm. Each recorded score was corrected for errors (intrusions in free recall, false alarms in recognition, and false source memory in remember responses). Compared with alternative encodings, Self-reference significantly enhanced performance on the free recall task in the young group, and on the recognition task in both the young and older groups but not in the Alzheimer group. The most important finding in the Alzheimer group is that Self-reference most often led to a subjective sense of remembering (especially for the positive words), with retrieval of the correct encoding source. This Self-reference recollection effect in patients was related to independent subjective measures of a positive and definite sense of Self (measured by the Tennessee Self Concept Scale), and to memory complaints in daily life. In conclusion, these results demonstrate the power and robustness of the Self-reference effect on recollection in long-term episodic memory in Alzheimer's disease, although retrieval is considerably reduced. These results should open new perspectives for the development of rehabilitation programs for memory deficits.
Open-source hardware for medical devices.
Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold
2016-04-01
Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528
The case for open-source software in drug discovery.
DeLano, Warren L
2005-02-01
Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.
Hamilton, Samina; Bernstein, Aaron B; Blakey, Graham; Fagan, Vivien; Farrow, Tracy; Jordan, Debbie; Seiler, Walther; Shannon, Anna; Gertel, Art
2016-01-01
Interventional clinical studies conducted in the regulated drug research environment are reported using International Council for Harmonisation (ICH) regulatory guidance documents: ICH E3 on the structure and content of clinical study reports (CSRs) published in 1995 and ICH E3 supplementary Questions & Answers (Q & A) published in 2012. Since the ICH guidance documents were published, there has been heightened awareness of the importance of disclosure of clinical study results. The use of the CSR as a key source document to fulfil emerging obligations has resulted in a re-examination of how ICH guidelines are applied in CSR preparation. The dynamic regulatory and modern drug development environments create emerging reporting challenges. Regulatory medical writing and statistical professionals developed Clarity and Openness in Reporting: E3-based (CORE) Reference over a 2-year period. Stakeholders contributing expertise included a global industry association, regulatory agency, patient advocate, academic and Principal Investigator representatives. CORE Reference should help authors navigate relevant guidelines as they create CSR content relevant for today's studies. It offers practical suggestions for developing CSRs that will require minimum redaction and modification prior to public disclosure. CORE Reference comprises a Preface, followed by the actual resource. The Preface clarifies intended use and underlying principles that inform resource utility. The Preface lists references contributing to development of the resource, which broadly fall into 'regulatory' and 'public disclosure' categories. The resource includes ICH E3 guidance text, ICH E3 Q & A 2012-derived guidance text and CORE Reference text, distinguished from one another through the use of shading.
Rationale comments are used throughout for clarification purposes. A separate mapping tool comparing the ICH E3 sectional structure and the CORE Reference sectional structure is also provided. Together, CORE Reference and the mapping tool constitute the user manual. This publication is intended to enhance the use, understanding and dissemination of CORE Reference. The CORE Reference user manual and the associated website (http://www.core-reference.org) should improve the reporting of interventional clinical studies. Periodic updates of CORE Reference are planned to maintain its relevance. CORE Reference was registered with http://www.equator-network.org on 23 March 2015.
Vanderperre, Benoît; Lucier, Jean-François; Bissonnette, Cyntia; Motard, Julie; Tremblay, Guillaume; Vanderperre, Solène; Wisztorski, Maxence; Salzet, Michel; Boisvert, François-Michel; Roucou, Xavier
2013-01-01
A fully mature mRNA is usually associated with a reference open reading frame encoding a single protein. Yet mature mRNAs contain unconventional alternative open reading frames (AltORFs) located in untranslated regions (UTRs) or overlapping the reference ORFs (RefORFs) in the non-canonical +2 and +3 reading frames. Although recent ribosome profiling and footprinting approaches have suggested the significant use of unconventional translation initiation sites in mammals, direct evidence of large-scale alternative protein expression at the proteome level is still lacking. To determine the contribution of alternative proteins to the human proteome, we generated a database of predicted human AltORFs revealing a new proteome mainly composed of small proteins with a median length of 57 amino acids, compared to 344 amino acids for the reference proteome. We experimentally detected a total of 1,259 alternative proteins by mass spectrometry analyses of human cell lines, tissues and fluids. In plasma and serum, alternative proteins represent up to 55% of the proteome and may be an unsuspected new source of biomarkers. We observed constitutive co-expression of RefORFs and AltORFs from endogenous genes and from transfected cDNAs, including tumor suppressor p53, and provide evidence that out-of-frame clones representing AltORFs are mistakenly rejected as false positives in cDNA screening assays. The functional importance of alternative proteins is strongly supported by significant evolutionary conservation in vertebrates, invertebrates, and yeast. Our results imply that coding of multiple proteins in a single gene by the use of AltORFs may be a common feature in eukaryotes, and confirm that translation of unconventional ORFs generates an as yet unexplored proteome. PMID:23950983
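The idea of scanning the alternative reading frames for ORFs can be illustrated with a short script. This is a deliberately simplified sketch (forward strand only, ATG starts only, no splicing), not the pipeline used to generate the database described above; frames 1 and 2 below correspond to the +2 and +3 frames relative to a frame-0 reference ORF:

```python
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=2):
    """Return (frame, start, end) for each ATG..stop ORF in the three forward
    reading frames; end is exclusive and includes the stop codon."""
    seq = seq.upper()
    found = []
    for frame in range(3):
        i = frame
        while i + 3 <= len(seq):
            if seq[i:i + 3] == "ATG":
                j = i + 3
                while j + 3 <= len(seq) and seq[j:j + 3] not in STOPS:
                    j += 3                 # extend codon by codon to the stop
                if j + 3 <= len(seq) and (j - i) // 3 >= min_codons:
                    found.append((frame, i, j + 3))
                    i = j + 3              # resume scanning after the stop
                    continue
            i += 3
    return found
```

On a toy sequence with a reference ORF in frame 0 and an overlapping AltORF in frame 1, both are reported:

```python
find_orfs("ATGCATGGTATAACCTGA")  # → [(0, 0, 18), (1, 4, 13)]
```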
Operational aspects of asynchronous filtering for improved flood forecasting
NASA Astrophysics Data System (ADS)
Rakovec, Oldrich; Weerts, Albrecht; Sumihar, Julius; Uijlenhoet, Remko
2014-05-01
Hydrological forecasts can be made more reliable and less uncertain by recursively improving initial conditions. A common way of improving the initial conditions is to make use of data assimilation (DA), a feedback mechanism or update methodology which merges model estimates with available real-world observations. The traditional implementation of the Ensemble Kalman Filter (EnKF; e.g. Evensen, 2009) is synchronous, commonly called three-dimensional (3-D) assimilation, which means that all assimilated observations correspond to the time of the update. Asynchronous DA, also called four-dimensional (4-D) assimilation, refers to an updating methodology in which the observations being assimilated into the model originate from times different from the time of the update (Evensen, 2009; Sakov et al., 2010). This study investigates how the capabilities of the DA procedure can be improved by applying alternative Kalman-type methods, e.g., the Asynchronous Ensemble Kalman Filter (AEnKF). The AEnKF assimilates observations with smaller computational costs than the original EnKF, which is beneficial for operational purposes. The results of discharge assimilation into a grid-based hydrological model for the Upper Ourthe catchment in the Belgian Ardennes show that including past predictions and observations in the AEnKF improves the model forecasts as compared to the traditional EnKF. Additionally, we show that eliminating the strongly non-linear relation between the soil moisture storage and the assimilated discharge observations from the model update is beneficial for operational forecasting, as evaluated using several validation measures. In the current study we employed the HBV-96 model built within the recently developed open source modelling environment OpenStreams (2013). The advantage of using OpenStreams (2013) is that it enables direct communication with OpenDA (2013), an open source data assimilation toolbox.
OpenDA provides a number of algorithms for model calibration and assimilation and is suitable to be connected to any kind of environmental model. This setup is embedded in the Delft Flood Early Warning System (Delft-FEWS, Werner et al., 2013) for making all simulations and forecast runs and handling of all hydrological and meteorological data. References: Evensen, G. (2009), Data Assimilation: The Ensemble Kalman Filter, Springer, doi:10.1007/978-3-642-03711-5. OpenDA (2013), The OpenDA data-assimilation toolbox, www.openda.org, (last access: 1 November 2013). OpenStreams (2013), OpenStreams, www.openstreams.nl, (last access: 1 November 2013). Sakov, P., G. Evensen, and L. Bertino (2010), Asynchronous data assimilation with the EnKF, Tellus, Series A: Dynamic Meteorology and Oceanography, 62(1), 24-29, doi:10.1111/j.1600-0870.2009.00417.x. Werner, M., J. Schellekens, P. Gijsbers, M. van Dijk, O. van den Akker, and K. Heynert (2013), The Delft-FEWS flow forecasting system, Environ. Mod. & Soft., 40(0), 65-77, doi: http://dx.doi.org/10.1016/j.envsoft.2012.07.010.
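The synchronous EnKF analysis step that the AEnKF generalizes can be sketched as follows. This is a generic textbook stochastic update (after Evensen, 2009), not the OpenDA implementation; in the asynchronous variant, past observations are simply appended to the observation vector together with the ensemble predictions valid at their own times.

```python
import numpy as np

def enkf_update(X, y, H, R, seed=0):
    """Generic stochastic EnKF analysis step (textbook form, after
    Evensen 2009). X: (n, N) forecast ensemble of n states, N members;
    y: (m,) observations; H: (m, n) observation operator; R: (m, m)
    observation error covariance."""
    rng = np.random.default_rng(seed)
    n, N = X.shape
    m = len(y)
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)
    P_yy = HA @ HA.T / (N - 1) + R               # innovation covariance
    P_xy = A @ HA.T / (N - 1)                    # state-obs cross covariance
    K = P_xy @ np.linalg.solve(P_yy, np.eye(m))  # Kalman gain
    # Perturb observations so the analysis ensemble has correct spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, N).T
    return X + K @ (Y - HX)                      # analysis ensemble

# Toy example: two-state model, three members, one observation of state 0.
X = np.array([[1.0, 2.0, 3.0], [0.5, 1.5, 2.5]])
Xa = enkf_update(X, y=np.array([2.0]), H=np.array([[1.0, 0.0]]),
                 R=np.array([[0.1]]))
print(Xa.shape)  # (2, 3)
```

In the AEnKF the vectors y and HX are stacked over several past times, so the same linear algebra applies without rerunning the model.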
Choosing Open Source ERP Systems: What Reasons Are There For Doing So?
NASA Astrophysics Data System (ADS)
Johansson, Björn; Sudzina, Frantisek
Enterprise resource planning (ERP) systems attract considerable attention, and so does open source software. The question is whether, and if so when, open source ERP systems will take off. This paper describes the status of open source ERP systems. Based on a literature review of ERP system selection criteria drawn from Web of Science articles, it discusses reported reasons for choosing open source or proprietary ERP systems. Finally, the article presents some conclusions that could serve as input for future research. The paper aims to build a foundation for the basic question: what are the reasons for an organization to adopt open source ERP systems?
An assessment of transient hydraulics phenomena and its characterization
NASA Technical Reports Server (NTRS)
Mortimer, R. W.
1974-01-01
A systematic search of the open literature was performed with the purpose of identifying the causes, effects, and characterization (modelling and solution techniques) of transient hydraulics phenomena. The governing partial differential equations found to be used most often in the literature are presented. Detailed survey sheets are provided which contain, for each paper, the type of hydraulics problem, the cause, the modelling, the solution technique utilized, and the experimental verification used. References and source documents are listed, and a discussion of the purpose and accomplishments of the study is presented.
jmzML, an open-source Java API for mzML, the PSI standard for MS data.
Côté, Richard G; Reisinger, Florian; Martens, Lennart
2010-04-01
We here present jmzML, a Java API for the Proteomics Standards Initiative (PSI) mzML data standard. Based on the Java Architecture for XML Binding (JAXB) and an XPath-based random-access XML indexer, jmzML can handle arbitrarily large files in minimal memory, allowing easy and efficient processing of mzML files using the Java programming language. jmzML also automatically resolves internal XML references on the fly. The library (which includes a viewer) can be downloaded from http://jmzml.googlecode.com.
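The constant-memory idea behind such indexed readers can be illustrated with a streaming parse that processes one spectrum at a time and discards it afterwards. This is an illustrative sketch only (jmzML itself is a Java library with its own API); the tiny mzML-like document below is a made-up example.

```python
import io
import xml.etree.ElementTree as ET

MZML = """<mzML xmlns="http://psi.hupo.org/ms/mzml">
  <run>
    <spectrumList count="2">
      <spectrum id="scan=1" index="0"/>
      <spectrum id="scan=2" index="1"/>
    </spectrumList>
  </run>
</mzML>"""

def iter_spectra(stream):
    """Yield spectrum ids without loading the whole document into memory."""
    ns = "{http://psi.hupo.org/ms/mzml}"
    for event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == ns + "spectrum":
            yield elem.get("id")
            elem.clear()  # free memory held by the processed element

ids = list(iter_spectra(io.StringIO(MZML)))
print(ids)  # ['scan=1', 'scan=2']
```

An indexed reader goes one step further by recording byte offsets of each `<spectrum>` so individual spectra can also be fetched randomly, not just sequentially.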
Millstone: software for multiplex microbial genome analysis and engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodman, Daniel B.; Kuznetsov, Gleb; Lajoie, Marc J.
Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. Here, we describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.
Millstone: software for multiplex microbial genome analysis and engineering.
Goodman, Daniel B; Kuznetsov, Gleb; Lajoie, Marc J; Ahern, Brian W; Napolitano, Michael G; Chen, Kevin Y; Chen, Changping; Church, George M
2017-05-25
Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. We describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing code base of a numerical simulator makes the code itself increasingly difficult to maintain. Inadequate documentation of astronomical software for adaptive optics simulators can hamper development, since the documentation must contain up-to-date schemes and descriptions of the mathematics implemented in the code. Although most modern programming environments such as MATLAB or Octave have built-in documentation facilities, these are often insufficient for describing a typical adaptive optics simulator code base. This paper describes a general cross-platform framework for documenting scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates the comments in MATLAB M-files into C-like syntax, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, along with guidelines for deploying the framework. Examples of code documentation for scripts and functions of a MATLAB-based adaptive optics simulator are provided.
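The Perl-filter approach described above is typically wired into Doxygen through its input-filter options; a minimal Doxyfile fragment might look like the following (the filter script name here is hypothetical, but `EXTENSION_MAPPING` and `FILTER_PATTERNS` are standard Doxygen configuration keys):

```
# Doxyfile fragment (sketch): document MATLAB M-files via a comment filter.
EXTENSION_MAPPING  = m=C++                       # parse .m files as C++-like input
FILE_PATTERNS      = *.m
FILTER_PATTERNS    = *.m=./matlab_to_cpp_comments.pl   # hypothetical filter script
GENERATE_LATEX     = YES                         # LaTeX output with rendered formulas
```

Doxygen pipes each M-file through the filter, so the MATLAB comments arrive at the parser already in a C-like form it understands.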
Developing open-source codes for electromagnetic geophysics using industry support
NASA Astrophysics Data System (ADS)
Key, K.
2017-12-01
Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Richard A.; Brown, Joseph M.; Colby, Sean M.
ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multi-omics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomic and metatranscriptomic data, and a framework for constructing reference metaproteomic databases. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using a modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
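The contig-level majority vote can be sketched as follows. This is a simplified illustration of the idea, not ATLAS's actual implementation: walk down the taxonomic ranks and keep the most common taxon as long as a majority of the contig's ORFs support it.

```python
from collections import Counter

def contig_taxonomy(orf_lineages, min_fraction=0.5):
    """Assign a contig-level lineage by majority vote over its ORF lineages.

    orf_lineages: list of lineages, each a list of taxa from domain down.
    Stops at the last rank supported by more than min_fraction of ORFs.
    """
    assigned = []
    depth = max(len(lineage) for lineage in orf_lineages)
    for rank in range(depth):
        votes = Counter(l[rank] for l in orf_lineages if len(l) > rank)
        taxon, count = votes.most_common(1)[0]
        if count / len(orf_lineages) <= min_fraction:
            break  # no majority at this rank; keep the last agreed ranks
        assigned.append(taxon)
    return assigned

# Three ORFs on one contig: two Proteobacteria, one Firmicutes.
orfs = [["Bacteria", "Proteobacteria", "Gammaproteobacteria"],
        ["Bacteria", "Proteobacteria", "Alphaproteobacteria"],
        ["Bacteria", "Firmicutes"]]
print(contig_taxonomy(orfs))  # ['Bacteria', 'Proteobacteria']
```

The vote degrades gracefully: conflicting deep assignments truncate the lineage to the deepest rank the contig's ORFs still agree on.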
Open Distant Learning: Pedagogical Terms of Reference and Dilemmas
ERIC Educational Resources Information Center
Tatkovic, Nevenka; Ruzic, Maja; Tatkovic, Sanja
2006-01-01
The paper first presents the essential viewpoints on the general characteristics of open distance learning (ODL) and its historical origins. The second part presents some pedagogical terms of reference for open distance learning, such as the quality of ODL and the criteria for successful ODL (planning, successful interaction, work and emotional climate,…
Behind Linus's Law: Investigating Peer Review Processes in Open Source
ERIC Educational Resources Information Center
Wang, Jing
2013-01-01
Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…
ERIC Educational Resources Information Center
Kisworo, Marsudi Wahyu
2016-01-01
Information and Communication Technology (ICT)-supported learning using a free and open source platform has drawn little attention, as open source initiatives have focused on secondary or tertiary education. This study investigates the possibilities of ICT-supported learning using an open source platform for primary education. The data of this study is taken…
An Analysis of Open Source Security Software Products Downloads
ERIC Educational Resources Information Center
Barta, Brian J.
2014-01-01
Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…
Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years cloud computing technology has developed rapidly, especially open source cloud computing, which has attracted a large user base through the advantages of open source code and low cost and has now reached large-scale promotion and application. In this paper we first briefly introduce the main functions and architecture of OpenStack, an open source cloud computing toolset, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this analysis, we describe the specific application and deployment of OpenStack in university computer labs. The experimental results show that OpenStack can efficiently and conveniently deploy a cloud for a university computer lab, with stable performance and good functional value.
HELI-DEM portal for geo-processing services
NASA Astrophysics Data System (ADS)
Cannata, Massimiliano; Antonovic, Milan; Molinari, Monia
2014-05-01
HELI-DEM (Helvetia-Italy Digital Elevation Model) is a project developed in the framework of Italy/Switzerland Operational Programme for Trans-frontier Cooperation 2007-2013 whose major aim is to create a unified digital terrain model that includes the alpine and sub-alpine areas between Italy and Switzerland. The partners of the project are: Lombardy Region, Piedmont Region, Polytechnic of Milan, Polytechnic of Turin and Fondazione Politecnico from Italy; Institute of Earth Sciences (SUPSI) from Switzerland. The digital terrain model has been produced by integrating and validating the different elevation data available for the areas of interest, characterized by different reference frame, resolutions and accuracies: DHM at 25 m resolution from Swisstopo, DTM at 20 m resolution from Lombardy Region, DTM at 5 m resolution from Piedmont Region and DTM LiDAR PST-A at about 1 m resolution, that covers the main river bed areas and is produced by the Italian Ministry of the Environment. Further results of the project are: the generation of a unique Italian Swiss geoid with an accuracy of few centimeters (Gilardoni et al. 2012); the establishment of a GNSS permanent network, prototype of a transnational positioning service; the development of a geo-portal, entirely based on open source technologies and open standards, which provides the cross-border DTM and offers some capabilities of analysis and processing through the Internet. With this talk, the authors want to present the main steps of the project with a focus on the HELI-DEM geo-portal development carried out by the Institute of Earth Sciences, which is the access point to the DTM outputted from the project. The portal, accessible at http://geoservice.ist.supsi.ch/helidem, is a demonstration of open source technologies combined for providing access to geospatial functionalities to wide non GIS expert public. 
In fact, the system is entirely developed using only Open Standards and Free and Open Source Software (FOSS), both on the server side (services) and on the client side (interface). In addition to self-developed code, the system relies mainly on the software GRASS 7 [1], ZOO-Project [2], GeoServer [3] and OpenLayers [4], and on the standards WMS [5], WCS [6] and WPS [7]. At the time of writing, the portal offers features such as profiling, contour extraction, watershed delineation and analysis, derivatives calculation, data extraction and coordinate conversion, but it is evolving, and it is planned to extend it with a series of environmental models that the IST has developed in the past, such as dam-break simulation, landslide run-out estimation and floods due to landslide impact in artificial basins. [1] Neteler M., Mitasova H., Open Source GIS: A GRASS GIS Approach. 3rd Ed., 406 pp, Springer, New York, 2008. [2] Fenoy G., Bozon N., Raghavan V., ZOO Project: The Open WPS Platform. Proceedings of the 1st International Workshop on Pervasive Web Mapping, Geoprocessing and Services (WebMGS), Como, http://www.isprs.org/proceedings/XXXVIII/4-W13/ID_32.pdf, 26-27 August 2010. [3] Giannecchini S., Aime A., GeoServer, il server open source per la gestione interoperabile dei dati geospaziali. Atti 15a Conferenza Nazionale ASITA, Reggia di Colorno, 15-18 November 2011. [4] Perez A.S., OpenLayers Cookbook. Packt Publishing, 2012. ISBN 1849517843. [5] OGC, OpenGIS Web Map Server Implementation Specification, http://www.opengeospatial.org/standards/wms, 2006. [6] OGC, OGC WCS 2.0 Interface Standard - Core, http://portal.opengeospatial.org/files/?artifact_id=41437, 2010. [7] OGC, OpenGIS Web Processing Service, http://portal.opengeospatial.org/files/?artifact_id=24151, 2007.
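A WPS server such as the one behind this portal is typically queried over plain HTTP; a minimal sketch of building a standard WPS 1.0.0 GetCapabilities request follows (the endpoint path below is hypothetical, used only to show the query shape defined by the OGC WPS specification):

```python
from urllib.parse import urlencode

# Build a standard OGC WPS 1.0.0 GetCapabilities request.
# The endpoint is illustrative; the query parameters are defined by the spec.
endpoint = "http://geoservice.ist.supsi.ch/helidem/wps"
params = {
    "service": "WPS",
    "request": "GetCapabilities",
    "version": "1.0.0",
}
url = endpoint + "?" + urlencode(params)
print(url)  # http://geoservice.ist.supsi.ch/helidem/wps?service=WPS&request=GetCapabilities&version=1.0.0
```

The server's XML response lists the offered processes (profiling, watershed delineation, etc.), which clients then invoke with DescribeProcess and Execute requests in the same style.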
2011-01-01
Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics to open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914
The 2017 Bioinformatics Open Source Conference (BOSC)
Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather
2017-01-01
The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year’s theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest. PMID:29118973
The Efficient Utilization of Open Source Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baty, Samuel R.
These are a set of slides on the efficient utilization of open source information. Open source information consists of a vast body of material from a variety of sources. Not only does the quantity of open source information pose a problem; its variable quality can also hinder analysis. To show this, two case studies are examined, Iran and North Korea, in order to see how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information carries no guarantee of accuracy. Open source information can provide key insights either directly or indirectly: looking at supporting factors (flow of scientists, products and waste from mines, government budgets, etc.) or direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows a more complete picture to be formed. Overlapping sources allow more precise bounds on times, weights, temperatures, yields or other issues of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual, but rather requires the utilization of a wide range of skill sets held by a team of people.
The 2015 Bioinformatics Open Source Conference (BOSC 2015).
Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica
2016-02-01
The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.
Open Source, Openness, and Higher Education
ERIC Educational Resources Information Center
Wiley, David
2006-01-01
In this article David Wiley provides an overview of how the general expansion of open source software has affected the world of education in particular. In doing so, Wiley not only addresses the development of open source software applications for teachers and administrators, he also discusses how the fundamental philosophy of the open source…
Meta-image navigation augmenters for unmanned aircraft systems (MINA for UAS)
NASA Astrophysics Data System (ADS)
Çelik, Koray; Somani, Arun K.; Schnaufer, Bernard; Hwang, Patrick Y.; McGraw, Gary A.; Nadke, Jeremy
2013-05-01
GPS is a critical sensor for Unmanned Aircraft Systems (UASs) due to its accuracy, global coverage and small hardware footprint, but is subject to denial due to signal blockage or RF interference. When GPS is unavailable, position, velocity and attitude (PVA) performance from other inertial and air data sensors is not sufficient, especially for small UASs. Recently, image-based navigation algorithms have been developed to address GPS outages for UASs, since most of these platforms already include a camera as standard equipage. Performing absolute navigation with real-time aerial images requires georeferenced data, either images or landmarks, as a reference. Georeferenced imagery is readily available today, but requires a large amount of storage, whereas collections of discrete landmarks are compact but must be generated by pre-processing. An alternative, compact source of georeferenced data having large coverage area is open source vector maps from which meta-objects can be extracted for matching against real-time acquired imagery. We have developed a novel, automated approach called MINA (Meta Image Navigation Augmenters), which is a synergy of machine-vision and machine-learning algorithms for map aided navigation. As opposed to existing image map matching algorithms, MINA utilizes publicly available open-source geo-referenced vector map data, such as OpenStreetMap, in conjunction with real-time optical imagery from an on-board, monocular camera to augment the UAS navigation computer when GPS is not available. The MINA approach has been experimentally validated with both actual flight data and flight simulation data and results are presented in the paper.
The Emergence of Open-Source Software in North America
ERIC Educational Resources Information Center
Pan, Guohua; Bonk, Curtis J.
2007-01-01
Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…
Singlet-paired coupled cluster theory for open shells
NASA Astrophysics Data System (ADS)
Gomez, John A.; Henderson, Thomas M.; Scuseria, Gustavo E.
2016-06-01
Restricted single-reference coupled cluster theory truncated to single and double excitations accurately describes weakly correlated systems, but often breaks down in the presence of static or strong correlation. Good coupled cluster energies in the presence of degeneracies can be obtained by using a symmetry-broken reference, such as unrestricted Hartree-Fock, but at the cost of good quantum numbers. A large body of work has shown that modifying the coupled cluster ansatz allows for the treatment of strong correlation within a single-reference, symmetry-adapted framework. The recently introduced singlet-paired coupled cluster doubles (CCD0) method is one such model, which recovers correct behavior for strong correlation without requiring symmetry breaking in the reference. Here, we extend singlet-paired coupled cluster for application to open shells via restricted open-shell singlet-paired coupled cluster singles and doubles (ROCCSD0). The ROCCSD0 approach retains the benefits of standard coupled cluster theory and recovers correct behavior for strongly correlated, open-shell systems using a spin-preserving ROHF reference.
Attempt at forming an expression of Manning's 'n' for Open Channel Flow
NASA Astrophysics Data System (ADS)
De, S. K.; Khosa, R.
2016-12-01
Study of open channel hydraulics finds application in diverse areas such as the design of river banks, bridges and other structures. The principal hydraulic elements used in these applications include surface water profiles and flow velocity, both of which are significantly influenced by fluid properties, channel properties and boundary conditions. In current practice, friction influences are routinely captured in a single factor, commonly referred to as the roughness coefficient; among the most widely used flow equations employing this coefficient is Manning's equation. At present, the Manning's roughness coefficient is selected from existing tabulated data and accompanying pictures, so the choice of coefficient is inevitably subjective and a source of uncertainty in the application of transport models. In this study, an attempt has been made to develop a more rational and computationally feasible expression for Manning's 'n' that partially or fully eliminates the need to refer to a table when performing a computation. The development of the expression uses the basic parameters of the flow and also considers influences such as vegetation and form roughness.
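For reference, Manning's equation (in SI units) relates the mean flow velocity to the roughness coefficient that the study seeks to express analytically:

```latex
% Manning's equation (SI units): v = mean velocity [m/s], n = roughness
% coefficient, R = hydraulic radius [m], S = slope of the energy grade line.
v = \frac{1}{n}\, R^{2/3}\, S^{1/2}
```

Since v depends inversely on n, even modest subjectivity in choosing n from tables propagates directly into discharge estimates, which motivates a computable expression for n.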
PRGdb: a bioinformatics platform for plant resistance gene analysis
Sanseverino, Walter; Roma, Guglielmo; De Simone, Marco; Faino, Luigi; Melito, Sara; Stupka, Elia; Frusciante, Luigi; Ercolano, Maria Raffaella
2010-01-01
PRGdb is a web-accessible open-source database (http://www.prgdb.org) that represents the first bioinformatic resource providing a comprehensive overview of resistance genes (R-genes) in plants. PRGdb holds more than 16 000 known and putative R-genes belonging to 192 plant species challenged by 115 different pathogens and linked with useful biological information. The complete database includes a set of 73 manually curated reference R-genes, 6308 putative R-genes collected from NCBI and 10 463 computationally predicted putative R-genes. Thanks to a user-friendly interface, data can be examined using different query tools. A custom prediction pipeline called Disease Resistance Analysis and Gene Orthology (DRAGO), based on reference R-gene sequence data, was developed to search for plant resistance genes in public datasets such as UniGene and GenBank. New putative R-gene classes containing unknown domain combinations were discovered and characterized. The development of the PRG platform represents an important starting point for various experimental tasks. The inferred cross-link between genomic and phenotypic information gives access to a large body of information to answer several biological questions. The database structure also permits easy integration with other data types and opens up prospects for future implementations. PMID:19906694
Packet Traffic Dynamics Near Onset of Congestion in Data Communication Network Model
NASA Astrophysics Data System (ADS)
Lawniczak, A. T.; Tang, X.
2006-05-01
The dominant technology of data communication networks is the Packet Switching Network (PSN). It is a complex technology organized in hierarchical layers according to the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) Reference Model. The Network Layer of the ISO OSI Reference Model is responsible for delivering packets from their sources to their destinations and for dealing with congestion if it arises in the network. Thus, we focus on this layer and present an abstraction of the Network Layer of the ISO OSI Reference Model. Using this abstraction we investigate how the onset of traffic congestion is affected, for various routing algorithms, by changes in network connection topology. We study how aggregate measures of network performance depend on network connection topology and routing. We explore packet traffic spatio-temporal dynamics near the phase transition point from free flow to congestion for various network connection topologies and routing algorithms. We consider static and adaptive routing. We present selected simulation results.
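The free-flow-to-congestion phase transition mentioned above can be illustrated with a minimal single-queue sketch (an illustration of the phase-transition idea, not the paper's PSN model): when the packet injection rate exceeds the service rate, the time-averaged queue length diverges instead of settling near a small value.

```python
import random

def mean_queue_length(arrival_prob, service_prob=0.5, steps=50000, seed=1):
    """Discrete-time single queue: a packet arrives with probability
    arrival_prob per step and the head packet departs with probability
    service_prob. Returns the time-averaged queue length."""
    rng = random.Random(seed)
    queue = 0
    total = 0
    for _ in range(steps):
        if rng.random() < arrival_prob:
            queue += 1
        if queue > 0 and rng.random() < service_prob:
            queue -= 1
        total += queue
    return total / steps

# Below the critical load (arrival rate < service rate) the average queue
# stays small; above it the backlog grows roughly linearly with time.
print(mean_queue_length(0.25) < mean_queue_length(0.75))  # True
```

In a full PSN model each router carries such a queue and the routing algorithm couples them, but the qualitative transition at the critical load is the same phenomenon studied in the abstract.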
Medición de posiciones astrométricas con CCD en la zona de Rup 21
NASA Astrophysics Data System (ADS)
Bustos Fierro, I. H.; Calderón, J. H.
We demonstrate the use of the block adjustment method for the measurement of astrometric positions from a mosaic of sixteen CCD images with partial overlap, taken with the Jorge Sahade Telescope at CASLEO. The observations cover an area of 25' x 25' around the open cluster Rup 21. The ACT Reference Catalog was the source of reference positions. The internal error of the measured positions is analyzed, and the external error is estimated by comparison with the USNO-A catalog. This comparison shows that direct CCD images taken with a focal reducer can be distorted by severe field curvature. The distortion presumably introduced by the optics is eliminated by suitable corrections to the stellar positions measured on every frame, but a new systematic effect on the scale of the entire field is observed, which could be due to the distribution of the reference stars.
Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on
2011-01-01
Background The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to development of many useful resources freely available to the chemistry community. PMID:21999342
Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks
NASA Astrophysics Data System (ADS)
Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.
2010-12-01
Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked internally and externally on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly hard to deliver relevant metadata and data-processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated from the data at the storage level as well as in the search and retrieval interfaces available to a user. Critical archival metadata, such as audit trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use the Fedora Commons framework and its digital object abstraction as the repository, the Drupal CMS as the user interface, and the Islandora module as the connector from Drupal to the Fedora repository. With the digital object model, descriptive and provenance metadata can be associated with data content in a formal manner, as can external references and other auxiliary information. Changes to an object are formally audited, and digital contents are versioned and have checksums computed automatically. Further, relationships among objects are formally expressed as RDF triples. Data replication, recovery, and metadata export are supported with standard protocols, such as OAI-PMH.
We provide a tentative comparative analysis of the chosen software stack with the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA’s ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
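The digital object abstraction described above (versioned content, automatic checksums, and RDF-expressed relationships) can be sketched in a few lines. The class and method names below are hypothetical, for illustration only; this is not the Fedora Commons data model or API.

```python
import hashlib

class DigitalObject:
    """Toy digital object: versioned content with automatic SHA-256
    checksums and RDF-style (subject, predicate, object) triples.
    Hypothetical sketch; not the Fedora Commons API."""

    def __init__(self, pid):
        self.pid = pid
        self.versions = []   # list of (content, checksum) pairs
        self.triples = []    # formal relationships to other objects

    def add_version(self, content: bytes):
        digest = hashlib.sha256(content).hexdigest()
        self.versions.append((content, digest))
        return digest

    def relate(self, predicate, other_pid):
        self.triples.append((self.pid, predicate, other_pid))

    def verify(self):
        # Integrity check: recompute every stored checksum.
        return all(hashlib.sha256(c).hexdigest() == d
                   for c, d in self.versions)

obj = DigitalObject("dataset:42")
obj.add_version(b"temperature,co2\n281.0,390\n")
obj.relate("isDescribedBy", "metadata:42")
```

Keeping the checksum alongside each version is what makes audit trails cheap: `verify()` detects any bit-level corruption without consulting external records.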
MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG
Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong
2017-01-01
The reference electrode standardization technique (REST) has been increasingly acknowledged and applied in the electroencephalography/event-related potentials (EEG/ERP) community in recent years as a re-referencing technique that transforms actual multi-channel recordings into approximately zero-reference ones. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version that is more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced ones. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, where detailed information including publications, comments and documents on REST can also be found. An example of usage is given with comparative results of REST and the average reference. We hope these user-friendly REST toolboxes will make the relatively novel REST technique easier to apply, especially in various EEG studies. PMID:29163006
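The average reference that the toolboxes compare REST against is simple to compute; the sketch below (plain Python, illustrative only) re-references a channels-by-samples recording to the common average. REST itself additionally requires a lead-field (head) model to estimate the approximately zero (infinity) reference, so it is not reproduced here.

```python
def average_reference(eeg):
    # eeg: list of channel traces, each a list of samples.
    # Subtract the across-channel mean at every time point, so each
    # sample is expressed relative to the common average reference.
    n_samples = len(eeg[0])
    means = [sum(ch[t] for ch in eeg) / len(eeg) for t in range(n_samples)]
    return [[ch[t] - means[t] for t in range(n_samples)] for ch in eeg]

# Three channels, two samples: after re-referencing, the values at each
# time point sum to zero across channels.
rereferenced = average_reference([[1.0, 2.0], [3.0, 4.0], [5.0, 9.0]])
```

The zero-sum property is exactly what makes the average reference a convenient baseline when evaluating re-referencing schemes such as REST.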
The Complex Nature of Opening Reference Questions
ERIC Educational Resources Information Center
Eichman, Thomas Lee
1978-01-01
The purpose of this study is to review communication theory models and some of the literature from philosophy and linguistics in an attempt to supply a rational explanation for the generality of opening reference questions. (Author)
The Open Source Teaching Project (OSTP): Research Note.
ERIC Educational Resources Information Center
Hirst, Tony
The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…
Free for All: Open Source Software
ERIC Educational Resources Information Center
Schneider, Karen
2008-01-01
Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…
Disintegration of the Aged Open Cluster Berkeley 17
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Souradeep; Vaidya, Kaushar; Mishra, Ishan
We present an analysis of the morphological shape of Berkeley 17, the oldest known open cluster (∼10 Gyr), using probabilistic star counting of Pan-STARRS point sources, and confirm its core-tail shape, plus an antitail, previously detected with the 2MASS data. The stellar population, as diagnosed by the color–magnitude diagram and theoretical isochrones, shows many massive members in the cluster's core, whereas there is a paucity of such members in both of the tails. This demonstrates mass segregation in this aged star cluster, with the low-mass members being stripped away from the system. It has been claimed that Berkeley 17 is associated with an excessive number of blue straggler candidates. A comparison with nearby reference fields indicates that about half of these may be field contamination.
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.
Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness
ERIC Educational Resources Information Center
Committee for Economic Development, 2006
2006-01-01
Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…
The 2015 Bioinformatics Open Source Conference (BOSC 2015)
Harris, Nomi L.; Cock, Peter J. A.; Lapp, Hilmar
2016-01-01
The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included “Data Science;” “Standards and Interoperability;” “Open Science and Reproducibility;” “Translational Bioinformatics;” “Visualization;” and “Bioinformatics Open Source Project Updates”. In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled “Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community,” that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653
Forbes, Jessica L.; Kim, Regina E. Y.; Paulsen, Jane S.; Johnson, Hans J.
2016-01-01
The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because of the labor-intensive costs. Further, as the number of regions of interest increases, these manually created atlases often contain many small inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce the LabelAtlasEditor tool, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details for the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%. PMID:27536233
Microbe-ID: an open source toolbox for microbial genotyping and species identification.
Tabima, Javier F; Everhart, Sydney E; Larsen, Meredith M; Weisberg, Alexandra J; Kamvar, Zhian N; Tancos, Matthew A; Smart, Christine D; Chang, Jeff H; Grünwald, Niklaus J
2016-01-01
Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allows for customizing a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type, and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on GitHub and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID.
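The best-hit idea behind Sequence-ID can be sketched with a naive identity score standing in for BLAST. The `identity` and `best_hit` helpers and the two-entry database below are hypothetical, for illustration only; real use queries BLAST against a curated locus database.

```python
def identity(a, b):
    # Fraction of positions that match between two sequences
    # (a crude stand-in for a BLAST alignment score).
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

def best_hit(query, reference_db):
    # Return the reference entry most similar to the query.
    return max(reference_db, key=lambda name: identity(query, reference_db[name]))

# Hypothetical reference database of marker sequences.
db = {"P. infestans": "ACGTACGTGG", "P. ramorum": "ACGTTTTTGG"}
match = best_hit("ACGTACGAGG", db)  # "P. infestans"
```

The structure mirrors the tool's design: a curated reference database plus a similarity search, with the similarity engine (here trivial, there BLAST) swappable behind the same interface.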
Zazen meditation and no-task resting EEG compared with LORETA intracortical source localization.
Faber, Pascal L; Lehmann, Dietrich; Gianotti, Lorena R R; Milz, Patricia; Pascual-Marqui, Roberto D; Held, Marlene; Kochi, Kieko
2015-02-01
Meditation is a self-induced and willfully initiated practice that alters the state of consciousness. The meditation practice of Zazen, like many other meditation practices, aims at disregarding intrusive thoughts while controlling body posture. It is an open monitoring meditation characterized by detached moment-to-moment awareness and reduced conceptual thinking and self-reference. Which brain areas differ in electric activity during Zazen compared to task-free resting? Since scalp electroencephalography (EEG) waveforms are reference-dependent, conclusions about the localization of active brain areas are ambiguous. Computing intracerebral source models from the scalp EEG data solves this problem. In the present study, we applied source modeling using low resolution brain electromagnetic tomography (LORETA) to 58-channel scalp EEG data recorded from 15 experienced Zen meditators during Zazen and no-task resting. Zazen compared to no-task resting showed increased alpha-1 and alpha-2 frequency activity in an exclusively right-lateralized cluster extending from prefrontal areas including the insula to parts of the somatosensory and motor cortices and temporal areas. Zazen also showed decreased alpha and beta-2 activity in the left angular gyrus and decreased beta-1 and beta-2 activity in a large bilateral posterior cluster comprising the visual cortex, the posterior cingulate cortex and the parietal cortex. The results include parts of the default mode network and suggest enhanced automatic memory and emotion processing, reduced conceptual thinking and self-reference on a less judgmental, i.e., more detached moment-to-moment basis during Zazen compared to no-task resting.
Salamone, Francesco; Belussi, Lorenzo; Danza, Ludovico; Ghellere, Matteo; Meroni, Italo
2015-01-01
The Indoor Environmental Quality (IEQ) refers to the quality of the environment in relation to the health and well-being of the occupants. It is a holistic concept, which considers several categories, each related to a specific environmental parameter. This article describes a low-cost and open-source hardware architecture able to detect the indoor variables necessary for the IEQ calculation as an alternative to the traditional hardware used for this purpose. The system consists of some sensors and an Arduino board. One of the key strengths of Arduino is the possibility it affords of loading the script into the board’s memory and letting it run without interfacing with computers, thus granting complete independence, portability and accuracy. Recent works have demonstrated that the cost of scientific equipment can be reduced by applying open-source principles to their design using a combination of the Arduino platform and a 3D printer. The evolution of the 3D printer has provided a new means of open design capable of accelerating self-directed development. The proposed nano Environmental Monitoring System (nEMoS) instrument is shown to have good reliability and it provides the foundation for a more critical approach to the use of professional sensors as well as for conceiving new scenarios and potential applications. PMID:26053749
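The holistic IEQ concept, one score per environmental category combined into an overall index, can be sketched as a weighted average. The category names and equal default weights below are illustrative assumptions, not the formula implemented in nEMoS.

```python
def ieq_index(scores, weights=None):
    # Overall IEQ as a weighted average of per-category comfort
    # scores on a 0-100 scale; equal weights by default.
    if weights is None:
        weights = {k: 1.0 for k in scores}
    return sum(scores[k] * weights[k] for k in scores) / sum(weights.values())

# Hypothetical per-category comfort scores from the sensor readings:
scores = {"thermal": 80, "air_quality": 60, "acoustic": 70, "visual": 90}
overall = ieq_index(scores)  # 75.0
```

Passing an explicit `weights` dict lets occupants or standards bodies emphasize one category (say, air quality) over the others without changing the aggregation code.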
NASA Astrophysics Data System (ADS)
Mentis, Dimitrios; Howells, Mark; Rogner, Holger; Korkovelos, Alexandros; Arderne, Christopher; Siyal, Shahid; Zepeda, Eduardo; Taliotis, Constantinos; Bazilian, Morgan; de Roo, Ad; Tanvez, Yann; Oudalov, Alexandre; Scholtz, Ernst
2017-04-01
In September 2015, the United Nations General Assembly adopted Agenda 2030, which comprises a set of 17 Sustainable Development Goals (SDGs) defined by 169 targets. "Ensuring access to affordable, reliable, sustainable and modern energy for all by 2030" is the seventh goal (SDG7). While access to energy refers to more than electricity, the latter is the central focus of this work. According to the World Bank's 2015 Global Tracking Framework, roughly 15% of world population (or 1.1 billion people) lack access to electricity, and many more rely on poor quality electricity services. The majority of those without access (87%) reside in rural areas. This paper presents results of a Geographic Information Systems (GIS) approach coupled with open access data and linked to the Electricity Model Base for Africa (TEMBA), a model that represents each continental African country's electricity supply system. We present least-cost electrification strategies on a country-by-country basis for Sub-Saharan Africa. The electrification options include grid extension, mini-grid and stand-alone systems for rural, peri-urban, and urban contexts across the economy. At low levels of electricity demand there is a strong penetration of standalone technologies. However, higher electricity demand levels move the favourable electrification option from stand-alone systems to mini grid and to grid extensions.
Ponomarev, Valery A; Mueller, Andreas; Candrian, Gian; Grin-Yatsenko, Vera A; Kropotov, Juri D
2014-01-01
To investigate the performance of spectral analysis of resting EEG, current source density (CSD) and group independent components (gIC) in diagnosing ADHD in adults. Power spectra of resting EEG, CSD and gIC (19 channels, linked-ears reference, eyes open/closed) from 96 ADHD and 376 healthy adults were compared between eyes-open and eyes-closed conditions, and between groups of subjects. The pattern of differences in gIC and CSD spectral power between conditions was approximately similar, whereas for EEG it was more widely spatially distributed. The effect size (Cohen's d) of differences in gIC and CSD spectral power between groups of subjects was considerably greater than in the case of EEG. A significant condition-dependent reduction of gIC and CSD spectral power was found in ADHD patients. Reduced power in a wide frequency range in the fronto-central areas is a common phenomenon regardless of whether the eyes were open or closed. Spectral power of local EEG activity isolated by gICA or CSD in the fronto-central areas may be a suitable marker for discriminating ADHD patients from healthy adults. Spectral analysis of gIC and CSD provides better sensitivity for discriminating ADHD and healthy adults. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
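The effect-size measure used above, Cohen's d with a pooled standard deviation, is straightforward to compute; a minimal sketch with made-up sample values:

```python
from math import sqrt

def cohens_d(group_a, group_b):
    # Cohen's d: difference of group means divided by the pooled
    # (sample) standard deviation.
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled = sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

# Made-up spectral-power samples for two groups:
d = cohens_d([10, 12, 14], [13, 15, 17])  # -1.5
```

Unlike a p-value, d is independent of sample size, which is why it is the natural yardstick for comparing how well EEG, CSD and gIC spectra separate the two groups.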
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gabriele, Fatuzzo; Michele, Mangiameli, E-mail: amichele.mangiameli@dica.unict.it; Giuseppe, Mussumeci
Laser scanning is a technology that makes it possible to survey geometric objects quickly, with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be surveyed, the radiation is reflected. The purpose is to build a three-dimensional digital model that reconstructs the object and supports studies regarding design, restoration and/or conservation. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates with high density and accuracy, together with radiometric RGB values. This set of measured points is called a "point cloud" and allows the reconstruction of the Digital Surface Model. While post-processing is usually performed with closed-source software, whose copyright restricts free use, free and open-source software can considerably increase performance: it can be used freely and makes it possible to inspect and even customize the source code. The work started at the Faculty of Engineering in Catania aims at evaluating a capable free and open-source tool, MeshLab (an Italian software package for data processing), against a reference closed-source package for data processing, RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-11
While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open-source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to make the multi-step process of modeling a PV system more transparent and to provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
D-GENIES: dot plot large genomes in an interactive, efficient and simple way.
Cabanettes, Floréal; Klopp, Christophe
2018-01-01
Dot plots are widely used to quickly compare sequence sets. They provide a synthetic similarity overview, highlighting repetitions, breaks and inversions. Different tools have been developed to easily generate genomic alignment dot plots, but they are often limited in input sequence size. D-GENIES is a standalone and web application that performs large genome alignments using the minimap2 software package and generates interactive dot plots. It enables users to sort query sequences along the reference, zoom in on the plot and download several image, alignment or sequence files. D-GENIES is an easy-to-install, open-source software package (GPL) developed in Python and JavaScript. The source code is available at https://github.com/genotoul-bioinfo/dgenies and it can be tested at http://dgenies.toulouse.inra.fr/.
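The points a dot plot displays are just coordinates of matching subsequences; a minimal exact k-mer sketch is below. D-GENIES itself derives its dots from minimap2 alignments rather than raw k-mer matches, so this is an illustration of the plot, not of the tool's pipeline.

```python
def dotplot(seq_a, seq_b, k=3):
    # Index every k-mer of seq_a, then emit an (i, j) point for each
    # position j in seq_b whose k-mer also occurs at position i in seq_a.
    index = {}
    for i in range(len(seq_a) - k + 1):
        index.setdefault(seq_a[i:i + k], []).append(i)
    return [(i, j)
            for j in range(len(seq_b) - k + 1)
            for i in index.get(seq_b[j:j + k], [])]

# Diagonal runs of points indicate shared segments; repeats of the
# same k-mer show up as parallel diagonals.
points = dotplot("ACGTACGT", "ACGTTACG")
```

Rendering `points` as a scatter plot (query position on one axis, reference position on the other) gives the familiar picture in which breaks and inversions appear as offsets and anti-diagonals.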
GenomeDiagram: a python package for the visualization of large-scale genomic data.
Pritchard, Leighton; White, Jennifer A; Birch, Paul R J; Toth, Ian K
2006-03-01
We present GenomeDiagram, a flexible, open-source Python module for the visualization of large-scale genomic, comparative genomic and other data with reference to a single chromosome or other biological sequence. GenomeDiagram may be used to generate publication-quality vector graphics, rastered images and in-line streamed graphics for webpages. The package integrates with datatypes from the BioPython project, and is available for Windows, Linux and Mac OS X systems. GenomeDiagram is freely available as source code (under GNU Public License) at http://bioinf.scri.ac.uk/lp/programs.html, and requires Python 2.3 or higher, and recent versions of the ReportLab and BioPython packages. A user manual, example code and images are available at http://bioinf.scri.ac.uk/lp/programs.html.
The 2016 Bioinformatics Open Source Conference (BOSC).
Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather
2016-01-01
Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Wholesale Electric Quadrant, which are incorporated herein by reference: (1) Open Access Same-Time....13, 001-1.0, 001-9.7, 001-14.1.3, and 001-15.1.2); (2) Open Access Same-Time Information Systems... minor corrections applied May 29, 2009 and September 8, 2009); (3) Open Access Same-Time Information...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Wholesale Electric Quadrant, which are incorporated herein by reference: (1) Open Access Same-Time....13, 001-1.0, 001-9.7, 001-14.1.3, and 001-15.1.2); (2) Open Access Same-Time Information Systems... minor corrections applied May 29, 2009 and September 8, 2009); (3) Open Access Same-Time Information...
ERIC Educational Resources Information Center
Villano, Matt
2006-01-01
This article presents an interview with Jim Hirsch, an associate superintendent for technology at Plano Independent School District in Plano, Texas. Hirsch serves as a liaison for the open technologies committee of the Consortium for School Networking. In this interview, he shares his opinion on the significance of open source in K-12.
EMISSIONS OF ORGANIC AIR TOXICS FROM OPEN ...
A detailed literature search was performed to collect and collate available data reporting emissions of toxic organic substances into the air from open burning sources. Availability of data varied according to the source and the class of air toxics of interest. Volatile organic compound (VOC) and polycyclic aromatic hydrocarbon (PAH) data were available for many of the sources. Data on semivolatile organic compounds (SVOCs) that are not PAHs were available for several sources. Carbonyl and polychlorinated dibenzo-p-dioxin and polychlorinated dibenzofuran (PCDD/F) data were available for only a few sources. There were several sources for which no emissions data were available at all. Several observations were made, including: 1) Biomass open burning sources typically emitted fewer VOCs than open burning sources with anthropogenic fuels on a mass emitted per mass burned basis, particularly where polymers were concerned; 2) Biomass open burning sources typically emitted fewer SVOCs and PAHs than anthropogenic sources on a mass emitted per mass burned basis. Burning pools of crude oil and diesel fuel produced significant amounts of PAHs relative to other types of open burning. PAH emissions were highest when combustion of polymers was taking place; and 3) Based on very limited data, biomass open burning sources typically produced higher levels of carbonyls than anthropogenic sources on a mass emitted per mass burned basis, probably due to oxygenated structures…
Source-sink-storage relationships of conifers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luxmoore, R.J.; Oren, R.; Sheriff, D.W.
1995-07-01
Irradiance, air temperature, saturation vapor pressure deficit, and soil temperature vary in association with Earth's daily rotation, inducing significant hourly changes in the rates of plant physiological processes. These processes include carbon fixation in photosynthesis, sucrose translocation, and carbon utilization in growth, storage, and respiration. The sensitivity of these physiological processes to environmental factors such as temperature, soil water availability, and nutrient supply reveals differences that must be viewed as an interactive whole in order to comprehend whole-plant responses to the environment. Integrative frameworks for relationships between plant physiological processes are needed to provide syntheses of plant growth and development. Source-sink-storage relationships, addressed in this chapter, provide one framework for synthesis of whole-plant responses to external environmental variables. To address this issue, some examples of carbon assimilation and utilization responses of five conifer species to environmental factors from a range of field environments are first summarized. Next, the interactions between sources, sinks, and storages of carbon are examined at the leaf and tree scales, and finally, the review evaluates the proposition that processes involved with carbon utilization (sink activity) are more sensitive to the supply of water and nutrients (particularly nitrogen) than are the processes of carbon gain (source activity) and carbon storage. The terms "sink" and "source" refer to carbon utilization and carbon gain, respectively. The relative roles of stored carbon reserves and of current photosynthate in meeting sink demand are addressed. Discussions focus on source-sink-storage relationships within the diurnal, wetting-drying, and annual cycles of conifer growth and development, and some discussion of life cycle aspects is also presented.
Open-Source 3D-Printable Optics Equipment
Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.
2013-01-01
Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science and expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs in an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform to control optical experimental apparatuses is illustrated. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods. PMID:23544104
Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...
Reengineering Workflow for Curation of DICOM Datasets.
Bennett, William; Smith, Kirk; Jarosz, Quasar; Nolan, Tracy; Bosch, Walter
2018-06-15
Reusable, publicly available data is a pillar of open science and rapid advancement of cancer imaging research. Sharing data from completed research studies not only saves research dollars required to collect data, but also helps ensure that studies are both replicable and reproducible. The Cancer Imaging Archive (TCIA) is a global shared repository for imaging data related to cancer. Ensuring the consistency, scientific utility, and anonymity of data stored in TCIA is of utmost importance. As the rate of submission to TCIA has been increasing, both in volume and complexity of DICOM objects stored, the process of curation of collections has become a bottleneck in acquisition of data. In order to increase the rate of curation of image sets, improve the quality of the curation, and better track the provenance of changes made to submitted DICOM image sets, a custom set of tools was developed, using novel methods for the analysis of DICOM data sets. These tools are written in the programming language perl, use the open-source database PostgreSQL, make use of the perl DICOM routines in the open-source package Posda, and incorporate DICOM diagnostic tools from other open-source packages, such as dicom3tools. These tools are referred to as the "Posda Tools." The Posda Tools are open source and available via git at https://github.com/UAMS-DBMI/PosdaTools.
In this paper, we briefly describe the Posda Tools and discuss the novel methods they employ to facilitate rapid analysis of DICOM data, including the following: (1) a database schema that is more permissive, and differently normalized, than traditional DICOM databases; (2) automatic bulk integrity checks; (3) bulk revision of DICOM datasets, either through a web-based interface or via command-line perl scripts; (4) tracking of all such edits in a revision tracker, with the ability to roll them back; (5) a UI for inspecting the results of such edits, to verify that they are what was intended; (6) identification of DICOM Studies, Series, and SOP instances by "nicknames" that are persistent and have well-defined scope, making reported DICOM errors easier to manage; and (7) rapid identification of potential duplicate DICOM datasets by their pixel data, which can be used, e.g., to flag submitted subjects that may relate to the same individual, without identifying the individual.
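The pixel-data duplicate check described in point (7) can be sketched in a few lines. The sketch below is a hypothetical Python illustration (the actual Posda Tools are written in perl against PostgreSQL): it fingerprints only the pixel payload, so copies of the same image whose headers were edited during de-identification still collide.

```python
import hashlib
from collections import defaultdict

def pixel_fingerprint(pixel_bytes: bytes) -> str:
    # Hash only the pixel payload; header edits made during
    # de-identification do not change the fingerprint.
    return hashlib.sha256(pixel_bytes).hexdigest()

def find_duplicates(datasets):
    # datasets: iterable of (sop_instance_uid, pixel_bytes) pairs.
    # Returns {fingerprint: [uids, ...]} for fingerprints seen more than once.
    by_hash = defaultdict(list)
    for uid, pixels in datasets:
        by_hash[pixel_fingerprint(pixels)].append(uid)
    return {h: uids for h, uids in by_hash.items() if len(uids) > 1}
```

Grouping colliding SOP instances by fingerprint is enough to flag candidate duplicates for human review without ever exposing patient identity.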
The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software
Ackerman, Michael J.; Yoo, Terry S.
2003-01-01
From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278
Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments.
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2016-01-05
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.
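As an illustration of the kind of hierarchy the format defines, the following Python sketch mirrors a few Photon-HDF5 groups with plain dicts and checks for required fields. This is a simplified, hypothetical mock-up: real files are HDF5 and should be created with h5py or the phconvert reference library, and the actual specification defines more groups and fields than listed here.

```python
# Simplified mirror of (part of) the Photon-HDF5 layout, using plain dicts.
# The required-field list below is an illustrative subset, not the full spec.
REQUIRED = {
    "photon_data": {"timestamps", "timestamps_specs"},
    "setup": set(),
    "identity": set(),
}

def validate(record: dict) -> list:
    # Return the list of missing groups/fields; an empty list means the
    # sketch-level requirements are satisfied.
    missing = []
    for group, fields in REQUIRED.items():
        if group not in record:
            missing.append(group)
            continue
        missing.extend(f"{group}/{f}" for f in fields if f not in record[group])
    return missing

record = {
    "photon_data": {
        "timestamps": [12, 847, 1630],                      # raw photon arrival times (clock ticks)
        "timestamps_specs": {"timestamps_unit": 12.5e-9},   # seconds per clock tick
        "detectors": [0, 1, 0],                             # channel number per photon
    },
    "setup": {"num_spots": 1},
    "identity": {"author": "example"},
}
```

Validating submitted records against a declared schema in this spirit is what makes the format self-describing and long-term readable.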
NASA Astrophysics Data System (ADS)
De Vecchi, Daniele; Dell'Acqua, Fabio
2016-04-01
The EU FP7 MARSITE project aims at assessing the "state of the art" of seismic risk evaluation and management at European level, as a starting point to move a "step forward" towards new concepts of risk mitigation and management by long-term monitoring activities carried out both on land and at sea. Spaceborne Earth Observation (EO) is one of the means through which MARSITE is accomplishing this commitment, whose importance is growing as a consequence of the operational unfolding of the Copernicus initiative. Sentinel-2 data, with its open-data policy, represents an unprecedented opportunity to access global spaceborne multispectral data for various purposes including risk monitoring. In the framework of the EU FP7 projects MARSITE, RASOR and SENSUM, our group has developed a suite of geospatial software tools to automatically extract risk-related features from EO data, especially on the exposure and vulnerability side of the "risk equation" [1]. Examples include the extent of a built-up area or the distribution of building density. These tools are available open-source as QGIS plug-ins [2] and their source code can be freely downloaded from GitHub [3]. A test case on the risk-prone megacity of Istanbul has been set up, and preliminary results will be presented in this paper. The output of the algorithms can be incorporated into a risk modeling process, whose output is very useful to stakeholders and decision makers who intend to assess and mitigate the risk level across the giant urban agglomerate. Keywords - Remote Sensing, Copernicus, Istanbul megacity, seismic risk, multi-risk, exposure, open-source References [1] Harb, M.M.; De Vecchi, D.; Dell'Acqua, F., "Physical Vulnerability Proxies from Remote Sensing: Reviewing, Implementing and Disseminating Selected Techniques," Geoscience and Remote Sensing Magazine, IEEE, vol. 3, no. 1, pp. 20-33, March 2015.
doi: 10.1109/MGRS.2015.2398672 [2] SENSUM QGIS plugin, 2016, available online at: https://plugins.qgis.org/plugins/sensum_eo_tools/ [3] SENSUM QGIS code repository, 2016, available online at: https://github.com/SENSUM-project/sensum_rs_qgis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, Andrew; Haves, Philip; Jegi, Subhash
This paper describes a software system for automatically generating a reference (baseline) building energy model from the proposed (as-designed) building energy model. This system is built using the OpenStudio Software Development Kit (SDK) and is designed to operate on building energy models in the OpenStudio file format.
OpenMx: An Open Source Extended Structural Equation Modeling Framework
ERIC Educational Resources Information Center
Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John
2011-01-01
OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…
a Framework for AN Open Source Geospatial Certification Model
NASA Astrophysics Data System (ADS)
Khan, T. U. R.; Davis, P.; Behr, F.-J.
2016-06-01
The geospatial industry is forecast to see enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have gained increasing significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-American-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of certification. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways.
An online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide. Fifteen interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with in total eleven sub-categories, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be accompanied by proofs of professional career paths and achievements, which need a peer qualification evaluation. After a couple of years a recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support by a group of geospatial scientific institutions to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model is examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.
NASA Astrophysics Data System (ADS)
Kuipers, J.; Ueda, T.; Vermaseren, J. A. M.; Vollinga, J.
2013-05-01
We present version 4.0 of the symbolic manipulation system FORM. The most important new features are manipulation of rational polynomials and the factorization of expressions. Many other new functions and commands are also added; some of them are very general, while others are designed for building specific high-level packages, such as one for Gröbner bases. New is also the checkpoint facility, which allows for periodic backups during long calculations. Finally, FORM 4.0 has become available as open source under the GNU General Public License version 3.
Program summary: Program title: FORM. Catalogue identifier: AEOT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 151 599. No. of bytes in distributed program, including test data, etc.: 1 078 748. Distribution format: tar.gz. Programming language: The FORM language; FORM itself is programmed in a mixture of C and C++. Computer: All. Operating system: UNIX, LINUX, Mac OS, Windows. Classification: 5. Nature of problem: FORM defines a symbolic manipulation language in which the emphasis lies on fast processing of very large formulas. It has been used successfully for many calculations in Quantum Field Theory and mathematics. In speed and size of formulas that can be handled it outperforms other systems typically by an order of magnitude. Special in this version: Version 4.0 contains many new features, most importantly factorization and rational arithmetic; the program has also become open source under the GPL. Solution method: See "Nature of problem", above. Additional comments: NOTE: The code in CPC is for reference. You are encouraged to download the most recent sources from www.nikhef.nl/form/formcvs.php because of frequent bug fixes.
Automated Testcase Generation for Numerical Support Functions in Embedded Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Schnieder, Stefan-Alexander
2014-01-01
We present a tool for the automatic generation of test stimuli for small numerical support functions, e.g., code for trigonometric functions, quaternions, filters, or table lookup. Our tool is based on KLEE to produce a set of test stimuli for full path coverage. We use a method of iterative deepening over abstractions to deal with floating-point values. During actual testing the stimuli exercise the code against a reference implementation. We illustrate our approach with results of experiments with low-level trigonometric functions, interpolation routines, and mathematical support functions from an open source UAS autopilot.
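The exercise-against-a-reference step can be sketched as follows. Here `fast_sin` is a hypothetical device-side approximation standing in for the autopilot's low-level routines, and the stimulus list is hand-picked rather than derived by KLEE's symbolic execution; only the compare-to-reference harness mirrors the described workflow.

```python
import math

def fast_sin(x: float) -> float:
    # Hypothetical "device-side" routine: 9th-order Taylor polynomial
    # after range reduction to [-pi, pi] via IEEE remainder.
    x = math.remainder(x, 2 * math.pi)
    x2 = x * x
    return x * (1 - x2/6 * (1 - x2/20 * (1 - x2/42 * (1 - x2/72))))

def run_stimuli(func, ref, stimuli, tol):
    # Exercise the implementation against the reference and
    # return the inputs whose error exceeds the tolerance.
    return [x for x in stimuli if abs(func(x) - ref(x)) > tol]

# Hand-picked boundary values plus a coarse sweep; a KLEE-based tool
# would instead generate stimuli from the code's own branch structure.
stimuli = [0.0, 1e-8, math.pi / 2, -math.pi / 2, math.pi] + [i / 10 for i in range(-30, 31)]
failures = run_stimuli(fast_sin, math.sin, stimuli, tol=1e-2)
```

Any nonempty `failures` list pinpoints concrete inputs where the embedded routine diverges from the reference implementation.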
Acoustic Propagation in the Labrador Sea
1977-03-08
Significance-test results by shot track:
  Track   Difference D - S (dB)   (dB)   Number of Samples   Significance-Test Results
  1A      3.5                     1.2    19                  Significant
  1B      25                      1.6    146                 Significant
  1C      1.4                     1.9    78                  Probably...
An extrapolated value to 914 m from Fig. 2 is 83 dB/µPa/Hz. A Labrador Sea ambient-noise value of 86 dB/µPa/Hz was reported in Ref. 9 for the open ocean under... shot tracks will be discussed in a generally counterclockwise order starting with 1B. All of the data given are with reference to the 18.3-m-deep source
2009-06-01
search engines are not up to this task, as they have been optimized to catalog information quickly and efficiently for user ease of access while promoting retail commerce at the same time. This thesis presents a performance analysis of a new search engine algorithm designed to help find IED education networks using the Nutch open-source search engine architecture. It reveals which web pages are more important via references from other web pages regardless of domain. In addition, this thesis discusses potential evaluation and monitoring techniques to be used in conjunction
Network traffic behaviour near phase transition point
NASA Astrophysics Data System (ADS)
Lawniczak, A. T.; Tang, X.
2006-03-01
We explore packet traffic dynamics in a data network model near phase transition point from free flow to congestion. The model of data network is an abstraction of the Network Layer of the OSI (Open Systems Interconnect) Reference Model of packet switching networks. The Network Layer is responsible for routing packets across the network from their sources to their destinations and for control of congestion in data networks. Using the model we investigate spatio-temporal packets traffic dynamics near the phase transition point for various network connection topologies, and static and adaptive routing algorithms. We present selected simulation results and analyze them.
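A minimal Python sketch of this kind of model shows the qualitative free-flow/congestion transition: packets are injected at a tunable rate into a ring of store-and-forward nodes, each serving one packet per time step, and mean queue length serves as a congestion proxy. The ring topology, rates, and static shortest-path routing are illustrative choices, not the authors' exact model.

```python
import random
from collections import deque

def simulate(n_nodes, inject_prob, steps, seed=0):
    # Ring network; each node forwards at most one packet per time step.
    # Returns mean queue length per node per step (a congestion proxy).
    rng = random.Random(seed)
    queues = [deque() for _ in range(n_nodes)]
    total = 0
    for _ in range(steps):
        # Injection: each node creates a packet with a random destination.
        for i in range(n_nodes):
            if rng.random() < inject_prob:
                queues[i].append(rng.randrange(n_nodes))
        # Service: each node moves one packet one hop toward its destination
        # (static shortest-path routing on the ring); arrivals are delivered.
        moves = []
        for i, q in enumerate(queues):
            if q:
                dest = q.popleft()
                if dest != i:
                    step_dir = 1 if (dest - i) % n_nodes <= n_nodes // 2 else -1
                    moves.append(((i + step_dir) % n_nodes, dest))
        for node, dest in moves:
            queues[node].append(dest)
        total += sum(len(q) for q in queues)
    return total / (steps * n_nodes)
```

Below the critical injection rate queues stay bounded (free flow); above it, queues grow without limit and the mean queue length diverges with simulation time (congestion).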
Deconstructing The Bomb: Confessions of a Nuclear Archeologist
NASA Astrophysics Data System (ADS)
Coster-Mullen, John
2017-01-01
I am the author of the groundbreaking book Atom Bombs: The Top Secret Inside Story of Little Boy and Fat Man. I will be sharing some of my quarter century of research and the methodology that has allowed me to be the first researcher ever to unravel, with an unprecedented level of accuracy, the most closely guarded secrets of the first two atomic bombs ("Little Boy" and "Fat Man") created by the Manhattan Project and used to end WWII. I refer to this methodology as "nuclear archeology" and will demonstrate that this was done using entirely open sources of information.
NASA Astrophysics Data System (ADS)
Buonanno, Sabatino; Fusco, Adele; Zeni, Giovanni; Manunta, Michele; Lanari, Riccardo
2017-04-01
This work describes the implementation of an efficient system for managing, viewing, analyzing and updating remotely sensed data, with special reference to Differential Interferometric Synthetic Aperture Radar (DInSAR) data. DInSAR products measure Earth surface deformation in both space and time, producing deformation maps and time series [1,2]. The use of these data in research or operational contexts requires tools that handle temporal and spatial variability with high efficiency. To this aim we present an implementation based on a Spatial Data Infrastructure (SDI) for data integration, management and interchange using standard protocols [3]. Standard SDI tools provide access to static datasets and operate only on spatial variability. In this paper we use the open source project GeoNode as a framework to extend the SDI functionalities and ingest DInSAR deformation maps and deformation time series very efficiently. GeoNode makes it possible to realize a comprehensive and distributed infrastructure, following the standards of the Open Geospatial Consortium (OGC), for remote sensing data management, analysis and integration [4,5]. We explain the methodology used to manage data complexity and data integration with GeoNode. The solution presented in this work for the ingestion of DInSAR products is a promising starting point for future development of an OGC-compliant, semi-automatic remote sensing data processing chain. [1] Berardino, P., Fornaro, G., Lanari, R., & Sansosti, E. (2002). A new Algorithm for Surface Deformation Monitoring based on Small Baseline Differential SAR Interferograms. IEEE Transactions on Geoscience and Remote Sensing, 40, 11, pp. 2375-2383. [2] Lanari, R., F. Casu, M. Manzo, G. Zeni, P. Berardino, M. Manunta and A. Pepe (2007), An overview of the Small Baseline Subset Algorithm: a DInSAR Technique for Surface Deformation Analysis, Pure Appl.
Geophys., 164, doi: 10.1007/s00024-007-0192-9. [3] Nebert, D.D. (ed). 2000. Developing Spatial Data Infrastructures: The SDI Cookbook. [4] GeoNode (www.geonode.org) [5] Kolodziej, K. (ed). 2004. OGC OpenGIS Web Map Server Cookbook. Open Geospatial Consortium, 1.0.2 edition.
XNAT Central: Open sourcing imaging research data.
Herrick, Rick; Horton, William; Olsen, Timothy; McKay, Michael; Archie, Kevin A; Marcus, Daniel S
2016-01-01
XNAT Central is a publicly accessible medical imaging data repository based on the XNAT open-source imaging informatics platform. It hosts a wide variety of research imaging data sets. The primary motivation for creating XNAT Central was to provide a central repository to host and provide access to a wide variety of neuroimaging data. In this capacity, XNAT Central hosts a number of data sets from research labs and investigative efforts from around the world, including the OASIS Brains imaging studies, the NUSDAST study of schizophrenia, and more. Over time, XNAT Central has expanded to include imaging data from many different fields of research, including oncology, orthopedics, cardiology, and animal studies, but continues to emphasize neuroimaging data. Through the use of XNAT's DICOM metadata extraction capabilities, XNAT Central provides a searchable repository of imaging data that can be referenced by groups, labs, or individuals working in many different areas of research. The future development of XNAT Central will be geared towards greater ease of use as a reference library of heterogeneous neuroimaging data and associated synthetic data. It will also become a tool for making available data that supports published research and academic articles. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Bratic, G.; Brovelli, M. A.; Molinari, M. E.
2018-04-01
The availability of thematic maps has significantly increased over the last few years. Validation of these maps is a key factor in assessing their suitability for different applications. The evaluation of the accuracy of classified data is carried out through a comparison with a reference dataset and the generation of a confusion matrix, from which many quality indexes can be derived. In this work, an ad hoc free and open source Python tool was implemented to automatically compute all the confusion matrix-derived accuracy indexes proposed in the literature. The tool was integrated into the GRASS GIS environment and successfully applied to evaluate the quality of three high-resolution global datasets (GlobeLand30, Global Urban Footprint, Global Human Settlement Layer Built-Up Grid) in the Lombardy Region area (Italy). In addition to the most commonly used accuracy measures, e.g. overall accuracy and Kappa, the tool made it possible to compute and investigate less well-known indexes such as the Ground Truth Index and the Classification Success Index. The promising tool will be further extended with spatial autocorrelation analysis functions and made available to the researcher and user communities.
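The core indexes such a tool automates can be sketched in a few lines of Python (a generic implementation for illustration, not the GRASS-integrated tool itself):

```python
import numpy as np

def accuracy_indexes(cm):
    """Derive common accuracy indexes from a confusion matrix.

    cm[i, j] = number of samples with reference class i classified as class j.
    """
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    overall = np.trace(cm) / n                  # overall accuracy
    # Cohen's Kappa: agreement corrected for chance agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    kappa = (overall - pe) / (1 - pe)
    # per-class producer's (omission) and user's (commission) accuracy
    producers = np.diag(cm) / cm.sum(axis=1)
    users = np.diag(cm) / cm.sum(axis=0)
    return overall, kappa, producers, users

# toy 3-class matrix: rows = reference, columns = classification
cm = [[50, 3, 2],
      [4, 40, 1],
      [2, 5, 43]]
oa, kappa, prod, user = accuracy_indexes(cm)
```

Further indexes (Ground Truth Index, Classification Success Index, etc.) are similarly simple functions of the same row, column and diagonal sums.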
McIDAS-V: Data Analysis and Visualization for NPOESS and GOES-R
NASA Astrophysics Data System (ADS)
Rink, T.; Achtor, T. H.
2009-12-01
McIDAS-V, the next-generation McIDAS, is being built on top of a modern, cross-platform software framework which supports the development of 4-D, interactive displays and the integration of a wide array of geophysical data. As the replacement for McIDAS, the development emphasis is on future satellite observation platforms such as NPOESS and GOES-R. Data interrogation, analysis and visualization capabilities have been developed for multi- and hyper-spectral instruments like MODIS, AIRS and IASI, and are being extended for application to VIIRS and CrIS. Compatibility with GOES-R ABI Level 1 and Level 2 product storage formats has been demonstrated. The abstract data model, which can internalize almost any geophysical data, opens up new possibilities for data fusion techniques, for example polar and geostationary (LEO/GEO) synergy for research and validation. McIDAS-V follows an object-oriented design model, using the Java programming language, allowing specialized extensions for new sources of data and for novel displays and interactive behavior. The reference application, what the user sees on startup, can be customized, and the system has a persistence mechanism allowing sharing of the application state across the internet. McIDAS-V is open-source, and free to the public.
Towards a Framework for Developing Semantic Relatedness Reference Standards
Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.
2010-01-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available, and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide as open source both the reference standard containing individual ratings and the R program used to analyze the ratings. Currently, these resources are intended to be used to reproduce and compare the results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas of medical informatics, including automatic classification, information retrieval from medical records and vocabulary/ontology development. PMID:21044697
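The idea of detecting systematic differences among raters can be illustrated with simple pairwise correlations (toy ratings invented here, not the published data; the paper itself uses clustering and factor analyses):

```python
def pearson(x, y):
    """Pearson correlation between two equal-length rating vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

# toy example: three raters scoring five term pairs on a 1-10 relatedness scale
ratings = {
    "rater1": [1, 2, 8, 9, 5],
    "rater2": [2, 1, 7, 9, 6],
    "rater3": [9, 8, 2, 1, 5],   # systematically reversed: a potential outlier
}

# low or negative pairwise correlation flags rater3 as disagreeing with the others
r12 = pearson(ratings["rater1"], ratings["rater2"])
r13 = pearson(ratings["rater1"], ratings["rater3"])
```

A clustering of the full rater-by-rater correlation matrix then groups mutually consistent raters together, which is the data-driven outlier analysis the abstract describes.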
Momota, Ryusuke; Ohtsuka, Aiji
2018-01-01
Anatomy is the science and art of understanding the structure of the body and its components in relation to the functions of the whole-body system. Medicine is based on a deep understanding of anatomy, but quite a few introductory-level learners are overwhelmed by the sheer amount of anatomical terminology that must be understood, so they regard anatomy as a dull and dense subject. To help them learn anatomical terms in a more contextual way, we started a new open-source project, the Network of Anatomical Texts (NAnaTex), which visualizes relationships of body components by integrating text-based anatomical information using Cytoscape, a network visualization software platform. Here, we present a network of bones and muscles produced from literature descriptions. As this network is primarily text-based and does not require any programming knowledge, it is easy to implement new functions or provide extra information by making changes to the original text files. To facilitate collaborations, we deposited the source code files for the network into the GitHub repository (https://github.com/ryusukemomota/nanatex) so that anybody can participate in the evolution of the network and use it for their own non-profit purposes. This project should help not only introductory-level learners but also professional medical practitioners, who could use it as a quick reference.
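The text-to-network idea can be sketched with a few hypothetical attachment statements (illustrative edges only, not taken from the NAnaTex source files; NetworkX stands in here for Cytoscape):

```python
import networkx as nx

# hypothetical muscle-bone attachment pairs (origin/insertion),
# of the kind a text file in such a project might encode
attachments = [
    ("biceps brachii", "scapula"),
    ("biceps brachii", "radius"),
    ("brachialis", "humerus"),
    ("brachialis", "ulna"),
    ("triceps brachii", "scapula"),
    ("triceps brachii", "ulna"),
]

G = nx.Graph()
for muscle, bone in attachments:
    G.add_node(muscle, kind="muscle")
    G.add_node(bone, kind="bone")
    G.add_edge(muscle, bone)

# bones shared by several muscles emerge as high-degree hubs in the network
hubs = [n for n, d in G.degree() if d >= 2 and G.nodes[n]["kind"] == "bone"]
```

Because the relationships live in plain text, extending the network is just a matter of appending lines; the graph layer regenerates itself.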
SolTrace | Concentrating Solar Power | NREL
SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses a Monte Carlo ray-tracing methodology.
When Free Isn't Free: The Realities of Running Open Source in School
ERIC Educational Resources Information Center
Derringer, Pam
2009-01-01
Despite the last few years' growth in awareness of open-source software in schools and the potential savings it represents, its widespread adoption is still hampered. Randy Orwin, technology director of the Bainbridge Island School District in Washington State and a strong open-source advocate, cautions that installing an open-source…
Inexpensive, Low Power, Open-Source Data Logging hardware development
NASA Astrophysics Data System (ADS)
Sandell, C. T.; Schulz, B.; Wickert, A. D.
2017-12-01
Over the past six years, we have developed a suite of open-source, low-cost, and lightweight data loggers for scientific research. These loggers employ the popular and easy-to-use Arduino programming environment, but consist of custom hardware optimized for field research. They may be connected to a broad and expanding range of off-the-shelf sensors, with software support built directly into the "ALog" library. Three main models exist: the ALog (for Autonomous or Arduino Logger) is the extreme low-power model for years-long deployments with only primary AA or D batteries; the ALog shield is a stripped-down ALog that nests with a standard Arduino board for prototyping or education; and the TLog (for Telemetering Logger) contains an embedded radio with 500 m range and a GPS for communications and precision timekeeping. This enables meshed networks of loggers that can send their data back to an internet-connected "home base" logger for near-real-time field data retrieval. All boards feature a high-precision clock, a full-size SD card slot for high-volume data storage, large screw terminals to connect sensors, interrupts, SPI and I2C communication capability, and 3.3V/5V power outputs. The ALog and TLog have fourteen 16-bit analog inputs with a precision voltage reference for precise analog measurements. Their components are rated from -40 to +85 degrees C, and they have been tested in harsh field conditions. These low-cost and open-source data loggers have enabled our research group to collect field data across North and South America on a limited budget, support student projects, and build toward better future scientific data systems.
An integrated SNP mining and utilization (ISMU) pipeline for next generation sequencing data.
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A V S K; Varshney, Rajeev K
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is a daunting barrier for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline, Integrated SNP Mining and Utilization (ISMU), has been developed by integrating several open source next generation sequencing (NGS) tools under a graphical user interface, for SNP discovery and utilization in genotyping assay development. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high-quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs, in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to rapidly process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data. It is very useful to the plant genetics and breeding community, enabling those without computational expertise to discover SNPs and use them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets.
It has been developed in the Java language and is available at http://hpc.icrisat.cgiar.org/ISMU as standalone free software.
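The polymorphism information content (PIC) value reported by such pipelines is a standard marker-informativeness measure; a sketch of the common Botstein form follows (ISMU's exact definition may differ):

```python
def pic(freqs):
    """Polymorphism information content (Botstein et al. 1980 form).

    freqs: allele frequencies at one locus, summing to 1.
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 p_i^2 p_j^2
    """
    assert abs(sum(freqs) - 1.0) < 1e-9, "frequencies must sum to 1"
    s2 = sum(p * p for p in freqs)
    cross = sum(2 * freqs[i] ** 2 * freqs[j] ** 2
                for i in range(len(freqs))
                for j in range(i + 1, len(freqs)))
    return 1.0 - s2 - cross

# a maximally informative biallelic SNP (both alleles at frequency 0.5)
p_max = pic([0.5, 0.5])     # 0.375 for the biallelic case
# a monomorphic site carries no information
p_mono = pic([1.0])
```

Higher PIC means the marker discriminates better between genotypes, which is why it is a useful filter when selecting SNPs for genotyping assays.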
Big data in food safety: An overview.
Marvin, Hans J P; Janssen, Esmée M; Bouzembrak, Yamine; Hendriksen, Peter J M; Staats, Martijn
2017-07-24
Technology is now being developed that is able to handle vast amounts of structured and unstructured data from diverse sources and origins. These technologies are often referred to as big data, and they open new areas of research and applications that will have an increasing impact on all sectors of our society. In this paper we assessed to what extent big data is being applied in the food safety domain and identified several promising trends. In several parts of the world, governments stimulate the publication on the internet of all data generated in publicly funded research projects. This policy opens new opportunities for stakeholders dealing with food safety to address issues which were not possible to address before. The application of mobile phones as detection devices for food safety and the use of social media as early warnings of food safety problems are a few examples of the new developments made possible by big data.
Wang, Qingguo; Jia, Peilin; Zhao, Zhongming
2015-01-01
Fueled by widespread applications of high-throughput next generation sequencing (NGS) technologies and urgent need to counter threats of pathogenic viruses, large-scale studies were conducted recently to investigate virus integration in host genomes (for example, human tumor genomes) that may cause carcinogenesis or other diseases. A limiting factor in these studies, however, is rapid virus evolution and resulting polymorphisms, which prevent reads from aligning readily to commonly used virus reference genomes, and, accordingly, make virus integration sites difficult to detect. Another confounding factor is host genomic instability as a result of virus insertions. To tackle these challenges and improve our capability to identify cryptic virus-host fusions, we present a new approach that detects Virus intEgration sites through iterative Reference SEquence customization (VERSE). To the best of our knowledge, VERSE is the first approach to improve detection through customizing reference genomes. Using 19 human tumors and cancer cell lines as test data, we demonstrated that VERSE substantially enhanced the sensitivity of virus integration site detection. VERSE is implemented in the open source package VirusFinder 2 that is available at http://bioinfo.mc.vanderbilt.edu/VirusFinder/.
OMPC: an Open-Source MATLAB®-to-Python Compiler
Jurica, Peter; van Leeuwen, Cees
2008-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
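To illustrate the kind of semantic mapping such a compiler must preserve, here is a hand translation of a small MATLAB® fragment into NumPy (this is not OMPC output; OMPC performs the translation automatically via syntax adaptation and emulation):

```python
import numpy as np

# MATLAB source:
#   A = [1 2; 3 4];
#   b = A \ [5; 6];      % backslash: solve the linear system A*x = rhs
#   m = mean(A(:));      % A(:) flattens the matrix; mean over all elements
#
# Hand-written NumPy equivalent of the same semantics:
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.linalg.solve(A, np.array([5.0, 6.0]))   # backslash -> linear solve
m = A.mean()                                   # mean over all elements
```

Beyond such one-to-one mappings, a translator must also bridge deeper differences, e.g. MATLAB®'s 1-based indexing and copy-on-write semantics versus Python's 0-based indexing and views, which is where emulation layers like OMPC's earn their keep.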
ERIC Educational Resources Information Center
Villano, Matt
2006-01-01
Increasingly, colleges and universities are turning to open source as a way to meet their technology infrastructure and application needs. Open source has changed life for visionary CIOs and their campus communities nationwide. The author discusses what these technologists see as the benefits--and the considerations.
An Integrated Chemical Environment to Support 21st-Century Toxicology.
Bell, Shannon M; Phillips, Jason; Sedykh, Alexander; Tandon, Arpit; Sprankle, Catherine; Morefield, Stephen Q; Shapiro, Andy; Allen, David; Shah, Ruchir; Maull, Elizabeth A; Casey, Warren M; Kleinstreuer, Nicole C
2017-05-25
SUMMARY: Access to high-quality reference data is essential for the development, validation, and implementation of in vitro and in silico approaches that reduce and replace the use of animals in toxicity testing. Currently, these data must often be pooled from a variety of disparate sources to efficiently link a set of assay responses and model predictions to an outcome or hazard classification. To provide a central access point for these purposes, the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods developed the Integrated Chemical Environment (ICE) web resource. The ICE data integrator allows users to retrieve and combine data sets and to develop hypotheses through data exploration. Open-source computational workflows and models will be available for download and application to local data. ICE currently includes curated in vivo test data, reference chemical information, in vitro assay data (including Tox21™/ToxCast™ high-throughput screening data), and in silico model predictions. Users can query these data collections focusing on end points of interest such as acute systemic toxicity, endocrine disruption, skin sensitization, and many others. ICE is publicly accessible at https://ice.ntp.niehs.nih.gov. https://doi.org/10.1289/EHP1759.
Towards Complete, Geo-Referenced 3d Models from Crowd-Sourced Amateur Images
NASA Astrophysics Data System (ADS)
Hartmann, W.; Havlena, M.; Schindler, K.
2016-06-01
Despite a lot of recent research, photogrammetric reconstruction from crowd-sourced imagery is plagued by a number of recurrent problems. (i) The resulting models are chronically incomplete, because even touristic landmarks are photographed mostly from a few "canonical" viewpoints. (ii) Man-made constructions tend to exhibit repetitive structure and rotational symmetries, which lead to gross errors in the 3D reconstruction and aggravate the problem of incomplete reconstruction. (iii) The models are normally not geo-referenced. In this paper, we investigate the possibility of using sparse GNSS geo-tags from digital cameras to address these issues and push the boundaries of crowd-sourced photogrammetry. A small proportion of the images in Internet collections (≈ 10%) do possess geo-tags. While the individual geo-tags are very inaccurate, they can nevertheless help to address the problems above. By providing approximate geo-reference for partial reconstructions, they make it possible to fuse those pieces into more complete models; the capability to fuse partial reconstructions opens up the possibility of being more restrictive in the matching phase and avoiding errors due to repetitive structure; and collectively, the redundant set of low-quality geo-tags can provide reasonably accurate absolute geo-reference. We show that even a few noisy geo-tags can help to improve architectural models, compared to structure-from-motion based purely on image correspondence.
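The geo-referencing step described above amounts to estimating a similarity transform (scale, rotation, translation) between model coordinates and noisy geo-tag positions; a least-squares sketch using Umeyama's method follows (illustrative only, not the authors' pipeline):

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares s, R, t such that dst ≈ s * R @ src_i + t (Umeyama, 1991).

    src, dst: (N, d) arrays of corresponding points.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    # reflection guard: force det(R) = +1
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / src_c.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# recover a known transform from exact 2-D correspondences
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 2))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = 2.0 * src @ R_true.T + np.array([10.0, -5.0])
s, R, t = similarity_transform(src, dst)
```

With many noisy geo-tags instead of exact points, the same least-squares fit averages the errors down, which is why a redundant set of low-quality tags can still yield reasonably accurate absolute geo-reference.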
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ingargiola, A.; Laurence, T. A.; Boutelle, R.
We introduce Photon-HDF5, an open and efficient file format to simplify the exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diodes (SPADs), photomultiplier tubes (PMTs), or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamps, channel numbers, etc.) from any acquisition hardware, as well as setup and sample descriptions, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert) to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.
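A minimal sketch of the layout with h5py (field names follow the published Photon-HDF5 specification; a real file carries many more metadata groups, and phconvert is the recommended way to write compliant files):

```python
import h5py
import numpy as np

# write a minimal file following the Photon-HDF5 layout
timestamps = np.sort(np.random.randint(0, 10_000_000, size=1000)).astype(np.int64)
detectors = np.random.randint(0, 2, size=1000).astype(np.uint8)

with h5py.File("demo_photons.h5", "w") as f:
    pd = f.create_group("photon_data")
    pd.create_dataset("timestamps", data=timestamps)          # clock ticks
    pd.create_dataset("detectors", data=detectors)            # channel per photon
    pd.create_dataset("timestamps_specs/timestamps_unit", data=10e-9)  # 10 ns clock

# any HDF5-capable tool or language can now read the data back
with h5py.File("demo_photons.h5", "r") as f:
    ts = f["photon_data/timestamps"][:]
    unit = f["photon_data/timestamps_specs/timestamps_unit"][()]
    seconds = ts * unit   # convert clock ticks to seconds
```

Because the container is plain HDF5, the same file opens unchanged in MATLAB, LabVIEW, or the HDFView GUI, which is the interchange benefit the format is built around.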
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-14
... contracts before commercial sources in the open market. The proposed rule amends FAR 8.002 as follows: The... requirements for supplies and services from commercial sources in the open market. The proposed FAR 8.004 would... subpart 8.6). (b) Commercial sources (including educational and non-profit institutions) in the open...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information, including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft² for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
Evans, Nicholas G; Selgelid, Michael J
2015-08-01
In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.
Scoring Coreference Partitions of Predicted Mentions: A Reference Implementation.
Pradhan, Sameer; Luo, Xiaoqiang; Recasens, Marta; Hovy, Eduard; Ng, Vincent; Strube, Michael
2014-06-01
The definitions of two coreference scoring metrics, B³ and CEAF, are underspecified with respect to predicted, as opposed to key (or gold), mentions. Several variations have been proposed that manipulate either, or both, the key and predicted mentions in order to get a one-to-one mapping. On the other hand, the metric BLANC was, until recently, limited to scoring partitions of key mentions. In this paper, we (i) argue that mention manipulation for scoring predicted mentions is unnecessary, and potentially harmful as it could produce unintuitive results; (ii) illustrate the application of all these measures to scoring predicted mentions; (iii) make available an open-source, thoroughly tested reference implementation of the main coreference evaluation measures; and (iv) rescore the results of the CoNLL-2011/2012 shared task systems with this implementation. This will help the community accurately measure and compare new end-to-end coreference resolution algorithms.
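The B³ metric at issue can be sketched for the straightforward case where key and response partitions cover the same mention set (the paper's subject is precisely the harder case where predicted mentions differ); a sketch, not the reference implementation:

```python
def b_cubed(key, response):
    """B^3 precision/recall for two clusterings over the same mention set.

    key, response: lists of sets of mention ids (disjoint within each list).
    Per mention m: overlap of its system and reference clusters, divided by
    the size of its cluster in the partition being scored; then average.
    """
    mentions = set().union(*key)

    def score(scored, other):
        total = 0.0
        for m in mentions:
            s = next(c for c in scored if m in c)
            o = next(c for c in other if m in c)
            total += len(s & o) / len(s)
        return total / len(mentions)

    precision = score(response, key)
    recall = score(key, response)
    return precision, recall

key = [{1, 2, 3}, {4, 5}]
response = [{1, 2}, {3, 4, 5}]
p, r = b_cubed(key, response)
```

When a predicted ("twinless") mention appears in only one of the two partitions, the `next(...)` lookup fails, which is exactly the underspecification the paper addresses.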
Fix, Carolyn E.
1956-01-01
The bibliography consists of annotations or abstracts of selected reports that pertain to the geology and occurrence of uranium in marine black shales and their metamorphic equivalents in the United States. Only those reports that were available to the public prior to June 30, 1956, are included. Most of the reports may be consulted in the larger public, university, or scientific libraries. A few reports that have been released to the public in open file may be consulted at designated offices of the Geological Survey. An effort has been made to include only those references to shales whose uranium is believed to be of syngenetic origin and whose major source of radioactivity is uranium. Many general papers on the geology of uranium deposits refer to marine black shales, and some of these general papers have been included.
High rate, long cycle life battery electrode materials with an open framework structure
Wessells, Colin; Huggins, Robert; Cui, Yi; Pasta, Mauro
2015-02-10
A battery includes a cathode, an anode, and an aqueous electrolyte disposed between the cathode and the anode and including a cation A. At least one of the cathode and the anode includes an electrode material having an open framework crystal structure into which the cation A is reversibly inserted during operation of the battery. The battery has a reference specific capacity when cycled at a reference rate, and at least 75% of the reference specific capacity is retained when the battery is cycled at 10 times the reference rate.
Using R to implement spatial analysis in open source environment
NASA Astrophysics Data System (ADS)
Shao, Yixi; Chen, Dong; Zhao, Bo
2007-06-01
R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics which provides a wide variety of statistical and graphical techniques and is highly extensible, and it plays an important role in spatial analysis within the open source environment. To implement spatial analysis in the open source environment, which we call open source geocomputation, we use the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the open source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the open source GIS environment, evaluating the spatial correlation of land price and estimating it by Kriging interpolation. We also use R together with MapServer and PHP to show how R and other open source software cooperate in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an open source GIS environment. Finally, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the extensive packages, or to design normative packages, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.
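The Kriging interpolation mentioned above is normally delegated to R's geostatistics packages; its core, ordinary Kriging, can be sketched independently in Python with NumPy (toy land-price data and an arbitrarily chosen exponential variogram, not the project's fitted model):

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, variogram):
    """Ordinary Kriging prediction at location xy0 from samples (xy, z)."""
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Kriging system: variogram matrix bordered by a Lagrange multiplier
    # row/column that enforces the weights summing to 1 (unbiasedness)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, :n] = A[:n, n] = 1.0
    A[n, n] = 0.0
    b = np.empty(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - xy0, axis=-1))
    b[n] = 1.0
    w = np.linalg.solve(A, b)
    return w[:n] @ z

gamma = lambda h: 1.0 - np.exp(-h / 2.0)   # exponential variogram (arbitrary)
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([100.0, 120.0, 110.0, 130.0])   # sampled land prices
z0 = ordinary_kriging(xy, z, np.array([0.5, 0.5]), gamma)
```

At the symmetric center point the four weights come out equal, so the prediction is the plain average of the samples; at other locations the variogram down-weights distant samples, which is the behavior that makes Kriging suitable for land-price surfaces.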
ASTRONAUTICS INFORMATION. OPEN LITERATURE SURVEY. VOLUME 1, PART D. (Entries 13,166-13,888)
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1960-01-15
An open literature survey dealing with astronautics is presented. The period covered by the survey is October 15 to December 15, 1959. A list of periodical references is presented alphabetically. Author and subject indexes are included. 700 references. (J.R.D.)
Ciobanu, O
2009-01-01
The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open source software was used to prepare digitized 2D images of tissue sections and to create a 3D reconstruction from the segmented structures. Finally, the 3D images were used in open source software to perform biomechanical simulations. This study demonstrates the applicability and feasibility of currently available open source software for 3D reconstruction and biomechanical simulation. The use of open source software may improve the efficiency of investments in imaging technologies and in CAD/CAM technologies for the fabrication of implants and prostheses, which otherwise require expensive specialized software.
An Open and Holistic Approach for Geo and Space Sciences
NASA Astrophysics Data System (ADS)
Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Toshihiko, Iyemori; Yatagai, Akiyo; Koyama, Yukinobu; Murayama, Yasuhiro; King, Todd; Hughes, Steve; Fung, Shing; Galkin, Ivan; Hapgood, Mike; Belehaki, Anna
2016-04-01
Geo and space sciences have thus far been very successful, even though an open, cross-domain and holistic approach often did not play an essential role. But this situation is changing rapidly. The research focus is shifting to more complex, non-linear and multi-domain phenomena, such as climate change or the space environment. Phenomena of this kind can only be understood step by step using the holistic idea. So, what is necessary for a successful cross-domain and holistic approach in geo and space sciences? Research, and science in general, is becoming more and more dependent on a rich fundus of multi-domain data sources, related context information and the use of highly advanced technologies in data processing. Buzzword phrases such as Big Data and Deep Learning reflect this development. Big Data also addresses the real exponential growth of data and information produced by measurements or simulations. Deep Learning technology may help to detect new patterns and relationships in data describing highly sophisticated natural phenomena. Furthermore, we should not forget that science and the humanities are only two sides of the same medal in the continuing human process of knowledge discovery. The concept of Open Data, in particular open access to scientific data, addresses the free and open availability of (at least publicly funded and generated) data. The open availability of data covers the free use, reuse and redistribution of data, principles established with the formation of the World Data Centers already more than 50 years ago. We should not forget that the foundation for open data is the responsibility for the sustainable management of data, from the individual scientist up to the big science institutions and organizations. Other challenges are discovering and collecting the appropriate data, preferably all of them or at least the majority of the right data.
Therefore a network of individual, or even better institutional, catalog-based and at least domain-specific data servers is necessary. In times of the WWW, or nowadays the Semantic Web, context-enriched and mashed-up open data catalogs pointing to the appropriate data sources will, step by step, help to relieve users of the burden of finding the right data. Furthermore, the Semantic Web provides an interoperable and universal format for data and metadata. The Resource Description Framework (RDF) inherently enables domain and cross-domain mashups of data, as realized, for example, in the Linked Open Data project. Scientific work and the resulting papers in the geo and space domain are often based on data, physical models and previous publications, which in turn have depended on data, models and publications. So, in order to guarantee a high quality of scientific work, a complete verification process for the results is necessary. This is nothing new, but in times of Big Data it is a real challenge. So, what do we need for a complete verification of presented results? Above all, we need all the original data which has been used. But it is also necessary to get complete information about the context of the research objectives and the resulting constraints in the preparation of the raw data. Furthermore, we need knowledge about the methods and the processing software which have been used to generate the results. The Open Data approach, enriched by the Open Archive idea, provides the concept for sustainable and verifiable scientific work. Open Archive of course stands for the free availability of scientific papers. But it also focuses on mechanisms and methods within the realm of scientific publications for referencing and providing the underlying data, methods and software.
Such reference mechanisms include the use of Digital Object Identifiers (DOI) or Uniform Resource Identifiers (URI) within the Semantic Web, in our case for geo and space science data, but also for methods and software code. Nowadays, more and more open and private publishers demand such references in preparation of the publishing process. In addition, references to well documented earth and space science data are available via an increasing number of data publications. This approach serves both the institutional geo and space data centers, which increase their visibility and importance, and the scientists, who will find the right, already DOI-referenced data in the appropriate data journals. The Open Data and Open Archive approaches finally merge in the concept of Open Science. Open Science emphasizes an open sharing of knowledge of all kinds, based on transparent multi-disciplinary and cross-domain scientific work. But Open Science is not just an idea; it also stands for a variety of projects which follow the rules of Open Science, such as open methodology, open source, open data, open access, open peer review and open educational resources. Open Science also demands a new culture of scientific collaboration based on social media, and the use of shared cloud technology for data storage and computing. But we should not forget that the WWW is not a one-way road. The more data, methods and software for science research become freely available on the Internet, the more chances open up for a commercial or even destructive use of scientific data. Already now, the giant search engine providers, such as Google or Microsoft, are collecting, storing and analyzing all data available on the net. The usage of Deep Learning for the detection of semantic coherence in data, e.g. for
the creation of personalized, on-time and on-location predictions using neural networks and artificial intelligence methods, should not be reserved for them, but should also be used within Open Science for the creation of new scientific knowledge. Open Science does not mean just dumping our scientific data, information and knowledge onto the Web. Far from it: we are still responsible for a sustainable handling of our data for the benefit of humankind. The usage of the principles of Open Science is demonstrated on the scientific and software engineering activities for the mashup of the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC related data servers, covering different geo and space science domains.
The Raptor Real-Time Processing Architecture
NASA Astrophysics Data System (ADS)
Galassi, M.; Starr, D.; Wozniak, P.; Borozdin, K.
The primary goal of Raptor is ambitious: to identify interesting optical transients from very wide field of view telescopes in real time, and then to quickly point the higher resolution Raptor ``fovea'' cameras and spectrometer to the location of the optical transient. The most interesting of Raptor's many applications is the real-time search for orphan optical counterparts of Gamma Ray Bursts. The sequence of steps (data acquisition, basic calibration, source extraction, astrometry, relative photometry, the smarts of transient identification and elimination of false positives, telescope pointing feedback, etc.) is implemented with a ``component'' approach. All basic elements of the pipeline functionality have been written from scratch or adapted (as in the case of SExtractor for source extraction) to form a consistent modern API operating on memory resident images and source lists. The result is a pipeline which meets our real-time requirements and which can easily operate as a monolithic or distributed processing system. Finally, the Raptor architecture is entirely based on free software (sometimes referred to as ``open source'' software). In this paper we also discuss the interplay between various free software technologies in this type of astronomical problem.
Raptor -- Mining the Sky in Real Time
NASA Astrophysics Data System (ADS)
Galassi, M.; Borozdin, K.; Casperson, D.; McGowan, K.; Starr, D.; White, R.; Wozniak, P.; Wren, J.
2004-06-01
The primary goal of Raptor is ambitious: to identify interesting optical transients from very wide field of view telescopes in real time, and then to quickly point the higher resolution Raptor ``fovea'' cameras and spectrometer to the location of the optical transient. The most interesting of Raptor's many applications is the real-time search for orphan optical counterparts of Gamma Ray Bursts. The sequence of steps (data acquisition, basic calibration, source extraction, astrometry, relative photometry, the smarts of transient identification and elimination of false positives, telescope pointing feedback...) is implemented with a ``component'' approach. All basic elements of the pipeline functionality have been written from scratch or adapted (as in the case of SExtractor for source extraction) to form a consistent modern API operating on memory resident images and source lists. The result is a pipeline which meets our real-time requirements and which can easily operate as a monolithic or distributed processing system. Finally, the Raptor architecture is entirely based on free software (sometimes referred to as ``open source'' software). In this paper we also discuss the interplay between various free software technologies in this type of astronomical problem.
From Nonradiating Sources to Directionally Invisible Objects
NASA Astrophysics Data System (ADS)
Hurwitz, Elisa
The goal of this dissertation is to extend the understanding of invisible objects, in particular nonradiating sources and directional nonscattering scatterers. First, variations of null-field nonradiating sources are derived from Maxwell's equations. Next, it is shown how to design a nonscattering scatterer by applying the boundary conditions for nonradiating sources to the scalar wave equation, referred to here as the "field cloak method". This technique is used to demonstrate directionally invisible scatterers for an incident field with one direction of incidence, and the influence of symmetry on the directionality is explored. This technique, when applied to the scalar wave equation, is extended to show that a directionally invisible object may be invisible for multiple directions of incidence simultaneously. This opens the door to the creation of optically switchable, directionally invisible objects which could be implemented in couplers and other novel optical devices. Next, a version of the "field cloak method" is extended to the Maxwell's electro-magnetic vector equations, allowing more flexibility in the variety of directionally invisible objects that can be designed. This thesis concludes with examples of such objects and future applications.
[The Arabic influence in the "Colóquios dos simples e drogas da India" of Garcia da Orta].
Ricordel, Joëlle
2015-09-01
The "Colóquios dos simples e drogas he cousas medicinais da Índia" (Conversations on the simples, drugs and medicinal substances of India) (1563) of Garcia da Orta is a book of botany and pharmacognosy. The author was a Portuguese physician who studied at Spanish universities and practiced medicine mainly in India. In short chapters presented in the form of dialogues, he studies about sixty simples. The sources to which he refers are indicative of a "classical" training, but also the mark of a curious mind, open to different cultures. The Arabic sources are numerous and mainly concern the identification of substances through abundant synonyms of their names in foreign languages, and the different medicinal uses that ancient physicians may have made of them. However, da Orta is critical of these sources, seeking out contradictions and differences of opinion among authors. He confronts them with oral information collected through a wide network of contacts.
Hansen, Anthony D.
1990-01-01
An improved aethalometer (10) having a single light source (18) and a single light detector (20) and two light paths (21, 22) from the light source (18) to the light detector (20). A quartz fiber filter (13) is inserted in the device, the filter (13) having a collection area (23) in one light path (21) and a reference area (24) in the other light path (22). A gas flow path (46) through the aethalometer housing (11) allows ambient air to flow through the collection area (23) of the filter (13) so that aerosol particles can be collected on the filter. A rotating disk (31) with an opening (33) therethrough allows light from the light source (18) to pass alternately through the two light paths (21, 22). The voltage output of the detector (20) is applied to a VCO (52), and the VCO pulses for light transmission separately through the two light paths (21, 22) are counted and compared to determine the absorption coefficient of the collected aerosol particles.
Microbe-ID: an open source toolbox for microbial genotyping and species identification
Tabima, Javier F.; Everhart, Sydney E.; Larsen, Meredith M.; Weisberg, Alexandra J.; Kamvar, Zhian N.; Tancos, Matthew A.; Smart, Christine D.; Chang, Jeff H.
2016-01-01
Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allows customization of a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or a dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type, and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on GitHub and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID. PMID:27602267
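The Sequence-ID idea (query a sequence against a reference database and report the best-matching species) can be caricatured in a few lines of Python. The sequences below are invented, and a naive pairwise-identity score stands in for BLAST:

```python
def identity(a, b):
    """Fraction of matching positions over the shared length (toy stand-in for BLAST)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

# Hypothetical reference sequences for two species (not real ITS/cox data).
references = {
    "P. infestans": "ACGTACGTGG",
    "P. ramorum":   "ACGGACTTGA",
}

query = "ACGTACGTGA"
# Assign the query to the reference with the highest identity (9/10 vs 8/10 here).
best = max(references, key=lambda sp: identity(query, references[sp]))
```

A real implementation would of course use local alignment and an E-value cutoff rather than raw positional identity; the sketch only shows the classification structure.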
Zöllner, Frank G; Daab, Markus; Sourbron, Steven P; Schad, Lothar R; Schoenberg, Stefan O; Weisser, Gerald
2016-01-14
Perfusion imaging has become an important image based tool to derive physiological information in various applications, like tumor diagnostics and therapy, stroke, (cardio-)vascular diseases, or the functional assessment of organs. However, even after 20 years of intense research in this field, perfusion imaging still remains a research tool without broad clinical usage. One problem is the lack of standardization in the technical aspects which have to be considered for successful quantitative evaluation; the second problem is a lack of tools that allow a direct integration into the diagnostic workflow in radiology. Five compartment models, namely a one compartment model (1CP), a two compartment exchange model (2CXM), a two compartment uptake model (2CUM), a two compartment filtration model (2FM) and finally the extended Tofts model (ETM), were implemented as a plugin for the DICOM workstation OsiriX. Moreover, the plugin has a clean graphical user interface and provides means for quality management during perfusion data analysis. Based on reference test data, the implementation was validated against a reference implementation. No differences were found in the calculated parameters. We developed open source software to analyse DCE-MRI perfusion data. The software is designed as a plugin for the DICOM workstation OsiriX. It features a clean GUI and provides a simple workflow for data analysis, while it can also be seen as a toolbox providing implementations of several recent compartment models to be applied in research tasks. Integration into the infrastructure of a radiology department is given via OsiriX. Results can be saved automatically, and reports generated automatically during data analysis ensure quality control.
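As one example of the model family listed above, the extended Tofts model expresses the tissue concentration as a plasma term plus the arterial input convolved with an exponential residue function. A minimal Python sketch (illustrative parameter values and a constant input; this is not the plugin's actual code):

```python
import numpy as np

def extended_tofts(t, cp, ktrans, ve, vp):
    """Extended Tofts model on a uniform time grid:
    Ct(t) = vp * Cp(t) + Ktrans * integral_0^t Cp(s) * exp(-(Ktrans/ve)*(t-s)) ds,
    with the integral evaluated as a discrete convolution."""
    dt = t[1] - t[0]
    residue = np.exp(-(ktrans / ve) * t)            # exponential residue function
    conv = np.convolve(cp, residue)[: len(t)] * dt  # numerical convolution, truncated
    return vp * cp + ktrans * conv

# Sanity check with a constant arterial input: Ct should plateau near vp + ve.
t = np.linspace(0.0, 5.0, 501)
cp = np.ones_like(t)
ct = extended_tofts(t, cp, ktrans=0.2, ve=0.3, vp=0.05)
```

With a constant input the convolution has the closed form ve*(1 - exp(-(Ktrans/ve)*t)), which the discrete result can be checked against.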
Boulos, Maged N Kamel; Honda, Kiyoshi
2006-01-01
Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
Rapid development of medical imaging tools with open-source libraries.
Caban, Jesus J; Joshi, Alark; Nagy, Paul
2007-11-01
Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer is now considered standard in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization, and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.
Open-Source RTOS Space Qualification: An RTEMS Case Study
NASA Technical Reports Server (NTRS)
Zemerick, Scott
2017-01-01
NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors, notably (1) the diverse nature of RTOSs utilized across NASA, (2) the lack of a single NASA space-qualification criterion, of verification and validation (V&V) analysis, and of test beds, and (3) differing RTOS heritages, specifically open-source RTOSs versus closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort for the open source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case study, can be utilized as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for NASA agency-wide usage. NASA's involvement in the space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.
NASA Astrophysics Data System (ADS)
Meyer, Stefan; Tulej, Marek; Wurz, Peter
2016-04-01
The exploration of habitable worlds around the gas giants in the Solar System is of major interest in upcoming planetary missions. Exactly this theme is addressed by the Jupiter Icy Moons Explorer (JUICE) mission of ESA, which will characterise Ganymede, Europa and Callisto as planetary objects and potential habitats [1], [2]. We developed a prototype of the Neutral gas and Ion Mass spectrometer (NIM) of the Particle Environment Package (PEP) for the JUICE mission, intended for composition measurements of neutral gas and thermal plasma. NIM/PEP will be used to measure the chemical composition of the exospheres of the icy Jovian moons. Besides direct ion measurement, the NIM instrument is able to measure the inflowing neutral gas in two different modes: in neutral mode the gas enters the ion source directly (open source), and in thermal mode the gas is thermally accommodated to the wall temperature by several collisions inside an equilibrium sphere before entering the ion source (closed source). We performed measurements with the prototype NIM using a neutral gas beam with velocities from 1 up to 5 km/s in both the neutral and thermal modes. The current trajectory of JUICE foresees a flyby velocity of 4 km/s at Europa; other flybys are in the range of 1 up to 7 km/s, and the velocity in Ganymede orbits is around 2 km/s. Different species are used for the gas beam, such as the noble gases Ne, Ar and Kr, as well as molecules like H2, methane, ethane, propane and more complex ones. We will present the results of these measurements with respect to fragmentation and density enhancements in the closed source mode. Furthermore, we will give a direct comparison between open and closed source mode measurements. References: [1] ESA, "JUICE assessment study report (Yellow Book)", ESA/SRE(2011)18, 2012. [2] O. Grasset, M.K. Dougherty, A. Coustenis, E.J. Bunce, C. Erd, D. Titov, M. Blanc, A. Coates, P. Drossart, L.N. Fletcher, H. Hussmann, R. Jaumann, N. Krupp, J.-P. Lebreton, O. Prieto-Ballesteros, P.
Tortora, F. Tosi, T. Van Hoolst, "JUpiter Icy moons Explorer (JUICE): An ESA mission to orbit Ganymede and to characterise the Jupiter system", Planet. Space Sci., 2013, 78, pp. 1 - 21.
ERIC Educational Resources Information Center
Armbruster, Chris
2008-01-01
Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge…
Learning from hackers: open-source clinical trials.
Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico
2012-05-02
Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.
Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked based on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR software provide a better basis, in terms of ranking scores, than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
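TOPSIS ranks alternatives by their relative closeness to an ideal solution. A minimal Python sketch of the method follows; the decision matrix, weights, and criteria here are invented for illustration and are not the study's actual EMR scores:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) over criteria (columns) with TOPSIS.
    benefit[j] is True when a higher score on criterion j is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize each column
    v = norm * weights                               # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)        # distance to the ideal solution
    d_neg = np.linalg.norm(v - anti, axis=1)         # distance to the anti-ideal
    return d_neg / (d_pos + d_neg)                   # closeness: higher is better

# Three hypothetical EMR packages scored on usability and support (both benefit criteria).
scores = np.array([[6.0, 5.0],
                   [8.0, 7.0],
                   [9.0, 9.0]])
closeness = topsis(scores,
                   weights=np.array([0.5, 0.5]),
                   benefit=np.array([True, True]))
```

In the full study this ranking step would be driven by criterion weights derived from AHP pairwise comparisons rather than the equal weights used here.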
KID Project: an internet-based digital video atlas of capsule endoscopy for research purposes.
Koulaouzidis, Anastasios; Iakovidis, Dimitris K; Yung, Diana E; Rondonotti, Emanuele; Kopylov, Uri; Plevris, John N; Toth, Ervin; Eliakim, Abraham; Wurm Johansson, Gabrielle; Marlicz, Wojciech; Mavrogenis, Georgios; Nemeth, Artur; Thorlacius, Henrik; Tontini, Gian Eugenio
2017-06-01
Capsule endoscopy (CE) has revolutionized small-bowel (SB) investigation. Computational methods can enhance diagnostic yield (DY); however, incorporating machine learning algorithms (MLAs) into CE reading is difficult, as large amounts of image annotations are required for training. Current databases lack graphic annotations of pathologies and cannot be used. A novel database, KID, aims to provide a reference for the research and development of medical decision support systems (MDSS) for CE. Open-source software was used for the KID database. Clinicians contribute anonymized, annotated CE images and videos. Graphic annotations are supported by an open-access annotation tool (Ratsnake). We detail an experiment based on the KID database, examining differences in SB lesion measurement between human readers and an MLA. The Jaccard Index (JI) was used to evaluate the similarity between annotations by the MLA and human readers. The MLA performed best in measuring lymphangiectasias, with a JI of 81 ± 6 %. The other lesion types were: angioectasias (JI 64 ± 11 %), aphthae (JI 64 ± 8 %), chylous cysts (JI 70 ± 14 %), polypoid lesions (JI 75 ± 21 %), and ulcers (JI 56 ± 9 %). An MLA can perform as well as human readers in the measurement of SB angioectasias in white light (WL). Automated lesion measurement is therefore feasible. KID is currently the only open-source CE database developed specifically to aid the development of MDSS. Our experiment demonstrates this potential.
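The Jaccard Index used above to score agreement between the MLA and human readers is simply the ratio of the intersection to the union of the two annotated regions. A small Python sketch with invented binary masks:

```python
import numpy as np

def jaccard(a, b):
    """Jaccard index of two binary annotation masks: |A ∩ B| / |A ∪ B|."""
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # two empty annotations agree perfectly by convention
    return np.logical_and(a, b).sum() / union

# Two illustrative 8x8 lesion annotations: a human reader's and an MLA's,
# each a 4x4 square, offset by one pixel (overlap 3x3 = 9, union 16+16-9 = 23).
mask_reader = np.zeros((8, 8), dtype=bool); mask_reader[2:6, 2:6] = True
mask_mla = np.zeros((8, 8), dtype=bool); mask_mla[3:7, 3:7] = True
ji = jaccard(mask_reader, mask_mla)
```

A JI of 1.0 means identical annotations; the reported 81 ± 6 % for lymphangiectasias corresponds to a mean JI of 0.81.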
OMPC: an Open-Source MATLAB-to-Python Compiler.
Jurica, Peter; van Leeuwen, Cees
2009-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.
Achromatic self-referencing interferometer
Feldman, M.
1994-04-19
A self-referencing Mach-Zehnder interferometer is described for accurately measuring laser wavefronts over a broad wavelength range (for example, 600 nm to 900 nm). The apparatus directs a reference portion of an input beam to a reference arm and a measurement portion of the input beam to a measurement arm, recombines the output beams from the reference and measurement arms, and registers the resulting interference pattern ("first" interferogram) at a first detector. Optionally, subportions of the measurement portion are diverted to second and third detectors, which respectively register intensity and interferogram signals that can be processed to reduce the first interferogram's sensitivity to input noise. The reference arm includes a spatial filter producing a high quality spherical beam from the reference portion, a tilted wedge plate compensating for off-axis aberrations in the spatial filter output, and a mirror collimating the radiation transmitted through the tilted wedge plate. The apparatus includes a thermally and mechanically stable baseplate which supports all reference arm optics, or at least the spatial filter, tilted wedge plate, and collimator. The tilted wedge plate is mounted adjustably with respect to the spatial filter and collimator, so that it can be maintained in an orientation in which it does not introduce significant wavefront errors into the beam propagating through the reference arm. The apparatus is polarization insensitive and has an equal path length configuration enabling measurement of radiation from broadband as well as closely spaced laser line sources. 3 figures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saminathan, S; Godson, H; Ponmalar, R
2016-06-15
Purpose: To evaluate the dosimetric characteristics of a newly developed well type ionization chamber and to validate the results against the commercially available calibrated well chambers that are used for the calibration of brachytherapy sources. Methods: The newly developed well type ionization chamber (BDS 1000) has been designed for convenient use in brachytherapy and is open to atmospheric conditions. The chamber has a volume of 240 cm³ and a weight of 2.5 kg. Radioactive sources with activities from 0.01 mCi to 20 Ci can be calibrated using this chamber. Dosimetric parameters such as leakage current, stability, scattering effect, ion collection efficiency, reference air kerma rate and nominal response with energy were evaluated for the BDS 1000 well type ion chamber. The evaluated dosimetric characteristics of the BDS 1000 well chamber were validated against two other commercially available well chambers (HDR 1000 plus and BTC/3007). Results: The measured leakage current was negligible for the newly developed BDS 1000 well type ion chamber. The ion collection efficiency was close to 1 and the response of the chamber was found to be very stable. The sweet spot was determined to be 42 mm from the bottom of the chamber insert. The reference air kerma rate was found to be 4.634 × 10⁵ Gy m² hr⁻¹ A⁻¹ for the BDS 1000 well chamber. The overall dosimetric characteristics of the BDS 1000 well chamber were in good agreement with the dosimetric properties of the other two well chambers. Conclusion: The dosimetric study shows that the newly developed BDS 1000 well type ionization chamber is a highly sensitive and reliable chamber for reference air kerma strength calibration. The results obtained confirm that this chamber can be used for the calibration of HDR and LDR brachytherapy sources.
Hazan, Hananel; Ziv, Noam E
2017-01-01
There is a growing need for multichannel electrophysiological systems that record from and interact with neuronal systems in near real-time. Such systems are needed, for example, for closed-loop, multichannel electrophysiological/optogenetic experimentation in vivo and in a variety of other neuronal preparations, or for developing and testing neuro-prosthetic devices, to name a few. Furthermore, there is a need for such systems to be inexpensive, reliable, user-friendly, easy to set up, open and expandable, and to possess long life cycles in the face of rapidly changing computing environments. Finally, they should provide powerful, yet reasonably easy to implement facilities for developing closed-loop protocols for interacting with neuronal systems. Here, we survey commercial and open source systems that address these needs to varying degrees. We then present our own solution, which we refer to as Closed Loop Experiments Manager (CLEM). CLEM is an open source, soft real-time, Microsoft Windows desktop application that is based on a single generic personal computer (PC) and an inexpensive, general-purpose data acquisition board. CLEM provides a fully functional, user-friendly graphical interface, possesses facilities for recording, presenting and logging electrophysiological data from up to 64 analog channels, and facilities for controlling external devices, such as stimulators, through digital and analog interfaces. Importantly, it includes facilities for running closed-loop protocols written in any programming language that can generate dynamic link libraries (DLLs). We describe the application, its architecture and facilities. We then demonstrate, using networks of cortical neurons growing on multielectrode arrays (MEAs), that despite its reliance on generic hardware, its performance is appropriate for flexible, closed-loop experimentation at the neuronal network level.
41 CFR 302-17.13 - Source references.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false Source references. 302... references. The following references or publications have been used as source material for this part. (a...) Internal Revenue Service Publication 521, “Moving Expenses.” (c) Internal Revenue Service, Circular E...
The validity of open-source data when assessing jail suicides.
Thomas, Amanda L; Scott, Jacqueline; Mellow, Jeff
2018-05-09
The Bureau of Justice Statistics' Deaths in Custody Reporting Program (DCRP) is the primary source for jail suicide research, though the data is restricted from general dissemination. This study is the first to examine whether jail suicide data obtained from publicly available sources can help inform our understanding of this serious public health problem. Of the 304 suicides reported through the DCRP in 2009, roughly 56 percent (N = 170) were identified through the open-source search protocol. Each of the sources was assessed based on how much information was collected on the incident and the types of variables available. A descriptive analysis was then conducted on the variables that were present in both data sources. The four variables present in each data source were: (1) demographic characteristics of the victim, (2) the location of occurrence within the facility, (3) the location of occurrence by state, and (4) the size of the facility. Findings demonstrate that the prevalence and correlates of jail suicides are extremely similar in both open-source and official data. Moreover, for almost every variable measured, open-source data captured as much information as official data did, if not more. Further, variables not found in official data were identified in the open-source database, thus allowing researchers to have a more nuanced understanding of the situational characteristics of the event. This research provides support for the argument in favor of including open-source data in jail suicide research as it illustrates how open-source data can be used to provide additional information not originally found in official data. In sum, this research is vital in terms of possible suicide prevention, which may be directly linked to being able to manipulate environmental factors.
nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.
Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia
2017-12-01
Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with growing attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have also been limited by time-consuming modeling workflows and the high level of expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely-used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create the supporting objects necessary for model creation, e.g. surfaces, anatomical landmarks and reference systems; a second step includes the creation of OpenSim objects, e.g. bodies, joints and muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application.
We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this will help promote personalized applications in musculoskeletal biomechanics, including larger sample-size studies, and might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.
Piñeyro-Garza, Everardo; Gómez-Silva, Magdalena; Gamino Peña, María Elena; Palmer, Jonathan; Berber, Arturo
2015-10-01
The oral retinoid agent isotretinoin (13-cis-retinoic acid) is approved for the treatment of severe recalcitrant cystic acne. For registrational renewal of Oratane® in Mexico (isotretinoin; Laboratorios Dermatologicos Darier S.A. de C.V., Mexico), it was necessary to establish bioequivalence to the reference product Roaccutan® (isotretinoin; Roche, Mannheim, Germany). Three prior studies failed to establish the bioequivalence of Oratane to Mexican-sourced Roaccutan. However, 13 studies demonstrated the bioequivalence of Oratane to Roaccutane® from multiple sources. This study compared the bioavailability of Oratane with that of Mexican-sourced Roaccutan and Australian-sourced Roaccutane. Study participants received each of the three agents in a randomized, open-label, 6-sequence, 3-way crossover study with a 2-week washout period between treatments. Pharmacokinetic analysis revealed that peak plasma concentration (Cmax) and area under the plasma concentration-time curve from time 0 (dosing) to infinite time (AUC0-∞) were lower for Roaccutan than for Roaccutane and Oratane (Cmax: 1,023.35, 1,223.08, and 1,224.25 ng/mL, respectively; AUC0-∞: 13,653.65, 15,681.35, and 15,733.55 ng/mL × h, respectively). The 90% CIs (test/reference) for the ratios of the geometric means indicated that Oratane was bioequivalent to Roaccutane but not to Roaccutan. In addition, Roaccutane (R2) was not bioequivalent to Roaccutan (R1; R1/R2 90% CIs: Cmax, 76.12 - 91.04; AUC0-t, 82.19 - 91.13; AUC0-∞, 82.94 - 91.57). Oratane and Australian-sourced Roaccutane could be considered bioequivalent, but neither formulation was found to be bioequivalent to Mexican-sourced Roaccutan.
An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.
Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D
2016-05-01
Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. 
Particularly, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be avoided. Data, extraction algorithms and evaluation routines were released as part of the fecgsyn toolbox on PhysioNet under a GNU GPL open-source license. This contribution provides a standard framework for benchmarking and regulatory testing of NI-FECG extraction algorithms.
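As a minimal illustration of the template subtraction (TS) family evaluated above (a sketch under assumed conventions, not the fecgsyn implementation): a maternal QRS template is averaged over the detected maternal peaks and subtracted at each peak, leaving a residual that retains the foetal contribution.

```python
import numpy as np

def template_subtract(signal, maternal_peaks, half_width):
    """Average a window around each maternal R-peak to build a QRS
    template, then subtract that template at every peak location."""
    windows = [signal[p - half_width:p + half_width] for p in maternal_peaks
               if p - half_width >= 0 and p + half_width <= len(signal)]
    template = np.mean(windows, axis=0)
    residual = np.asarray(signal, dtype=float).copy()
    for p in maternal_peaks:
        if p - half_width >= 0 and p + half_width <= len(signal):
            residual[p - half_width:p + half_width] -= template
    return residual
```

If the maternal beat were identical at every peak, the residual at those locations would vanish exactly; in real abdominal recordings the residual is foetal-dominated noise, which is what the FQRS detectors then operate on.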
Oranusi, C K; Nwofor, Ame; Oranusi, I O
2012-01-01
Traditional open prostatectomy, either transvesical or retropubic, remains the reference standard for managing benign prostatic enlargement in some centers, especially in developing countries. The comparison of complication rates between the various types of open prostatectomy is usually a source of significant debate among urologists, often with conflicting results. The Clavien-Dindo classification system is an excellent attempt at standardizing the reporting of complications associated with surgery. We retrospectively reviewed the records of patients who had open transvesical prostatectomy (TVP) in three specialist urology centers in Anambra state, Southeast Nigeria, over a period of 5 years (January 2004-December 2009), with the aim of documenting medical and surgical complications arising from open TVP. These complications were then categorized according to the Clavien-Dindo system. A total of 362 patients had open TVP over the period under review. Of this number, 145 had documented evidence of complications. The mean age of the patients was 66.3 years (SD 9.4 years; range 49-96 years). The mean follow-up period was 27.8 months (SD 12.6 months; range 6-33 months). The overall complication rate for open TVP in this study was 40.1% (145/362). Complication rates for grades I, Id, II, IIIa, and IIIb were 0.8%, 0.6%, 35.1%, 0.6%, and 3.0%, respectively. Most complications of open TVP occur in the early postoperative period. Open TVP remains a valid surgical option in a contemporary environment where advanced techniques for transurethral resection of the prostate and laparoscopic prostatectomy are unavailable. Most complications occur in the early postoperative period, with bleeding requiring transfusion of several units of blood being the commonest complication. This should be explained to patients during preoperative counselling.
Open source tools and toolkits for bioinformatics: significance, and where are we?
Stajich, Jason E; Lapp, Hilmar
2006-09-01
This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.
Open Source 2010: Reflections on 2007
ERIC Educational Resources Information Center
Wheeler, Brad
2007-01-01
Colleges and universities and commercial firms have demonstrated great progress in realizing the vision proffered for "Open Source 2007," and 2010 will mark even greater progress. Although much work remains in refining open source for higher education applications, the signals are now clear: the collaborative development of software can provide…
Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments
ERIC Educational Resources Information Center
Wang, Shuo; Wang, Jing; Gao, Yanjing
2017-01-01
An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…
Creating Open Source Conversation
ERIC Educational Resources Information Center
Sheehan, Kate
2009-01-01
Darien Library, where the author serves as head of knowledge and learning services, launched a new website on September 1, 2008. The website is built with Drupal, an open source content management system (CMS). In this article, the author describes how she and her colleagues overhauled the library's website to provide an open source content…
Integrating an Automatic Judge into an Open Source LMS
ERIC Educational Resources Information Center
Georgouli, Katerina; Guerreiro, Pedro
2011-01-01
This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. Although it was originally designed for programming competitions, Mooshak has also…
76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...
The open-source movement: an introduction for forestry professionals
Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove
2005-01-01
In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....
Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.
ERIC Educational Resources Information Center
Newby, Gregory B.; Greenberg, Jane; Jones, Paul
2003-01-01
Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)
Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat.
Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart
2015-04-21
Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has gained attention in recent years, after proving clinically more effective than conventional DBS in controlling pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation.
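The control law described reduces to a per-window threshold decision: stimulate only while the theta-band power estimate (the locomotion proxy) is elevated. A minimal sketch of that logic, not the authors' Arduino firmware; the names and threshold are illustrative:

```python
def closed_loop_stim(theta_power, threshold):
    """One on/off stimulation decision per analysis window: stimulate
    only while the theta-band power estimate exceeds the threshold."""
    return [p > threshold for p in theta_power]

def duty_cycle(decisions):
    """Fraction of windows in which stimulation was delivered."""
    return sum(decisions) / len(decisions)
```

Open-loop DBS corresponds to a duty cycle of 1.0; the closed-loop scheme stimulates only in high-theta windows, which is how an average stimulation saving like the reported 56% can arise.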
Adaptive sequential controller
El-Sharkawi, Mohamed A.; Xing, Jian; Butler, Nicholas G.; Rodriguez, Alonso
1994-01-01
An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.
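The adaptive adjustment described (modifying a compensation time from the error observed at the last switching event) can be sketched as a first-order update. This is a hypothetical illustration of the principle, not the patent's circuit; the gain and the timing model are assumptions:

```python
def update_compensation(comp_ms, target_ms, actual_ms, gain=0.5):
    """Nudge the pre-trigger compensation toward the breaker's true
    mechanical latency, using the last observed switching error."""
    return comp_ms + gain * (actual_ms - target_ms)
```

Triggering the breaker at the target zero-crossing time minus the compensation, the update converges geometrically to the breaker's latency (including drift from aging and ambient conditions), so subsequent operations land on the voltage or current zero crossing.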
ERIC Educational Resources Information Center
Guhlin, Miguel
2007-01-01
A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…
What Is the Reference? An Examination of Alternatives to the Reference Sources Used in IES TM-30-15
DOE Office of Scientific and Technical Information (OSTI.GOV)
Royer, Michael P.
A study was undertaken to document the role of the reference illuminant in the IES TM-30-15 method for evaluating color rendition. TM-30-15 relies on a relative reference scheme; that is, the reference illuminant and test source always have the same correlated color temperature (CCT). The reference illuminant is a Planckian radiator, model of daylight, or combination of those two, depending on the exact CCT of the test source. Three alternative reference schemes were considered: 1) using either all Planckian radiators or all daylight models; 2) using only one of ten possible illuminants (Planckian, daylight, or equal energy), regardless of the CCT of the test source; 3) using an off-Planckian reference illuminant (i.e., a source with a negative Duv). No reference scheme is inherently superior to another, with differences in metric values largely a result of small differences in gamut shape of the reference alternatives. While using any of the alternative schemes is more reasonable in the TM-30-15 evaluation framework than it was with the CIE CRI framework, the differences still ultimately manifest only as changes in interpretation of the results. References are employed in color rendering measures to provide a familiar point of comparison, not to establish an ideal source.
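For reference, the Planckian member of TM-30's reference scheme is simply Planck's law evaluated across wavelength at the test source's CCT. A minimal sketch in relative units (an illustration, not TM-30 reference code):

```python
import math

def planckian_spd(wavelength_nm, temp_k):
    """Spectral radiance of a blackbody (Planck's law), arbitrary scale."""
    lam = wavelength_nm * 1e-9  # metres
    h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23
    return (2.0 * h * c**2 / lam**5) / (math.exp(h * c / (lam * k * temp_k)) - 1.0)
```

Consistent with Wien's displacement law, at 3000 K this curve peaks near 966 nm, in the infrared; it is the visible-range shape of the curve that the color-rendition comparison actually uses.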
Bhardwaj, Anshu; Scaria, Vinod; Raghava, Gajendra Pal Singh; Lynn, Andrew Michael; Chandra, Nagasuma; Banerjee, Sulagna; Raghunandanan, Muthukurussi V; Pandey, Vikas; Taneja, Bhupesh; Yadav, Jyoti; Dash, Debasis; Bhattacharya, Jaijit; Misra, Amit; Kumar, Anil; Ramachandran, Srinivasan; Thomas, Zakir; Brahmachari, Samir K
2011-09-01
It is being realized that the traditional closed-door, market-driven approach to drug discovery may not be the best-suited model for diseases of the developing world such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for patients suffering from these diseases, it is necessary to formulate an alternate paradigm for the drug discovery process. The current model, constrained by limitations on collaboration and on sharing resources under confidentiality, hampers the opportunity to bring in expertise from diverse fields, and these limitations hinder the possibility of lowering the cost of drug discovery. The Open Source Drug Discovery project initiated by the Council of Scientific and Industrial Research, India has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open sharing, multi-faceted approaches and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives in the continued pursuit of new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trials in a non-exclusive manner by multiple participating companies, with majority funding from Open Source Drug Discovery. This will ensure availability of drugs through a lower-cost, community-driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully, what Linux and the World Wide Web have done for information technology, Open Source Drug Discovery will do for drug discovery. Copyright © 2011 Elsevier Ltd. All rights reserved.
Prediction of aerodynamic tonal noise from open rotors
NASA Astrophysics Data System (ADS)
Sharma, Anupam; Chen, Hsuan-nien
2013-08-01
A numerical approach for predicting tonal aerodynamic noise from "open rotors" is presented. "Open rotor" refers to an engine architecture with a pair of counter-rotating propellers. Typical noise spectra from an open rotor consist of dominant tones, which arise due to both the steady loading/thickness and the aerodynamic interaction between the two blade rows. The proposed prediction approach utilizes Reynolds Averaged Navier-Stokes (RANS) Computational Fluid Dynamics (CFD) simulations to obtain a near-field description of the noise sources. The near-to-far-field propagation is then carried out by solving the Ffowcs Williams-Hawkings equation. Since the interest of this paper is limited to tone noise, a linearized, frequency domain approach is adopted to solve the wake/vortex-blade interaction problem. This paper focuses primarily on the speed scaling of the aerodynamic tonal noise from open rotors. Even though there is no theoretical mode cut-off due to the absence of a nacelle in open rotors, the far-field noise is a strong function of the azimuthal mode order. While the steady loading/thickness noise has circumferential modes of high order, due to the relatively large number of blades (≈10-12), the interaction noise typically has modes of small orders. The high mode orders have very low radiation efficiency and exhibit very strong scaling with Mach number, while the low mode orders show a relatively weaker scaling. The prediction approach is able to capture the speed scaling (observed in experiment) of the overall aerodynamic noise very well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gomez, John A.; Henderson, Thomas M.; Scuseria, Gustavo E.
Restricted single-reference coupled cluster theory truncated to single and double excitations accurately describes weakly correlated systems, but often breaks down in the presence of static or strong correlation. Good coupled cluster energies in the presence of degeneracies can be obtained by using a symmetry-broken reference, such as unrestricted Hartree-Fock, but at the cost of good quantum numbers. A large body of work has shown that modifying the coupled cluster ansatz allows for the treatment of strong correlation within a single-reference, symmetry-adapted framework. The recently introduced singlet-paired coupled cluster doubles (CCD0) method is one such model, which recovers correct behavior for strong correlation without requiring symmetry breaking in the reference. Here, we extend singlet-paired coupled cluster for application to open shells via restricted open-shell singlet-paired coupled cluster singles and doubles (ROCCSD0). The ROCCSD0 approach retains the benefits of standard coupled cluster theory and recovers correct behavior for strongly correlated, open-shell systems using a spin-preserving ROHF reference.
State-of-the-practice and lessons learned on implementing open data and open source policies.
DOT National Transportation Integrated Search
2012-05-01
This report describes the current government, academic, and private sector practices associated with open data and open source application development. These practices are identified; and the potential uses with the ITS Programs Data Capture and M...
Your Personal Analysis Toolkit - An Open Source Solution
NASA Astrophysics Data System (ADS)
Mitchell, T.
2009-12-01
Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!
All-source Information Management and Integration for Improved Collective Intelligence Production
2011-06-01
Intelligence (ELINT) • Open Source Intelligence (OSINT) • Technical Intelligence (TECHINT) These intelligence disciplines produce... intelligence, measurement and signature intelligence, signals intelligence, and open-source data, in the production of intelligence. All-source intelligence ...All-Source Information Integration and Management) R&D Project 3 All-Source Intelligence
Automated generation and ensemble-learned matching of X-ray absorption spectra
NASA Astrophysics Data System (ADS)
Zheng, Chen; Mathew, Kiran; Chen, Chi; Chen, Yiming; Tang, Hanmei; Dozier, Alan; Kas, Joshua J.; Vila, Fernando D.; Rehr, John J.; Piper, Louis F. J.; Persson, Kristin A.; Ong, Shyue Ping
2018-12-01
X-ray absorption spectroscopy (XAS) is a widely used materials characterization technique to determine oxidation states, coordination environment, and other local atomic structure information. Analysis of XAS relies on comparison of measured spectra to reliable reference spectra. However, existing databases of XAS spectra are highly limited both in terms of the number of reference spectra available and the breadth of chemistry coverage. In this work, we report the development of XASdb, a large database of computed reference XAS, and an Ensemble-Learned Spectra IdEntification (ELSIE) algorithm for the matching of spectra. XASdb currently hosts more than 800,000 K-edge X-ray absorption near-edge spectra (XANES) for over 40,000 materials from the open-science Materials Project database. We discuss a high-throughput automation framework for FEFF calculations, built on robust, rigorously benchmarked parameters. FEFF is a computer program that uses a real-space Green's function approach to calculate X-ray absorption spectra. We demonstrate that the ELSIE algorithm, which combines 33 weak "learners," each comprising a set of preprocessing steps and a similarity metric, can achieve up to 84.2% accuracy in identifying the correct oxidation state and coordination environment of a test set of 19 K-edge XANES spectra encompassing a diverse range of chemistries and crystal structures. XASdb, together with the ELSIE algorithm, has been integrated into a web application in the Materials Project, providing an important new public resource for the analysis of XAS to all materials researchers. Finally, the ELSIE algorithm itself has been made available as part of veidt, an open source machine-learning library for materials science.
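The ensemble idea described above, many weak "learners," each pairing a preprocessing step with a similarity metric and voting on the best-matching reference, can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the actual ELSIE code; the specific preprocessing steps, metrics, and function names are ours.

```python
import numpy as np

def preprocess_none(y):
    return y

def preprocess_norm(y):
    # scale spectrum to unit maximum
    return y / np.max(y)

def preprocess_deriv(y):
    # first derivative emphasizes edge features
    return np.gradient(y)

def sim_cosine(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def sim_negdist(a, b):
    # negative Euclidean distance: larger means more similar
    return -np.linalg.norm(a - b)

# a "weak learner" is one preprocessing step paired with one similarity metric
LEARNERS = [(p, s) for p in (preprocess_none, preprocess_norm, preprocess_deriv)
                   for s in (sim_cosine, sim_negdist)]

def ensemble_match(query, references):
    """Return the reference label chosen by majority vote over all learners."""
    votes = {}
    for prep, sim in LEARNERS:
        q = prep(query)
        best = max(references, key=lambda lbl: sim(prep(references[lbl]), q))
        votes[best] = votes.get(best, 0) + 1
    return max(votes, key=votes.get)
```

Because each learner sees the spectra through a different lens, the vote is more robust to noise and baseline effects than any single metric alone.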
Designing EEG Neurofeedback Procedures to Enhance Open-Ended versus Closed-Ended Creative Potentials
ERIC Educational Resources Information Center
Lin, Wei-Lun; Shih, Yi-Ling
2016-01-01
Recent empirical evidence demonstrated that open-ended creativity (which refers to creativity measures that require various and numerous responses, such as divergent thinking) correlated with alpha brain wave activation, whereas closed-ended creativity (which refers to creativity measures that ask for one final correct answer, such as insight…
A Federated Reference Structure for Open Informational Ecosystems
ERIC Educational Resources Information Center
Heinen, Richard; Kerres, Michael; Scharnberg, Gianna; Blees, Ingo; Rittberger, Marc
2016-01-01
The paper describes the concept of a federated ecosystem for Open Educational Resources (OER) in the German education system. Here, a variety of OER repositories (ROER) (Muuß-Merholz & Schaumburg, 2014) and reference platforms have been established in the recent past. In order to develop this ecosystem, not only are metadata standards…
40 CFR 60.711 - Definitions, symbols, and cross reference tables.
Code of Federal Regulations, 2011 CFR
2011-07-01
... audio or video recording or information storage. (14) Natural draft opening means any opening in a room... control device. (18) Utilize refers to the use of solvent that is delivered to coating mix preparation... participate in atmospheric photochemical reactions or that are measured by Method 18, 24, 25, or 25A or an...
40 CFR 60.711 - Definitions, symbols, and cross reference tables.
Code of Federal Regulations, 2013 CFR
2013-07-01
... audio or video recording or information storage. (14) Natural draft opening means any opening in a room... control device. (18) Utilize refers to the use of solvent that is delivered to coating mix preparation... participate in atmospheric photochemical reactions or that are measured by Method 18, 24, 25, or 25A or an...
40 CFR 60.711 - Definitions, symbols, and cross reference tables.
Code of Federal Regulations, 2012 CFR
2012-07-01
... audio or video recording or information storage. (14) Natural draft opening means any opening in a room... control device. (18) Utilize refers to the use of solvent that is delivered to coating mix preparation... participate in atmospheric photochemical reactions or that are measured by Method 18, 24, 25, or 25A or an...
40 CFR 60.711 - Definitions, symbols, and cross reference tables.
Code of Federal Regulations, 2010 CFR
2010-07-01
... audio or video recording or information storage. (14) Natural draft opening means any opening in a room... control device. (18) Utilize refers to the use of solvent that is delivered to coating mix preparation... participate in atmospheric photochemical reactions or that are measured by Method 18, 24, 25, or 25A or an...
40 CFR 60.711 - Definitions, symbols, and cross reference tables.
Code of Federal Regulations, 2014 CFR
2014-07-01
... audio or video recording or information storage. (14) Natural draft opening means any opening in a room... control device. (18) Utilize refers to the use of solvent that is delivered to coating mix preparation... participate in atmospheric photochemical reactions or that are measured by Method 18, 24, 25, or 25A or an...
Towards a framework for developing semantic relatedness reference standards.
Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G
2011-04-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available, and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics, including automatic classification, information retrieval from medical records, and vocabulary/ontology development. Copyright © 2010 Elsevier Inc. All rights reserved.
A new phase-correlation-based iris matching for degraded images.
Krichen, Emine; Garcia-Salicetti, Sonia; Dorizzi, Bernadette
2009-08-01
In this paper, we present a new phase-correlation-based iris matching approach in order to deal with degradations in iris images due to unconstrained acquisition procedures. Our matching system is a fusion of global and local Gabor phase-correlation schemes. The main originality of our local approach is that we consider not only the correlation peak amplitudes but also their locations in different regions of the images. Results on several degraded databases, namely the CASIA-BIOSECURE and Iris Challenge Evaluation 2005 databases, show the improvement of our method compared to two available reference systems, Masek and Open Source for IRIS (OSIRIS), in verification mode.
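The global building block of such schemes, phase correlation, can be sketched with plain FFTs: normalize the cross-power spectrum so only phase remains, and the inverse transform concentrates into a peak whose location gives the shift and whose amplitude measures match quality. This is a generic illustration, not the authors' fused Gabor-based system; the function name is ours.

```python
import numpy as np

def phase_correlation(a, b):
    """Locate the circular shift d with b[n] ≈ a[n - d]; returns (peak_value, d)."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    # cross-power spectrum, normalized so only phase information remains
    R = np.conj(A) * B
    R /= np.abs(R) + 1e-12
    corr = np.fft.ifft2(R).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return corr[peak], tuple(int(i) for i in peak)
```

For a perfectly shifted copy the peak value is close to 1.0; degraded or mismatched images yield a lower, more diffuse peak, which is what makes the amplitude usable as a match score.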
Integrated cluster management at Manchester
NASA Astrophysics Data System (ADS)
McNab, Andrew; Forti, Alessandra
2012-12-01
We describe an integrated management system using third-party, open source components used in operating a large Tier-2 site for particle physics. This system tracks individual assets and records their attributes such as MAC and IP addresses; derives DNS and DHCP configurations from this database; creates each host's installation and re-configuration scripts; monitors the services on each host according to the records of what should be running; and cross references tickets with asset records and per-asset monitoring pages. In addition, scripts which detect problems and automatically remove hosts record these new states in the database which are available to operators immediately through the same interface as tickets and monitoring.
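The derivation step above, turning asset records into DHCP and DNS configuration, can be sketched like this. The field names, zone name, and record layout are illustrative assumptions, not the site's actual schema.

```python
# minimal sketch: derive ISC dhcpd host stanzas and DNS A records from asset records
ASSETS = [
    {"host": "node001", "mac": "00:16:3e:aa:bb:01", "ip": "192.168.10.11"},
    {"host": "node002", "mac": "00:16:3e:aa:bb:02", "ip": "192.168.10.12"},
]

def dhcp_stanza(asset):
    # one host block per asset, pinning its IP to its MAC address
    return (f"host {asset['host']} {{\n"
            f"  hardware ethernet {asset['mac']};\n"
            f"  fixed-address {asset['ip']};\n"
            f"}}")

def dns_a_record(asset, zone="example.org."):
    # forward A record for the zone file; the zone name is a placeholder
    return f"{asset['host']}.{zone} IN A {asset['ip']}"

config = "\n".join(dhcp_stanza(a) for a in ASSETS)
```

Generating both files from the same asset database is what keeps DHCP, DNS, and monitoring mutually consistent: there is a single source of truth per host.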
A universal heliostat control system
NASA Astrophysics Data System (ADS)
Gross, Fabian; Geiger, Mark; Buck, Reiner
2017-06-01
This paper describes the development of a universal heliostat control system as part of the AutoR project [1]. The system can control multiple receivers and heliostat types in a single application. The system offers support for multiple operators on different machines and is designed to be as adaptive as possible. Thus, the system can be used for different heliostat field setups with only minor adaptations of the system's source code. This is achieved by extensive usage of modern programming techniques like reflection and dependency injection. Furthermore, the system features co-simulation of a ray tracer, a reference PID-controller implementation for open volumetric receivers and methods for heliostat calibration and monitoring.
Getting Open Source Software into Schools: Strategies and Challenges
ERIC Educational Resources Information Center
Hepburn, Gary; Buley, Jan
2006-01-01
In this article Gary Hepburn and Jan Buley outline different approaches to implementing open source software (OSS) in schools; they also address the challenges that open source advocates should anticipate as they try to convince educational leaders to adopt OSS. With regard to OSS implementation, they note that schools have a flexible range of…
Open Source Library Management Systems: A Multidimensional Evaluation
ERIC Educational Resources Information Center
Balnaves, Edmund
2008-01-01
Open source library management systems have improved steadily in the last five years. They now present a credible option for small to medium libraries and library networks. An approach to their evaluation is proposed that takes account of three additional dimensions that only open source can offer: the developer and support community, the source…
Open Source as Appropriate Technology for Global Education
ERIC Educational Resources Information Center
Carmichael, Patrick; Honour, Leslie
2002-01-01
Economic arguments for the adoption of "open source" software in business have been widely discussed. In this paper we draw on personal experience in the UK, South Africa and Southeast Asia to forward compelling reasons why open source software should be considered as an appropriate and affordable alternative to the currently prevailing…
Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software
ERIC Educational Resources Information Center
Hemphill, Thomas A.
2005-01-01
This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…
Open Source Communities in Technical Writing: Local Exigence, Global Extensibility
ERIC Educational Resources Information Center
Conner, Trey; Gresham, Morgan; McCracken, Jill
2011-01-01
By offering open-source software (OSS)-based networks as an affordable technology alternative, we partnered with a nonprofit community organization. In this article, we narrate the client-based experiences of this partnership, highlighting the ways in which OSS and open-source culture (OSC) transformed our students' and our own expectations of…
2015-06-01
ground.aspx?p=1 Texas Tech Security Group, "Automated Open Source Intelligence (OSINT) Using APIs." RaiderSec, Sunday 30 December 2012, http...Open Source Intelligence (OSINT) Using APIs," RaiderSec, Sunday 30 December 2012, http://raidersec.blogspot.com/2012/12/automated-open-source
Open-Source Unionism: New Workers, New Strategies
ERIC Educational Resources Information Center
Schmid, Julie M.
2004-01-01
In "Open-Source Unionism: Beyond Exclusive Collective Bargaining," published in fall 2002 in the journal Working USA, labor scholars Richard B. Freeman and Joel Rogers use the term "open-source unionism" to describe a form of unionization that uses Web technology to organize in hard-to-unionize workplaces. Rather than depend on the traditional…
Perceptions of Open Source versus Commercial Software: Is Higher Education Still on the Fence?
ERIC Educational Resources Information Center
van Rooij, Shahron Williams
2007-01-01
This exploratory study investigated the perceptions of technology and academic decision-makers about open source benefits and risks versus commercial software applications. The study also explored reactions to a concept for outsourcing campus-wide deployment and maintenance of open source. Data collected from telephone interviews were analyzed,…
Open Source for Knowledge and Learning Management: Strategies beyond Tools
ERIC Educational Resources Information Center
Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.
2007-01-01
In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…
Open-Source Learning Management Systems: A Predictive Model for Higher Education
ERIC Educational Resources Information Center
van Rooij, S. Williams
2012-01-01
The present study investigated the role of pedagogical, technical, and institutional profile factors in an institution of higher education's decision to select an open-source learning management system (LMS). Drawing on the results of previous research that measured patterns of deployment of open-source software (OSS) in US higher education and…
ERIC Educational Resources Information Center
Rodriguez-Sanchez, M. C.; Torrado-Carvajal, Angel; Vaquero, Joaquin; Borromeo, Susana; Hernandez-Tamames, Juan A.
2016-01-01
This paper presents a case study analyzing the advantages and disadvantages of using project-based learning (PBL) combined with collaborative learning (CL) and industry best practices, integrated with information communication technologies, open-source software, and open-source hardware tools, in a specialized microcontroller and embedded systems…
Technology collaboration by means of an open source government
NASA Astrophysics Data System (ADS)
Berardi, Steven M.
2009-05-01
The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.
Open source software integrated into data services of Japanese planetary explorations
NASA Astrophysics Data System (ADS)
Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.
2015-12-01
Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data with a simple method such as HTTP directory listing for long-term preservation, while also trying to provide rich web applications for ease of access with modern web technologies based on open source software. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS). As a WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data. The main purpose of this application is public outreach. NASA World Wind Java SDK was used for its development. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations. It uses Highcharts to draw graphs on web browsers. FLOW is a tool to simulate the field of view of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS and released under the BSD 3-Clause License. The SPICE Toolkit is essential to compile FLOW. The SPICE Toolkit is also open source software, developed by NASA/JPL, and its website distributes many spacecrafts' data. Nowadays, open source software is an indispensable tool to integrate DARTS services.
Embracing Open Source for NASA's Earth Science Data Systems
NASA Technical Reports Server (NTRS)
Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin
2017-01-01
The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when resulting from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as to the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate, and analyze them. This talk focuses on the challenges of open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies, running the gamut from tracking issues and properly documenting build processes to engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions, and our contributions back to the community. Finally, we introduce the most recent OSS contributions from the NASA Earth Science program and promote these projects for wider community review and adoption.
Open source Modeling and optimization tools for Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peles, S.
The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.
Limitations of Phased Array Beamforming in Open Rotor Noise Source Imaging
NASA Technical Reports Server (NTRS)
Horvath, Csaba; Envia, Edmane; Podboy, Gary G.
2013-01-01
Phased array beamforming results of the F31/A31 historical baseline counter-rotating open rotor blade set were investigated for measurement data taken on the NASA Counter-Rotating Open Rotor Propulsion Rig in the 9- by 15-Foot Low-Speed Wind Tunnel of NASA Glenn Research Center as well as data produced using the LINPROP open rotor tone noise code. The planar microphone array was positioned broadside and parallel to the axis of the open rotor, roughly 2.3 rotor diameters away. The results provide insight as to why the apparent noise sources of the blade passing frequency tones and interaction tones appear at their nominal Mach radii instead of at the actual noise sources, even if those locations are not on the blades. Contour maps corresponding to the sound fields produced by the radiating sound waves, taken from the simulations, are used to illustrate how the interaction patterns of circumferential spinning modes of rotating coherent noise sources interact with the phased array, often giving misleading results, as the apparent sources do not always show where the actual noise sources are located. This suggests that a more sophisticated source model would be required to accurately locate the sources of each tone. The results of this study also have implications with regard to the shielding of open rotor sources by airframe empennages.
McPherson, Malcolm J.; Bellman, Robert A.
1984-01-01
A precision manometer gauge which locates a zero height and a measured height of liquid using an open tube in communication with a reservoir adapted to receive the pressure to be measured. The open tube has a reference section carried on a positioning plate which is moved vertically with machine tool precision. Double scales are provided to read the height of the positioning plate accurately, the reference section being inclined for accurate meniscus adjustment, and means being provided to accurately locate a zero or reference position.
McPherson, M.J.; Bellman, R.A.
1982-09-27
A precision manometer gauge which locates a zero height and a measured height of liquid using an open tube in communication with a reservoir adapted to receive the pressure to be measured. The open tube has a reference section carried on a positioning plate which is moved vertically with machine tool precision. Double scales are provided to read the height of the positioning plate accurately, the reference section being inclined for accurate meniscus adjustment, and means being provided to accurately locate a zero or reference position.
Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform
NASA Astrophysics Data System (ADS)
Liu, H. S.; Liao, H. M.
2015-08-01
A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to calculate positioning properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate imagery, coordinates, and camera position; however, such systems are very expensive, and users cannot use the results immediately because the position information is not embedded in the images. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can calculate positioning with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate all of these components into the open source GIS software Quantum GIS. In this way, a complete data collection and processing system can be constructed.
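One core step in such a system is synchronizing camera exposures with the GPS track so that each image receives a position. A minimal sketch, assuming timestamps on a common clock and linear interpolation between fixes (the function name and sample track are ours, not the paper's):

```python
# sketch: interpolate GPS track positions at camera exposure timestamps
def interpolate_position(track, t):
    """track: list of (time, lat, lon) sorted by time; t: exposure time."""
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0))
    raise ValueError("exposure time outside GPS track")

track = [(0.0, 24.000, 121.000), (10.0, 24.010, 121.020)]
lat, lon = interpolate_position(track, 5.0)   # halfway along the segment
```

Hardware-triggered timestamps (e.g. from a microcontroller logging both the GPS pulse and the camera shutter signal) keep the clocks aligned well enough for this interpolation to be meaningful.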
ERIC Educational Resources Information Center
Masten, Lisa
This annotated bibliography provides a selected list of marketing reference sources for undergraduate and graduate business students interested in marketing and related topics. All sources listed are available in the Reference Department at the University Library at the University of Rhode Island Kingston campus. Most sources, with the exception…
Development of an Open Source, Air-Deployable Weather Station
NASA Astrophysics Data System (ADS)
Krejci, A.; Lopez Alcala, J. M.; Nelke, M.; Wagner, J.; Udell, C.; Higgins, C. W.; Selker, J. S.
2017-12-01
We created a packaged weather station intended to be deployed in the air on tethered systems. The device incorporates lightweight sensors and parts and runs for up to 24 hours off of lithium polymer batteries, allowing the entire package to be supported by a thin fiber. Because the fiber does not provide a stable platform, attitude data (pitch and roll) are determined with an embedded inertial measurement unit, in addition to the typical weather parameters (e.g. temperature, pressure, humidity, wind speed, and wind direction). All designs are open sourced, including electronics, CAD drawings, and descriptions of assembly, and can be found on the OPEnS lab website at http://www.open-sensing.org/lowcost-weather-station/. The Openly Published Environmental Sensing Lab (OPEnS: Open-Sensing.org) expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. New OPEnS labs are now being established in India, France, Switzerland, the Netherlands, and Ghana.
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming...video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to...DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana L. Kelly
Typical engineering systems in applications with high failure consequences, such as nuclear reactor plants, often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
Diffusion spectral imaging modules correlate with EEG LORETA neuroimaging modules.
Thatcher, Robert W; North, Duane M; Biver, Carl J
2012-05-01
The purpose of this study was to test the hypothesis that the highest temporal correlations between 3-dimensional EEG current source densities correspond to anatomical Modules of high synaptic connectivity. Eyes-closed and eyes-open EEG was recorded from 19 scalp locations with a linked-ears reference from 71 subjects aged 13-42 years. LORETA was computed from 1 to 30 Hz in 2,394 cortical gray matter voxels that were grouped into six anatomical Modules corresponding to the ROIs in the Hagmann et al. [2008] diffusion spectral imaging (DSI) study. All possible cross-correlations between voxels within a DSI Module were compared with the correlations between Modules. The Hagmann et al. [2008] Module correlation structure was replicated in the correlation structure of EEG three-dimensional current source density. EEG temporal correlation between brain regions is related to synaptic density as measured by diffusion spectral imaging. Copyright © 2011 Wiley-Liss, Inc.
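The within- versus between-module comparison reduces to: correlate every pair of signals, then average the pairs that share a module label separately from those that do not. A generic sketch of that computation (not the study's LORETA pipeline; names are ours):

```python
import numpy as np

def module_correlation_contrast(data, modules):
    """data: (n_signals, n_times) array; modules: one label per signal.
    Returns (mean within-module r, mean between-module r)."""
    r = np.corrcoef(data)
    n = len(modules)
    within, between = [], []
    for i in range(n):
        for j in range(i + 1, n):
            (within if modules[i] == modules[j] else between).append(r[i, j])
    return float(np.mean(within)), float(np.mean(between))
```

A within-module mean clearly above the between-module mean is the signature of a modular correlation structure like the one reported above.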
Consensus nomenclature rules for radiopharmaceutical chemistry — Setting the record straight
Coenen, Heinz H.; Gee, Antony D.; Adam, Michael; ...
2017-10-21
Over recent years, within the community of radiopharmaceutical sciences, there has been an increased incidence of incorrect usage of established scientific terms and conventions, and even the emergence of 'self-invented' terms. Here, in order to address these concerns, an international Working Group on 'Nomenclature in Radiopharmaceutical Chemistry and related areas' was established in 2015 to achieve clarification of terms and to generate consensus on the utilisation of a standardised nomenclature pertinent to the field. Upon open consultation, the following consensus guidelines were agreed, which aim to: Provide a reference source for nomenclature good practice in the radiopharmaceutical sciences; Clarify the use of terms and rules concerning exclusively radiopharmaceutical terminology, i.e. nuclear- and radiochemical terms, symbols and expressions; Address gaps and inconsistencies in existing radiochemistry nomenclature rules; Provide source literature for further harmonisation beyond our immediate peer group (publishers, editors, IUPAC, pharmacopoeias, etc.).
The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science
Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo
2008-01-01
The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570
Broadband infrared vibrational nano-spectroscopy using thermal blackbody radiation
O’Callahan, Brian T.; Lewis, William E.; Möbius, Silke; ...
2015-12-03
Infrared vibrational nano-spectroscopy based on scattering scanning near-field optical microscopy (s-SNOM) provides intrinsic chemical specificity with nanometer spatial resolution. Here we use incoherent infrared radiation from a 1400 K thermal blackbody emitter for broadband infrared (IR) nano-spectroscopy. With optimized interferometric heterodyne signal amplification we achieve few-monolayer sensitivity in phonon polariton spectroscopy and attomolar molecular vibrational spectroscopy. Near-field localization and nanoscale spatial resolution are demonstrated by imaging flakes of hexagonal boron nitride (hBN) and determining its phonon polariton dispersion relation. Signal-to-noise calculations and analysis for different samples and illumination sources provide a reference for irradiance requirements and the attainable near-field signal levels in s-SNOM in general. The use of a thermal emitter as an IR source thus opens s-SNOM for routine chemical FTIR nano-spectroscopy.
13Check_RNA: A tool to evaluate 13C chemical shifts assignments of RNA.
Icazatti, A A; Martin, O A; Villegas, M; Szleifer, I; Vila, J A
2018-06-19
Chemical shifts (CS) are an important source of structural information for macromolecules such as RNA. In addition to the scarce availability of CS for RNA, the observed values are prone to errors from incorrect re-calibration or misassignments. Different groups have dedicated their efforts to correcting systematic CS errors in RNA. Despite this, there are no automated, freely available algorithms to correct RNA 13C CS assignments before their deposition to the BMRB, or to re-reference already-deposited CS with systematic errors. Based on an existing method, we have implemented an open source Python module to correct systematic errors in the 13C CS (from here on 13Cexp) of RNAs and return the results in three formats, including NMR-STAR. This software is available on GitHub at https://github.com/BIOS-IMASL/13Check_RNA under an MIT license. Supplementary data are available at Bioinformatics online.
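The core idea behind re-referencing can be sketched in a few lines. This is a hypothetical illustration of the general approach, not the 13Check_RNA API: a systematic calibration error shifts all observed values by roughly the same amount, so the offset can be estimated as the median deviation from expected reference values and subtracted. The reference and observed shifts below are invented for illustration, not real RNA statistics.

```python
import statistics

# Invented reference 13C shifts (ppm) for a few ribose carbons, and
# observed values carrying a systematic calibration offset.
expected = {"C1'": 92.0, "C2'": 75.0, "C3'": 72.0, "C4'": 82.0}
observed = {"C1'": 94.1, "C2'": 77.2, "C3'": 74.0, "C4'": 84.1}

# Median deviation is robust to a few genuine outliers (misassignments).
offset = statistics.median(observed[k] - expected[k] for k in expected)
corrected = {k: round(v - offset, 2) for k, v in observed.items()}
```

A robust estimator such as the median matters here: a mean would be pulled away from the true offset by individual misassigned resonances.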
Rotatable spin-polarized electron source for inverse-photoemission experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stolwijk, S. D., E-mail: Sebastian.Stolwijk@wwu.de; Wortelen, H.; Schmidt, A. B.
2014-01-15
We present a ROtatable Spin-polarized Electron source (ROSE) for use in spin- and angle-resolved inverse-photoemission (SR-IPE) experiments. A key feature of the ROSE is the variable direction of the transverse electron beam polarization. As a result, the inverse-photoemission experiment becomes sensitive to two orthogonal in-plane polarization directions and, for non-normal electron incidence, to the out-of-plane polarization component. We characterize the ROSE and test its performance on the basis of SR-IPE experiments. Measurements on magnetized Ni films on W(110) serve as a reference to demonstrate the variable spin sensitivity. Moreover, investigations of the unoccupied spin-dependent surface electronic structure of Tl/Si(111) highlight the capability to analyze complex phenomena like spin rotations in momentum space. The ROSE thus opens the way to further studies of complex spin-dependent effects in the field of surface magnetism and spin-orbit interaction at surfaces.
Open source electronic health records and chronic disease management.
Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri
2014-02-01
To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status for homeless patients. The ability to modify the open source EHR to adapt it to the CHC environment, and to leverage the ecosystem of providers and users in this process, provided significant advantages in chronic care management. Improvements in diabetes management and hypertension control, and increases in tuberculosis vaccinations, were supported by the use of these open source systems. The flexibility and adaptability of open source EHR demonstrated their utility and viability in the provision of necessary chronic disease care among populations served by CHC.
Analysis of jet-airfoil interaction noise sources by using a microphone array technique
NASA Astrophysics Data System (ADS)
Fleury, Vincent; Davy, Renaud
2016-03-01
The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources by using microphone array data. The measurements were carried out in the anechoic open test section wind tunnel of Onera, Cepra19. The microphone array technique relies on the convected, Lighthill's and Ffowcs-Williams and Hawkings' acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought. It is defined as the optimal solution to a minimal-error equation using the measured microphone cross-spectra as reference. This inverse problem is, however, ill-posed. A penalty term based on a localization operator is therefore added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in the subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In the underexpanded supersonic regime, a shock-associated noise source is clearly identified, too. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified. In particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is observed, too. Indications of acoustic reflections on the airfoil are also discerned.
What an open source clinical trial community can learn from hackers
Dunn, Adam G.; Day, Richard O.; Mandl, Kenneth D.; Coiera, Enrico
2014-01-01
Summary Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. Since a similar gap has already been addressed in the software industry by the open source software movement, we examine how the social and technical principles of the movement can be used to guide the growth of an open source clinical trial community. PMID:22553248
ERIC Educational Resources Information Center
Ozdamli, Fezile
2007-01-01
Distance education is becoming more important in the universities and schools. The aim of this research is to evaluate the current existing Open Source Learning Management Systems according to Administration tool and Curriculum Design. For this, seventy two Open Source Learning Management Systems have been subjected to a general evaluation. After…
ERIC Educational Resources Information Center
Samuels, Ruth Gallegos; Griffy, Henry
2012-01-01
This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…
ERIC Educational Resources Information Center
Vlas, Radu Eduard
2012-01-01
Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…
ERIC Educational Resources Information Center
O'Connor, Eileen A.
2015-01-01
Opening with the history, recent advances, and emerging ways to use avatar-based virtual reality, an instructor who has used virtual environments since 2007 shares how these environments bring more options to community building, teaching, and education. With the open-source movement, where the source code for virtual environments was made…
ERIC Educational Resources Information Center
Wen, Wen
2012-01-01
While open source software (OSS) emphasizes open access to the source code and avoids the use of formal appropriability mechanisms, there has been little understanding of how the existence and exercise of formal intellectual property rights (IPR) such as patents influence the direction of OSS innovation. This dissertation seeks to bridge this gap…
Migrations of the Mind: The Emergence of Open Source Education
ERIC Educational Resources Information Center
Glassman, Michael; Bartholomew, Mitchell; Jones, Travis
2011-01-01
The authors describe an Open Source approach to education. They define Open Source Education (OSE) as a teaching and learning framework where the use and presentation of information is non-hierarchical, malleable, and subject to the needs and contributions of students as they become "co-owners" of the course. The course transforms itself into an…
ERIC Educational Resources Information Center
Waters, John K.
2010-01-01
Open source software is poised to make a profound impact on K-12 education. For years industry experts have been predicting the widespread adoption of open source tools by K-12 school districts. They're about to be proved right. The impact may not yet have been profound, but it's fair to say that some open source systems and non-proprietary…
7 Questions to Ask Open Source Vendors
ERIC Educational Resources Information Center
Raths, David
2012-01-01
With their budgets under increasing pressure, many campus IT directors are considering open source projects for the first time. On the face of it, the savings can be significant. Commercial emergency-planning software can cost upward of six figures, for example, whereas the open source Kuali Ready might run as little as $15,000 per year when…
ERIC Educational Resources Information Center
Heric, Matthew; Carter, Jenn
2011-01-01
Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…
O'Connell, Timothy; Chang, Debra
2012-01-01
While on call, radiology residents review imaging studies and issue preliminary reports to referring clinicians. In the absence of an integrated reporting system at the training sites of the authors' institution, residents were typing and faxing preliminary reports. To partially automate the on-call resident workflow, a Web-based system for resident reporting was developed by using the free open-source xAMP Web application framework and an open-source DICOM (Digital Imaging and Communications in Medicine) software toolkit, with the goals of reducing errors and lowering barriers to education. This reporting system integrates with the picture archiving and communication system to display a worklist of studies. Patient data are automatically entered in the preliminary report to prevent identification errors and simplify the report creation process. When the final report for a resident's on-call study is available, the reporting system queries the report broker for the final report, and then displays the preliminary report side by side with the final report, thus simplifying the review process and encouraging review of all of the resident's reports. The xAMP Web application framework should be considered for development of radiology department informatics projects owing to its zero cost, minimal hardware requirements, ease of programming, and large support community.
Description and User Instructions for the Quaternion_to_Orbit_v3 Software
NASA Technical Reports Server (NTRS)
Strekalov, Dmitry V.; Kruizinga, Gerhard L.; Paik, Meegyeong; Yuan, Dah-Ning; Asmar, Sami W.
2012-01-01
For a given inertial frame of reference, the software combines the spacecraft orbits with the spacecraft attitude quaternions, and rotates the body-fixed reference frame of a particular spacecraft to the inertial reference frame. The conversion assumes that the two spacecraft are aligned with respect to the mutual line of sight, with a parameterized time tag. The software is implemented in Python and is completely open source. It is very versatile, and may be applied under various circumstances and for other related purposes. Based on a solid linear algebra analysis, it has an extra option for compensating linear pitch. This software was designed for simulation of the calibration maneuvers performed by the two spacecraft comprising the GRAIL mission to the Moon, but has potential use for other applications. In simulations of formation flights, one needs to coordinate the spacecraft orbits, represented in an appropriate inertial reference frame, with the spacecraft attitudes. The latter are usually given as time series of quaternions rotating the body-fixed reference frame of a particular spacecraft to the inertial reference frame. It is often desirable to simulate the same maneuver for different segments of the orbit. It is also useful to study various maneuvers that could be performed at the same orbit segment. These two lines of study are more time- and labor-efficient if the attitude and orbit data are generated independently, so that the part of the data that has not been changed can be recycled in the course of multiple simulations.
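The quaternion rotation at the heart of such a conversion is standard: a vector expressed in the body-fixed frame is rotated into the inertial frame via the Hamilton product v' = q v q⁻¹. A minimal self-contained sketch (not the Quaternion_to_Orbit_v3 source, which is not shown in the text):

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rotate(q, v):
    """Rotate vector v by unit quaternion q: v' = q * (0, v) * q_conjugate."""
    qv = np.concatenate(([0.0], v))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, qv), q_conj)[1:]

# Attitude quaternion for a 90-degree rotation about the z axis:
# it maps the body +x axis onto the inertial +y axis.
theta = np.pi / 2
q = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])
v_body = np.array([1.0, 0.0, 0.0])
v_inertial = rotate(q, v_body)   # approximately [0, 1, 0]
```

Applying this per time tag to a quaternion time series rotates the whole body-fixed trajectory into the inertial frame.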
Open source IPSEC software in manned and unmanned space missions
NASA Astrophysics Data System (ADS)
Edwards, Jacob
Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth, high-latency communication links to test how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The tested open source IPsec software did not meet all the requirements. Software changes were suggested to meet the requirements.
Upon the Shoulders of Giants: Open-Source Hardware and Software in Analytical Chemistry.
Dryden, Michael D M; Fobel, Ryan; Fobel, Christian; Wheeler, Aaron R
2017-04-18
Isaac Newton famously observed that "if I have seen further it is by standing on the shoulders of giants." We propose that this sentiment is a powerful motivation for the "open-source" movement in scientific research, in which creators provide everything needed to replicate a given project online, as well as providing explicit permission for users to use, improve, and share it with others. Here, we write to introduce analytical chemists who are new to the open-source movement to best practices and concepts in this area and to survey the state of open-source research in analytical chemistry. We conclude by considering two examples of open-source projects from our own research group, with the hope that a description of the process, motivations, and results will provide a convincing argument about the benefits that this movement brings to both creators and users.
Open-Source 3-D Platform for Low-Cost Scientific Instrument Ecosystem.
Zhang, C; Wijnen, B; Pearce, J M
2016-08-01
The combination of open-source software and hardware provides technically feasible methods to create low-cost, highly customized scientific research equipment. Open-source 3-D printers have proven useful for fabricating scientific tools. Here the capabilities of an open-source 3-D printer are expanded to become a highly flexible scientific platform. An automated low-cost 3-D motion control platform is presented that has the capacity to perform scientific applications, including (1) 3-D printing of scientific hardware; (2) laboratory auto-stirring, measuring, and probing; (3) automated fluid handling; and (4) shaking and mixing. The open-source 3-D platform not only facilitates routine research while radically reducing its cost, but also inspires the creation of a diverse array of custom instruments that can be shared and replicated digitally throughout the world to drive down the cost of research and education further. © 2016 Society for Laboratory Automation and Screening.
OpenSesame: an open-source, graphical experiment builder for the social sciences.
Mathôt, Sebastiaan; Schreij, Daniel; Theeuwes, Jan
2012-06-01
In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality, such as support for eyetrackers, input devices, and video playback, is available through plug-ins. OpenSesame can be used in combination with existing software for creating experiments.
Limiting Magnitude, τ, t_eff, and Image Quality in DES Year 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
H. Neilsen, Jr.; Bernstein, Gary; Gruendl, Robert
The Dark Energy Survey (DES) is an astronomical imaging survey being completed with the DECam imager on the Blanco telescope at CTIO. After each night of observing, the DES data management (DM) group performs an initial processing of that night's data and uses the results to determine which exposures are of acceptable quality and which need to be repeated. The primary measure by which we declare an image of acceptable quality is τ, a scaling of the exposure time. This is the scale factor that must be applied to the open-shutter time to reach the same photometric signal-to-noise ratio for faint point sources under a set of canonical good conditions. These conditions are defined to be seeing resulting in a PSF full width at half maximum (FWHM) of 0.9" and a pre-defined sky brightness which approximates the zenith sky brightness under fully dark conditions. Point-source limiting magnitude and signal-to-noise should therefore vary with τ in the same way they vary with exposure time. Measurements of point sources and τ in the first year of DES data confirm that they do. In the context of DES, the symbol t_eff and the expression "effective exposure time" usually refer to the scaling factor τ rather than the actual effective exposure time; the "effective exposure time" in this case refers to the effective duration of one second, rather than the effective duration of an exposure.
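A τ-like scale factor can be sketched from first principles. This is a hedged illustration only: the exact DES formula is not given in the text, and terms such as atmospheric transparency are omitted. For background-limited faint point sources, signal-to-noise per unit exposure time degrades with the seeing area (∝ FWHM²) and with sky brightness, suggesting a scaling of the following plausible form:

```python
# Canonical good conditions from the text: PSF FWHM of 0.9 arcsec and a
# reference dark-sky brightness. The functional form below is an assumption,
# not the published DES definition.
FWHM_REF = 0.9  # arcsec

def tau(fwhm, sky, sky_ref):
    """Hypothetical depth reached per second of open shutter, relative to
    canonical conditions (background-limited point-source approximation)."""
    return (FWHM_REF / fwhm) ** 2 * (sky_ref / sky)

# Under canonical conditions tau == 1; doubling the seeing FWHM quarters it,
# and a brighter sky lowers it proportionally.
assert tau(0.9, 1.0, 1.0) == 1.0
```

Under this reading, an exposure with τ = 0.25 delivers the depth that a quarter of its open-shutter time would deliver under canonical conditions, which matches the text's description of τ as an exposure-time scaling.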
The Privacy and Security Implications of Open Data in Healthcare.
Kobayashi, Shinji; Kane, Thomas B; Paton, Chris
2018-04-22
The International Medical Informatics Association (IMIA) Open Source Working Group (OSWG) initiated a group discussion of current privacy and security issues in the open data movement in the healthcare domain from the perspective of the OSWG membership. Working group members independently reviewed the recent academic and grey literature and sampled a number of current large-scale open data projects to inform the discussion. This paper presents an overview of open data repositories and a series of short case reports to highlight relevant issues in the recent literature concerning the adoption of open approaches to sharing healthcare datasets. Important themes that emerged included data standardisation, the interconnected nature of the open source and open data movements, and how publishing open data can impact the ethics, security, and privacy of informatics projects. The open data and open source movements in healthcare share many common philosophies and approaches, including developing international collaborations across multiple organisations and domains of expertise. Both movements aim to reduce the costs of advancing scientific research and improving healthcare provision for people around the world by adopting open intellectual property licence agreements and codes of practice. Implications of the increased adoption of open data in healthcare include the need to balance the security and privacy challenges of opening data sources with the potential benefits of open data for improving research and healthcare delivery. Georg Thieme Verlag KG Stuttgart.
Simulation of partially coherent light propagation using parallel computing devices
NASA Astrophysics Data System (ADS)
Magalhães, Tiago C.; Rebordão, José M.
2017-08-01
Light acquires or loses coherence as it propagates, and coherence is one of the few optical observables. Spectra can be derived from coherence functions, and understanding any interferometric experiment also relies upon coherence functions. Beyond the two limiting cases (full coherence or incoherence), the coherence of light is always partial, and it changes with propagation. We have implemented a code to compute the propagation of partially coherent light from the source plane to the observation plane using parallel computing devices (PCDs). In this paper, we restrict the propagation to free space only. To this end, we used the Open Computing Language (OpenCL) and the open-source toolkit PyOpenCL, which gives access to OpenCL parallel computation through Python. To test our code, we chose two source coherence models: an incoherent source and a Gaussian Schell-model source. For the former, we considered two different source shapes: circular and rectangular. The results were compared to the theoretical values. Our implemented code allows one to choose between the PyOpenCL implementation and a standard one, i.e. using the CPU only. To test the computation time of each implementation (PyOpenCL and standard), we used several computer systems with different CPUs and GPUs. We used powers of two for the dimensions of the cross-spectral density matrix (e.g. 32⁴, 64⁴), and a significant speed increase is observed in the PyOpenCL implementation when compared to the standard one. This can be an important tool for studying new source models.
An Open Source Simulation System
NASA Technical Reports Server (NTRS)
Slack, Thomas
2005-01-01
An investigation into the current state of the art of open source real-time programming practices. This document covers what technologies are available; how easy they are to obtain, configure, and use; and some performance measures made on the different systems. A matrix of vendors and their products is included as part of this investigation, but it is not an exhaustive list and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches are investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, source-available proprietary software provided by the vendor for our evaluation.
NASA Astrophysics Data System (ADS)
Zheng, Sifa; Liu, Haitao; Dan, Jiabi; Lian, Xiaomin
2015-05-01
The linear time-invariant assumption for the determination of acoustic source characteristics, the source strength and the source impedance, in the frequency domain has been proved reasonable in the design of an exhaust system. Different methods have been proposed for its identification, and the multi-load method is widely used for its convenience, varying the load number and impedance. Theoretical error analysis has rarely been addressed, though previous results have shown that an overdetermined set of open pipes can reduce the identification error. This paper contributes a theoretical error analysis for the load selection. The relationships between the error in the identification of source characteristics and the load selection were analysed. A general linear time-invariant model was built based on the four-load method. To analyse the error of the source impedance, an error estimation function was proposed. The dispersion of the source pressure was obtained by an inverse calculation as an indicator of the accuracy of the results. It was found that for a certain load length, the load resistance at frequency points of odd multiples of one-quarter wavelength exhibits peaks and yields the maximum error in source impedance identification. Therefore, load impedances in the frequency ranges around odd multiples of the one-quarter wavelength should not be used for source impedance identification. If the selected loads have more similar resistance values (i.e., the same order of magnitude), the identification error of the source impedance can be effectively reduced.
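The multi-load identification at a single frequency can be sketched as an overdetermined least-squares problem. This is a hedged illustration of the general idea, not the paper's four-load formulation: modelling the source as a pressure divider, p_L = p_s Z_L / (Z_s + Z_L), the rearrangement Z_L·p_s − p_L·Z_s = p_L·Z_L is linear in the unknowns (p_s, Z_s), so measurements with several known loads give an overdetermined system. All values below are synthetic.

```python
import numpy as np

# Synthetic "true" source strength and source impedance (complex, one frequency).
p_s_true, Z_s_true = 2.0 + 1.0j, 1.0 + 0.5j

# Four known load impedances and the noise-free load pressures they produce.
Z_loads = np.array([0.5 + 0.2j, 1.5 - 0.3j, 3.0 + 1.0j, 0.8 + 0.9j])
p_meas = p_s_true * Z_loads / (Z_s_true + Z_loads)

# Each load contributes one row of Z_L * p_s - p_L * Z_s = p_L * Z_L.
A = np.column_stack([Z_loads, -p_meas])
b = p_meas * Z_loads
p_s_est, Z_s_est = np.linalg.lstsq(A, b, rcond=None)[0]
```

With noise-free data the least-squares solution recovers the true source characteristics exactly; with measurement noise, the paper's point is that the conditioning of `A`, and hence the identification error, depends strongly on which load impedances are selected.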
A clinic compatible, open source electrophysiology system.
Hermiz, John; Rogers, Nick; Kaestner, Erik; Ganji, Mehran; Cleary, Dan; Snider, Joseph; Barba, David; Dayeh, Shadi; Halgren, Eric; Gilja, Vikash
2016-08-01
Open source electrophysiology (ephys) recording systems have several advantages over commercial systems, such as customization and affordability, enabling more researchers to conduct ephys experiments. Notable open source ephys systems include Open-Ephys, NeuroRighter, and more recently Willow, all of which have high channel counts (64+), scalability, and advanced software to develop on top of. However, little work has been done to build an open source ephys system that is clinic compatible, particularly in the operating room, where acute human electrocorticography (ECoG) research is performed. We developed an affordable (<$10,000) and open system for research purposes that features power isolation for patient safety, compact and water-resistant enclosures, and 256 recording channels sampled at up to 20 ksamples/s with 16-bit resolution. The system was validated by recording ECoG with a high-density, thin-film device in an acute, awake craniotomy study in the Thornton Hospital operating room at UC San Diego.
Freeing Worldview's development process: Open source everything!
NASA Astrophysics Data System (ADS)
Gunnoe, T.
2016-12-01
Freeing your code and your project are important steps toward creating an inviting environment for collaboration, with the added benefit of maintaining a good relationship with your users. NASA Worldview's codebase was released under the open source NASA Open Source Agreement (NOSA) license in 2014, but this is only the first step. We also have to free our ideas, empower our users by involving them in the development process, and open channels that lead to the creation of a community project. There are many highly successful Free and Open Source Software (FOSS) projects from which we can take note: the Linux kernel, Debian, GNOME, etc. These projects owe much of their success to a passionate mix of developers/users with a great community and a common goal in mind. This presentation will describe the scope of this openness and how Worldview plans to move forward with a more community-inclusive approach.
OpenFLUID: an open-source software environment for modelling fluxes in landscapes
NASA Astrophysics Data System (ADS)
Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc
2013-04-01
Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems in which many processes interact in time and space. In agro-ecosystems, these are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models from different disciplines developed by different teams. To support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture that allows users to (i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, (ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneity and the connectivity of landscape objects, and (iii) run and explore simulations in many ways: through the OpenFLUID user interfaces (command line interface, graphical user interface) or through external applications such as GNU R via the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies only on open source libraries (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modellers and developers, OpenFLUID provides a dedicated model development environment based on an open source toolchain, including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modellers.
OpenFLUID has been used in many research applications, such as modelling of transfer through hydrological networks, diagnosis and prediction of water quality taking human activities into account, study of the effect of spatial organization on hydrological fluxes, and modelling of surface-subsurface water exchanges. At the LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID web site: http://www.openfluid-project.org
Interim Open Source Software (OSS) Policy
This interim Policy establishes a framework to implement the requirements of the Office of Management and Budget's (OMB) Federal Source Code Policy to achieve efficiency, transparency and innovation through reusable and open source software.
Radiant Temperature Nulling Radiometer
NASA Technical Reports Server (NTRS)
Ryan, Robert (Inventor)
2003-01-01
A self-calibrating nulling radiometer for non-contact temperature measurement of an object, such as a body of water, employs a black body source as a temperature reference, an optomechanical mechanism (e.g., a chopper) to switch back and forth between measuring the temperature of the black body source and that of a test source, and an infrared detection technique. The radiometer functions by measuring the radiance of both the test and reference black body sources, adjusting the temperature of the reference black body so that its radiance equals that of the test source, and then measuring the temperature of the reference black body with a precision contact-type temperature sensor to determine the radiative temperature of the test source. The radiation from both sources is detected by an infrared detector that converts it to an electrical signal. This signal is fed, together with a chopper reference signal, to an error signal generator, such as a synchronous detector, which creates a precision rectified signal approximately proportional to the difference between the temperature of the reference black body and that of the test infrared source. This error signal is then used in a feedback loop to adjust the reference black body temperature until it equals that of the test source, at which point the error signal is nulled to zero. The chopper mechanism operates at one or more hertz, allowing minimization of 1/f noise. It also provides pure chopping between the black body and the test source and allows continuous measurements.
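The feedback nulling described above can be sketched as a simple proportional control loop driving the reference temperature. This is a toy simulation assuming an ideal total-radiance law (σT⁴) and a hypothetical loop gain; the actual instrument nulls a chopped, synchronously detected optical signal rather than a computed radiance.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiance(T):
    # Simplified total radiance of an ideal black body at temperature T (K).
    return SIGMA * T ** 4

def null_reference(T_test, T_ref=250.0, gain=0.1, tol=1e-6):
    """Drive the reference black body toward the test-source temperature
    by feeding back the radiance difference (the nulled error signal).
    gain is a hypothetical loop gain in K per (W m^-2) of radiance error."""
    for _ in range(10000):
        error = radiance(T_test) - radiance(T_ref)  # synchronous-detector output
        if abs(error) < tol * radiance(T_test):
            break                                   # loop is nulled
        T_ref += gain * error                       # proportional heater adjustment
    return T_ref

# With the loop nulled, a contact sensor on the reference black body reads
# (approximately) the radiative temperature of the test source.
T_estimate = null_reference(T_test=285.0)
```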
Open Source Molecular Modeling
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-01-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extensible. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126
ERIC Educational Resources Information Center
Long, Ju
2009-01-01
Open Source Software (OSS) is a major force in today's Information Technology (IT) landscape. Companies are increasingly using OSS in mission-critical applications. The transparency of the OSS technology itself with openly available source codes makes it ideal for students to participate in the OSS project development. OSS can provide unique…
Open Source Initiative Powers Real-Time Data Streams
NASA Technical Reports Server (NTRS)
2014-01-01
Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.
ERIC Educational Resources Information Center
Dunlap, Joanna C.; Wilson, Brent G.; Young, David L.
This paper describes how Open Source philosophy, a movement that has developed in opposition to the proprietary software industry, has influenced educational practice in the pursuit of scholarly freedom and authentic learning activities for students and educators. This paper provides a brief overview of the Open Source movement, and describes…
ERIC Educational Resources Information Center
van Rooij, Shahron Williams
2009-01-01
Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…
ERIC Educational Resources Information Center
Daniels, Daniel B., III
2014-01-01
There is a lack of literature linking end-user behavior to the availability of open-source intelligence (OSINT). Most OSINT literature has been focused on the use and assessment of open-source intelligence, not the proliferation of personally or organizationally identifiable information (PII/OII). Additionally, information security studies have…
Looking toward the Future: A Case Study of Open Source Software in the Humanities
ERIC Educational Resources Information Center
Quamen, Harvey
2006-01-01
In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…
Preparing a scientific manuscript in Linux: Today's possibilities and limitations.
Tchantchaleishvili, Vakhtang; Schmitto, Jan D
2011-10-22
An increasing number of scientists are enthusiastic about using free, open source software for their research. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow preparation of a submission-ready scientific manuscript without the need for proprietary software. Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes the key steps for preparing a publication-ready scientific manuscript in a Linux-based operating system and discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux.
Open Source Service Agent (OSSA) in the intelligence community's Open Source Architecture
NASA Technical Reports Server (NTRS)
Fiene, Bruce F.
1994-01-01
The Community Open Source Program Office (COSPO) has developed an architecture for the intelligence community's new Open Source Information System (OSIS). The architecture is a multi-phased program featuring connectivity, interoperability, and functionality. OSIS is based on a distributed architecture concept. The system is designed to function as a virtual entity. OSIS will be a restricted (non-public), user configured network employing Internet communications. Privacy and authentication will be provided through firewall protection. Connection to OSIS can be made through any server on the Internet or through dial-up modems provided the appropriate firewall authentication system is installed on the client.
Exploring the Role of Value Networks for Software Innovation
NASA Astrophysics Data System (ADS)
Morgan, Lorraine; Conboy, Kieran
This paper describes research in progress that aims to explore the applicability and implications of open innovation practices in two firms: one that employs agile development methods and another that utilizes open source software. The open innovation paradigm has much in common with open source and agile development methodologies. A particular strength of agile approaches is that they move away from 'introverted' development, involving only the development personnel, and intimately involve the customer in all areas of software creation, supposedly leading to the development of a more innovative, and hence more valuable, information system. Open source software (OSS) development also shares two key elements of the open innovation model, namely the collaborative development of the technology and shared rights to its use. However, one shortfall of agile development in particular is its narrow focus on a single customer representative. In response, we argue that current thinking on innovation needs to be extended to include multiple stakeholders both across and outside the organization. Additionally, for firms utilizing open source, it has been found that their position in a network of potential complementors determines the amount of superior value they create for their customers. This paper therefore aims at a better understanding of the applicability and implications of open innovation practices in firms that employ open source and agile development methodologies. In particular, a conceptual framework is derived for further testing.
Design and Deployment of a General Purpose, Open Source LoRa to Wi-Fi Hub and Data Logger
NASA Astrophysics Data System (ADS)
DeBell, T. C.; Udell, C.; Kwon, M.; Selker, J. S.; Lopez Alcala, J. M.
2017-12-01
Methods and technologies facilitating internet connectivity and near-real-time status updates for in situ environmental sensor data are of increasing interest in Earth science. However, open source, do-it-yourself technologies that enable plug-and-play functionality for web-connected sensors and devices remain largely inaccessible for typical researchers in our community. The Openly Published Environmental Sensing Lab at Oregon State University (OPEnS Lab) constructed an open source 900 MHz Long Range (LoRa) radio receiver hub with an SD card data logger, Ethernet and Wi-Fi shield, and 3D-printed enclosure that dynamically uploads transmissions from multiple wirelessly connected environmental sensing devices. Data transmissions may be received from devices up to 20 km away. The hub time-stamps all transmissions, saves them to the SD card, and uploads them to a Google Drive spreadsheet, where they can be accessed in near-real-time by researchers and geovisualization applications (such as ArcGIS) for inspection, visualization, and analysis. This research expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open source development with cutting-edge hardware. This poster details our methods and evaluates the use of 3D printing, the Arduino Integrated Development Environment (IDE), Adafruit's open-hardware Feather development boards, and the WIZnet W5500 Ethernet shield in designing this open source, general purpose LoRa to Wi-Fi data logger.
Nelson, Chase W; Moncla, Louise H; Hughes, Austin L
2015-11-15
New applications of next-generation sequencing technologies use pools of DNA from multiple individuals to estimate population genetic parameters. However, no publicly available tools exist to analyse single-nucleotide polymorphism (SNP) calling results directly for evolutionary parameters important in detecting natural selection, including nucleotide diversity and gene diversity. We have developed SNPGenie to fill this gap. The user submits one or more FASTA reference sequences, a Gene Transfer Format (.GTF) file with CDS information and one or more SNP reports in a growing selection of formats. The program estimates nucleotide diversity, distance from the reference and gene diversity. Sites are flagged for multiple overlapping reading frames and are categorized by polymorphism type: nonsynonymous, synonymous, or ambiguous. The results allow single-nucleotide, single-codon, sliding-window, whole-gene and whole-genome/population analyses that aid in the detection of positive and purifying natural selection in the source population. SNPGenie version 1.2 is a Perl program with no additional dependencies. It is free, open source, and available for download at https://github.com/hugheslab/snpgenie. Contact: nelsoncw@email.sc.edu or austin@biol.sc.edu. Supplementary data are available at Bioinformatics online.
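For context, the per-site nucleotide diversity such tools report is the expected heterozygosity computed from pooled allele frequencies, averaged over the sequence. The following sketches the standard unbiased estimator; it illustrates the quantity itself, not SNPGenie's Perl implementation, and the counts are invented.

```python
def site_diversity(counts):
    """Unbiased per-site diversity from allele counts at one position:
    pi = (n / (n - 1)) * (1 - sum(p_i^2)), i.e. expected heterozygosity."""
    n = sum(counts)
    freqs = [c / n for c in counts]
    return n / (n - 1) * (1.0 - sum(p * p for p in freqs))

def mean_diversity(sites, seq_length):
    """Average diversity over a whole sequence; `sites` lists allele counts
    for the polymorphic positions (monomorphic sites contribute zero)."""
    return sum(site_diversity(c) for c in sites) / seq_length

# Two SNPs in a 1000-bp region, each covered by 100 pooled reads (hypothetical).
pi = mean_diversity([[60, 40], [90, 10]], seq_length=1000)
```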
NASA Astrophysics Data System (ADS)
Zhou, Ping; Lin, Hui; Zhang, Qi
2018-01-01
The reference source system is a key factor in ensuring successful location of a satellite interference source. The traditional system used a mechanically rotated antenna, which led to slow rotation and a high failure rate; this seriously restricted the system's positioning timeliness and became its obvious weakness. In this paper, a multi-beam antenna scheme based on a horn array is proposed as a reference source for satellite interference location, as an alternative to the traditional reference source antenna. The new scheme designs a small circularly polarized horn antenna as the array element and proposes a multi-beamforming algorithm based on a planar array. Simulation analyses of the horn antenna pattern, the multi-beamforming algorithm and the cross-ambiguity calculation over a simulated satellite link were carried out. Finally, the cross-ambiguity calculation of the traditional reference source system was also tested. The comparison between the computer simulation results and the actual test results shows that the scheme is scientific and feasible, and clearly superior to the traditional reference source system.
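The cross-ambiguity calculation mentioned above searches a delay/Doppler grid for the peak correlation between two signal captures; the peak location gives the differential delay and frequency offset used for geolocation. A generic discrete sketch follows; the grid, sampling rate and synthetic signal are illustrative assumptions, not the paper's system.

```python
import cmath
import random

def caf(s1, s2, delays, dopplers, fs):
    """Magnitude of the discrete cross-ambiguity function over a grid of
    candidate delays (in samples) and Doppler offsets (in Hz)."""
    N = len(s1)
    surface = {}
    for d in delays:
        for f in dopplers:
            acc = 0j
            for n in range(N - d):
                acc += (s1[n + d] * s2[n].conjugate()
                        * cmath.exp(-2j * cmath.pi * f * n / fs))
            surface[(d, f)] = abs(acc)
    return surface

# Synthetic check: s1 is s2 delayed by 3 samples and offset by 50 Hz.
fs = 1000.0
random.seed(0)
s2 = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(256)]
s1 = [0j] * 3 + [s2[n - 3] * cmath.exp(2j * cmath.pi * 50.0 * n / fs)
                 for n in range(3, 256)]
surface = caf(s1, s2, delays=range(6), dopplers=[0.0, 25.0, 50.0, 75.0], fs=fs)
peak = max(surface, key=surface.get)  # (delay, Doppler) at the ambiguity peak
```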
Archaeology: A Student's Guide to Reference Sources.
ERIC Educational Resources Information Center
Desautels, Almuth, Comp.
This bibliography lists reference sources for research in archaeology. It is arranged in sections by type of reference source with subsections for general works and works covering specific areas. Categorized are handbooks; directories, biographies, and museums; encyclopedias; dictionaries; atlases; guides, manuals, and surveys; bibliographies; and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, David, E-mail: rcfilmconsulting@gmail.com; Devic, Slobodan
Purpose: In radiochromic film dosimetry systems, measurements are usually obtained from film images acquired on a CCD-based flatbed scanner. The authors investigated factors affecting scan-to-scan response variability leading to increased dose measurement uncertainty. Methods: The authors used flatbed document scanners to repetitively scan EBT3 radiochromic films exposed to doses of 0–1000 cGy, together with three neutral density filters and three blue optical filters. Scanning was performed under two conditions: scanner lid closed and scanner lid opened/closed between scans. The authors also placed a scanner in a cold room at 9 °C and later in a room at 22 °C and scanned EBT3 films to explore temperature effects. Finally, the authors investigated the effect of altering the distance between the film and the scanner's light source. Results: Using a measurement protocol to isolate the contribution of the CCD and electronic circuitry of the scanners, the authors found that the standard deviation of response measurements for the EBT3 film model was about 0.17% for one scanner and 0.09% for the second. When the lid of the first scanner was opened and closed between scans, the average scan-to-scan difference of responses increased from 0.12% to 0.27%. Increasing the sample temperature during scanning changed the RGB response values by about −0.17, −0.14, and −0.05%/°C, respectively. Reducing the film-to-light source distance increased the RGB response values by about 1.1, 1.3, and 1.4%/mm, respectively. The authors observed that films and film samples were often not flat, with some areas up to 8 mm away from the scanner's glass window. Conclusions: In the absence of measures to deal with these response irregularities, each factor the authors investigated could lead to dose uncertainty >2%.
Those factors related to the film-to-light source distance could be particularly impactful, since the authors observed many instances where the curl of film samples had the potential to cause dose uncertainty in excess of 5%. Two expedients will eliminate these uncertainties: a transparent sheet (preferably glass) placed over the scanned film keeps the film-to-light source distance constant, and an EBT3 reference film included in all scans provides correction factors for measured response values.
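The reference-film expedient amounts to a multiplicative correction: whatever factor restores the reference film to its known response is applied to every film in that scan, cancelling scan-to-scan drift. A minimal sketch, in which the function name and numeric values are hypothetical:

```python
def corrected_responses(measured, ref_measured, ref_expected):
    """Scale each scanned response by the factor that restores the
    reference film to its known value, cancelling common-mode drift."""
    k = ref_expected / ref_measured
    return [k * r for r in measured]

# Hypothetical scan in which every response reads 2% high: the reference
# film's known response is 0.400 but it was read as 0.408.
vals = corrected_responses([0.510, 0.357, 0.204],
                           ref_measured=0.408, ref_expected=0.400)
```

This only corrects drift common to the whole scan (lamp warm-up, temperature); spatially varying effects such as film curl still need mechanical flattening, as the abstract notes.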
Synthesis of phylogeny and taxonomy into a comprehensive tree of life
Hinchliff, Cody E.; Smith, Stephen A.; Allman, James F.; Burleigh, J. Gordon; Chaudhary, Ruchi; Coghill, Lyndon M.; Crandall, Keith A.; Deng, Jiabin; Drew, Bryan T.; Gazis, Romina; Gude, Karl; Hibbett, David S.; Katz, Laura A.; Laughinghouse, H. Dail; McTavish, Emily Jane; Midford, Peter E.; Owen, Christopher L.; Ree, Richard H.; Rees, Jonathan A.; Soltis, Douglas E.; Williams, Tiffani; Cranston, Karen A.
2015-01-01
Reconstructing the phylogenetic relationships that unite all lineages (the tree of life) is a grand challenge. The paucity of homologous character data across disparately related lineages currently renders direct phylogenetic inference untenable. To reconstruct a comprehensive tree of life, we therefore synthesized published phylogenies, together with taxonomic classifications for taxa never incorporated into a phylogeny. We present a draft tree containing 2.3 million tips—the Open Tree of Life. Realization of this tree required the assembly of two additional community resources: (i) a comprehensive global reference taxonomy and (ii) a database of published phylogenetic trees mapped to this taxonomy. Our open source framework facilitates community comment and contribution, enabling the tree to be continuously updated when new phylogenetic and taxonomic data become digitally available. Although data coverage and phylogenetic conflict across the Open Tree of Life illuminate gaps in both the underlying data available for phylogenetic reconstruction and the publication of trees as digital objects, the tree provides a compelling starting point for community contribution. This comprehensive tree will fuel fundamental research on the nature of biological diversity, ultimately providing up-to-date phylogenies for downstream applications in comparative biology, ecology, conservation biology, climate change, agriculture, and genomics. PMID:26385966
The taxonomic name resolution service: an online tool for automated standardization of plant names
2013-01-01
Background The digitization of biodiversity data is leading to the widespread application of taxon names that are superfluous, ambiguous or incorrect, resulting in mismatched records and inflated species numbers. The ultimate consequences of misspelled names and bad taxonomy are erroneous scientific conclusions and faulty policy decisions. The lack of tools for correcting this ‘names problem’ has become a fundamental obstacle to integrating disparate data sources and advancing the progress of biodiversity science. Results The TNRS, or Taxonomic Name Resolution Service, is an online application for automated and user-supervised standardization of plant scientific names. The TNRS builds upon and extends existing open-source applications for name parsing and fuzzy matching. Names are standardized against multiple reference taxonomies, including the Missouri Botanical Garden's Tropicos database. Capable of processing thousands of names in a single operation, the TNRS parses and corrects misspelled names and authorities, standardizes variant spellings, and converts nomenclatural synonyms to accepted names. Family names can be included to increase match accuracy and resolve many types of homonyms. Partial matching of higher taxa combined with extraction of annotations, accession numbers and morphospecies allows the TNRS to standardize taxonomy across a broad range of active and legacy datasets. Conclusions We show how the TNRS can resolve many forms of taxonomic semantic heterogeneity, correct spelling errors and eliminate spurious names. As a result, the TNRS can aid the integration of disparate biological datasets. Although the TNRS was developed to aid in standardizing plant names, its underlying algorithms and design can be extended to all organisms and nomenclatural codes. The TNRS is accessible via a web interface at http://tnrs.iplantcollaborative.org/ and as a RESTful web service and application programming interface. 
Source code is available at https://github.com/iPlantCollaborativeOpenSource/TNRS/. PMID:23324024
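The fuzzy matching at the heart of such name standardization can be illustrated with the Python standard library's similarity matcher. This sketches the idea only; it is not the TNRS algorithm or API, and the reference list and names below are invented.

```python
import difflib

# Tiny stand-in for a reference taxonomy (hypothetical entries).
REFERENCE = ["Quercus alba", "Quercus rubra", "Acer saccharum", "Pinus strobus"]

def standardize(name, cutoff=0.8):
    """Fuzzy-match a possibly misspelled name against a reference list,
    returning the best match above the similarity cutoff, else None."""
    matches = difflib.get_close_matches(name, REFERENCE, n=1, cutoff=cutoff)
    return matches[0] if matches else None

corrected = standardize("Quercus albaa")     # trailing typo
unmatched = standardize("Nonsense plantus")  # no plausible match
```

A production resolver like the TNRS layers much more on top of this (name parsing, authority handling, synonym-to-accepted-name conversion against multiple taxonomies), but the core correction step is this kind of approximate string matching.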
Second-Order Moller-Plesset Perturbation Theory for Molecular Dirac-Hartree-Fock Wave Functions
NASA Technical Reports Server (NTRS)
Dyall, Kenneth G.; Arnold, James O. (Technical Monitor)
1994-01-01
Moller-Plesset perturbation theory is developed to second order for a selection of Kramers restricted Dirac-Hartree-Fock closed and open-shell reference wave functions. The open-shell wave functions considered are limited to those with no more than two electrons in open shells, but include the case of a two-configuration SCF reference. Denominator shifts are included in the style of Davidson's OPT2 method. An implementation which uses unordered integrals with labels is presented, and results are given for a few test cases.
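For reference, the second-order energy correction has the familiar spin-orbital form, shown here schematically with a generic level shift Δ standing in for the OPT2-style denominator shifts mentioned above (in the Kramers-restricted relativistic case, four-component spinors and their orbital energies take the place of the nonrelativistic spin orbitals):

```latex
E^{(2)} = \sum_{i<j}^{\mathrm{occ}} \sum_{a<b}^{\mathrm{virt}}
          \frac{\left|\langle ij \,\|\, ab \rangle\right|^{2}}
               {\varepsilon_i + \varepsilon_j - \varepsilon_a - \varepsilon_b - \Delta}
```

Here ⟨ij‖ab⟩ is an antisymmetrized two-electron integral over occupied (i, j) and virtual (a, b) spinors, and Δ = 0 recovers ordinary MP2.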
Establishing a celestial VLBI reference frame. 1: Searching for VLBI sources
NASA Technical Reports Server (NTRS)
Preston, R. A.; Morabito, D. D.; Williams, J. G.; Slade, M. A.; Harris, A. W.; Finley, S. G.; Skjerve, L. J.; Tanida, L.; Spitzmesser, D. J.; Johnson, B.
1978-01-01
The Deep Space Network is currently engaged in establishing a new high-accuracy VLBI celestial reference frame. The present status of the task of finding suitable celestial radio sources for constructing this reference frame is discussed. To date, 564 VLBI sources have been detected, 166 of them lying within 10 deg of the ecliptic plane. The variation of the sky distribution of these sources with source strength is examined.
The use of open source electronic health records within the federal safety net.
Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri
2014-01-01
To conduct a federally funded study that examines the acquisition, implementation and operation of open source electronic health records (EHR) within safety net medical settings, such as federally qualified health centers (FQHC). The study was conducted by the National Opinion Research Center (NORC) at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to West Virginia, California and Arizona FQHC that were currently using an open source EHR. Five of the six sites that were chosen as part of the study found a number of advantages in the use of their open source EHR system, such as utilizing a large community of users and developers to modify their EHR to fit the needs of their provider and patient communities, and lower acquisition and implementation costs as compared to a commercial system. Despite these advantages, many of the informants and site visit participants felt that widespread dissemination and use of open source was restrained due to a negative connotation regarding this type of software. In addition, a number of participants stated that there is a necessary level of technical acumen needed within the FQHC to make an open source EHR effective. An open source EHR provides advantages for FQHC that have limited resources to acquire and implement an EHR, but additional study is needed to evaluate its overall effectiveness.
Open source electronic health records and chronic disease management
Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri
2014-01-01
Objective To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). Methods and Materials The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Results Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status in homeless patients. Discussion The ability to modify the open source EHR to adapt to the CHC environment and to leverage the ecosystem of providers and users in this process provided significant advantages in chronic care management. Improvements in diabetes management and hypertension control, and increases in tuberculosis vaccinations, were supported through the use of these open source systems. Conclusions The flexibility and adaptability of open source EHR demonstrated their utility and viability in the provision of necessary and needed chronic disease care among populations served by CHC. PMID:23813566
Peterson, Jeffrey R.; Blieden, Lauren S.; Chuang, Alice Z.; Baker, Laura A.; Rigi, Mohammed; Feldman, Robert M.; Bell, Nicholas P.
2016-01-01
Purpose Define criteria for iris-related parameters in an adult open angle population as measured with swept source Fourier domain anterior segment optical coherence tomography (ASOCT). Methods Ninety-eight eyes of 98 participants with open angles were included and stratified into 5 age groups (18–35, 36–45, 46–55, 56–65, and 66–79 years). ASOCT scans with 3D mode angle analysis were taken with the CASIA SS-1000 (Tomey Corporation, Nagoya, Japan) and analyzed using the Anterior Chamber Analysis and Interpretation software. Anterior iris surface length (AISL), length of scleral spur landmark (SSL) to pupillary margin (SSL-to-PM), iris contour ratio (ICR = AISL/SSL-to-PM), pupil radius, radius of iris centroid (RICe), and iris volume were measured. Outcome variables were summarized for all eyes and age groups, and mean values among age groups were compared using one-way analysis of variance. Stepwise regression analysis was used to investigate demographic and ocular characteristic factors that affected each iris-related parameter. Results Mean (±SD) values were 2.24 mm (±0.46), 4.06 mm (±0.27), 3.65 mm (±0.48), 4.16 mm (±0.47), 1.14 (±0.04), 1.51 mm2 (±0.23), and 38.42 μL (±4.91) for pupillary radius, RICe, SSL-to-PM, AISL, ICR, iris cross-sectional area, and iris volume, respectively. Both pupillary radius (P = 0.002) and RICe (P = 0.027) decreased with age, while SSL-to-PM (P = 0.002) and AISL increased with age (P = 0.001). ICR (P = 0.54) and iris volume (P = 0.49) were not affected by age. Conclusion This study establishes reference values for iris-related parameters in an adult open angle population, which will be useful for future studies examining the role of iris changes in pathologic states. PMID:26815917
Peterson, Jeffrey R; Blieden, Lauren S; Chuang, Alice Z; Baker, Laura A; Rigi, Mohammed; Feldman, Robert M; Bell, Nicholas P
2016-01-01
Define criteria for iris-related parameters in an adult open angle population as measured with swept source Fourier domain anterior segment optical coherence tomography (ASOCT). Ninety-eight eyes of 98 participants with open angles were included and stratified into 5 age groups (18-35, 36-45, 46-55, 56-65, and 66-79 years). ASOCT scans with 3D mode angle analysis were taken with the CASIA SS-1000 (Tomey Corporation, Nagoya, Japan) and analyzed using the Anterior Chamber Analysis and Interpretation software. Anterior iris surface length (AISL), length of scleral spur landmark (SSL) to pupillary margin (SSL-to-PM), iris contour ratio (ICR = AISL/SSL-to-PM), pupil radius, radius of iris centroid (RICe), and iris volume were measured. Outcome variables were summarized for all eyes and age groups, and mean values among age groups were compared using one-way analysis of variance. Stepwise regression analysis was used to investigate demographic and ocular characteristic factors that affected each iris-related parameter. Mean (±SD) values were 2.24 mm (±0.46), 4.06 mm (±0.27), 3.65 mm (±0.48), 4.16 mm (±0.47), 1.14 (±0.04), 1.51 mm² (±0.23), and 38.42 μL (±4.91) for pupillary radius, RICe, SSL-to-PM, AISL, ICR, iris cross-sectional area, and iris volume, respectively. Both pupillary radius (P = 0.002) and RICe (P = 0.027) decreased with age, while SSL-to-PM (P = 0.002) and AISL increased with age (P = 0.001). ICR (P = 0.54) and iris volume (P = 0.49) were not affected by age. This study establishes reference values for iris-related parameters in an adult open angle population, which will be useful for future studies examining the role of iris changes in pathologic states.
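The one-way analysis of variance used above to compare mean parameter values across age groups reduces to a short calculation: the ratio of between-group to within-group mean squares. A minimal pure-Python sketch; the group values below are invented illustrative numbers, not data from the study:

```python
# One-way ANOVA F statistic for k groups of observations.
def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a list of sample groups."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = sum(sum(g) for g in groups) / n

    # Between-group sum of squares: group sizes times squared mean offsets
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: spread around each group's own mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)

    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Three hypothetical age groups of a measured parameter (e.g. in mm):
groups = [[2.1, 2.3, 2.2], [2.0, 2.1, 1.9], [1.8, 1.9, 1.7]]
f, dfb, dfw = one_way_anova_f(groups)
print(round(f, 2), dfb, dfw)
```

A large F relative to the F(df_between, df_within) distribution corresponds to the small P values reported for pupillary radius and RICe.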
KID Project: an internet-based digital video atlas of capsule endoscopy for research purposes
Koulaouzidis, Anastasios; Iakovidis, Dimitris K.; Yung, Diana E.; Rondonotti, Emanuele; Kopylov, Uri; Plevris, John N.; Toth, Ervin; Eliakim, Abraham; Wurm Johansson, Gabrielle; Marlicz, Wojciech; Mavrogenis, Georgios; Nemeth, Artur; Thorlacius, Henrik; Tontini, Gian Eugenio
2017-01-01
Background and aims Capsule endoscopy (CE) has revolutionized small-bowel (SB) investigation. Computational methods can enhance diagnostic yield (DY); however, incorporating machine learning algorithms (MLAs) into CE reading is difficult, as large amounts of image annotations are required for training. Current databases lack graphic annotations of pathologies and cannot be used. A novel database, KID, aims to provide a reference for research and development of medical decision support systems (MDSS) for CE. Methods Open-source software was used for the KID database. Clinicians contribute anonymized, annotated CE images and videos. Graphic annotations are supported by an open-access annotation tool (Ratsnake). We detail an experiment based on the KID database, examining differences in SB lesion measurement between human readers and an MLA. The Jaccard Index (JI) was used to evaluate similarity between annotations by the MLA and human readers. Results The MLA performed best in measuring lymphangiectasias with a JI of 81 ± 6 %. The other lesion types were: angioectasias (JI 64 ± 11 %), aphthae (JI 64 ± 8 %), chylous cysts (JI 70 ± 14 %), polypoid lesions (JI 75 ± 21 %), and ulcers (JI 56 ± 9 %). Conclusion An MLA can perform as well as human readers in the measurement of SB angioectasias in white light (WL). Automated lesion measurement is therefore feasible. KID is currently the only open-source CE database developed specifically to aid development of MDSS. Our experiment demonstrates this potential. PMID:28580415
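The Jaccard Index used above to score agreement between MLA and human-reader annotations is simply intersection over union of the annotated regions. A minimal sketch, treating each annotation as a set of (row, col) pixel coordinates; the two example annotations are hypothetical:

```python
# Jaccard Index (intersection over union) between two lesion annotations.
def jaccard_index(a, b):
    """Return |a ∩ b| / |a ∪ b| for two sets of annotated pixels."""
    if not a and not b:
        return 1.0          # two empty annotations agree trivially
    return len(a & b) / len(a | b)

# A reader marks a 10x10 square; the algorithm marks the same square
# shifted down by two rows -- partial overlap.
reader = {(r, c) for r in range(10) for c in range(10)}
mla    = {(r, c) for r in range(2, 12) for c in range(10)}
print(f"JI = {jaccard_index(reader, mla):.0%}")   # 80/120 of the union overlaps
```

The reported percentages (e.g. JI 81 ± 6 % for lymphangiectasias) are means of such per-lesion scores.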
Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics
Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A.; Caron, Christophe
2015-01-01
Summary: The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated whatever the language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. Availability and implementation: http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). Contact: contact@workflow4metabolomics.org PMID:25527831
Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics.
Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A; Caron, Christophe
2015-05-01
The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated whatever the language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). contact@workflow4metabolomics.org. © The Author 2014. Published by Oxford University Press.
Integration of tools for binding archetypes to SNOMED CT.
Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan
2008-10-27
The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.
Integration of tools for binding archetypes to SNOMED CT
Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Åhlfeldt, Hans; Rector, Alan
2008-01-01
Background The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Methods Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. Results An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Conclusion Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail. PMID:19007444
Xu, Hao; Tong, Na; Huang, Shaobin; Zhou, Shaofeng; Li, Shuang; Li, Jianjun; Zhang, Yongqing
2018-05-03
This study aimed to investigate the degradation efficiency of 2,4,6-trichlorophenol through a batch of potentiostatic experiments (0.2 V vs. Ag/AgCl). Efficiencies in the presence and absence of acetate and glucose were compared through open-circuit reference experiments. Significant differences in degradation efficiency were observed in six reactors. The highest and lowest degradation efficiencies were observed in the closed-circuit reactor fed with glucose and in the open-circuit reactor, respectively. This finding was attributed to the enhanced bacterial metabolism caused by the application of a micro-electrical field and by degradable organics as co-substrates. The different treatment efficiencies were also caused by the distinct bacterial communities. The composition of the bacterial community was affected by adding different organics as co-substrates. At the phylum level, the most dominant bacteria in the reactors with added acetate and glucose were Proteobacteria and Firmicutes, respectively. Copyright © 2018 Elsevier Ltd. All rights reserved.
VISIBIOweb: visualization and layout services for BioPAX pathway models
Dilek, Alptug; Belviranli, Mehmet E.; Dogrusoz, Ugur
2010-01-01
With recent advancements in techniques for cellular data acquisition, information on cellular processes has been increasing at a dramatic rate. Visualization is critical to analyzing and interpreting complex information; representing cellular processes or pathways is no exception. VISIBIOweb is a free, open-source, web-based pathway visualization and layout service for pathway models in BioPAX format. With VISIBIOweb, one can obtain well-laid-out views of pathway models using the standard notation of the Systems Biology Graphical Notation (SBGN), and can embed such views within one's web pages as desired. Pathway views may be navigated using zoom and scroll tools; pathway object properties, including any external database references available in the data, may be inspected interactively. The automatic layout component of VISIBIOweb may also be accessed programmatically from other tools using Hypertext Transfer Protocol (HTTP). The web site is free and open to all users and there is no login requirement. It is available at: http://visibioweb.patika.org. PMID:20460470
NASA Technical Reports Server (NTRS)
1981-01-01
The use of the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) Reference Model and its relevance to interconnecting an Applications Data Service (ADS) pilot program for data sharing is discussed. A top-level mapping between the conjectured ADS requirements and identified layers within the OSI Reference Model was performed. It was concluded that the OSI model represents an orderly architecture for ADS networking planning and that the protocols being developed by the National Bureau of Standards offer the best available implementation approach.
Coal fly ash as a source of iron in atmospheric dust.
Chen, Haihan; Laskin, Alexander; Baltrusaitis, Jonas; Gorski, Christopher A; Scherer, Michelle M; Grassian, Vicki H
2012-02-21
Anthropogenic coal fly ash (FA) aerosol may represent a significant source of bioavailable iron in the open ocean. Few measurements have been made that compare the solubility of atmospheric iron from anthropogenic aerosols and other sources. We report here an investigation of iron dissolution for three FA samples in acidic aqueous solutions and compare the solubilities with that of Arizona test dust (AZTD), a reference material for mineral dust. The effects of pH, simulated cloud processing, and solar radiation on iron solubility have been explored. Similar to previously reported results on mineral dust, iron in aluminosilicate phases provides the predominant component of dissolved iron. The iron solubility of FA is substantially higher than that of the crystalline minerals comprising AZTD. Simulated atmospheric processing elevates iron solubility due to significant changes in the morphology of aluminosilicate glass, a dominant material in FA particles. Iron is continuously released into the aqueous solution as FA particles break up into smaller fragments. These results suggest that the assessment of dissolved atmospheric iron deposition fluxes and their effect on the biogeochemistry at the ocean surface should be constrained by the source, environmental pH, iron speciation, and solar radiation.
OpenCFU, a new free and open-source software to count cell colonies and other circular objects.
Geissmann, Quentin
2013-01-01
Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net.
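The core of colony enumeration is deciding which foreground pixels belong to the same object. A minimal connected-component counting sketch over a thresholded binary image (4-connectivity, pure Python); OpenCFU's actual pipeline is considerably more sophisticated (filtering, shape tests, a GUI), so this only illustrates the counting idea:

```python
from collections import deque

def count_colonies(grid):
    """Count 4-connected components of 1-pixels in a binary grid."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                      # new object found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill its pixels
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and grid[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# A toy thresholded plate image with three separate blobs:
plate = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
print(count_colonies(plate))
```

Real colony counters additionally filter components by size and circularity before counting, which is where the robustness claimed above comes from.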
Using Open Source Software in Visual Simulation Development
2005-09-01
increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a
Open-Source Intelligence in the Czech Military: Knowledge System and Process Design
2002-06-01
Open-Source Intelligence (OSINT), as one of the intelligence disciplines, bears some of the general problems of the intelligence "business". OSINT... ADAPTING KNOWLEDGE MANAGEMENT THEORY TO THE CZECH MILITARY INTELLIGENCE: Knowledge work is the core business of military intelligence. As... NAVAL POSTGRADUATE SCHOOL, Monterey, California. THESIS. Approved for public release; distribution is unlimited. OPEN-SOURCE INTELLIGENCE IN THE
ERIC Educational Resources Information Center
Ballentine, Brian D.
2009-01-01
Writing programs and, more specifically, Writing in the Disciplines (WID) initiatives have begun to embrace the use of, and the ideology inherent to, open source software. The Conference on College Composition and Communication has passed a resolution stating that, whenever feasible, educators and their institutions should consider open source applications.…
Anatomy of BioJS, an open source community for the life sciences.
Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel
2015-07-08
BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.
Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library
ERIC Educational Resources Information Center
Fagan, Jody Condit; Keach, Jennifer A.
2010-01-01
When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…
ERIC Educational Resources Information Center
Simpson, James Daniel
2014-01-01
Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…
State-of-the-Art in Open Courseware Initiatives Worldwide
ERIC Educational Resources Information Center
Vladoiu, Monica
2011-01-01
We survey here the state-of-the-art in open courseware initiatives worldwide. First, the MIT OpenCourseWare project is overviewed, as it has been the real starting point of the OCW movement. Usually, open courseware refers to a free and open digital publication of high quality university level educational materials that are organized as courses,…
A Framework for the Systematic Collection of Open Source Intelligence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pouchard, Line Catherine; Trien, Joseph P; Dobson, Jonathan D
2009-01-01
Following legislative directions, the Intelligence Community has been mandated to make greater use of Open Source Intelligence (OSINT). Efforts are underway to increase the use of OSINT but there are many obstacles. One of these obstacles is the lack of tools helping to manage the volume of available data and ascertain its credibility. We propose a unique system for selecting, collecting and storing Open Source data from the Web and the Open Source Center. Some data management tasks are automated, document source is retained, and metadata containing geographical coordinates are added to the documents. Analysts are thus empowered to search, view, store, and analyze Web data within a single tool. We present ORCAT I and ORCAT II, two implementations of the system.
The open-source neutral-mass spectrometer on Atmosphere Explorer-C, -D, and -E.
NASA Technical Reports Server (NTRS)
Nier, A. O.; Potter, W. E.; Hickman, D. R.; Mauersberger, K.
1973-01-01
The open-source mass spectrometer will be used to obtain the number densities of the neutral atmospheric gases in the mass range 1 to 48 amu at the satellite location. The ion source has been designed to allow gas particles to enter the ionizing region with the minimum practicable number of prior collisions with surfaces. This design minimizes the loss of atomic oxygen and other reactive species due to reactions with the walls of the ion source. The principal features of the open-source spectrometer and the laboratory calibration system are discussed.
Jewish Studies: A Guide to Reference Sources.
ERIC Educational Resources Information Center
McGill Univ., Montreal (Quebec). McLennan Library.
An annotated bibliography to the reference sources for Jewish Studies in the McLennan Library of McGill University (Canada) is presented. Any titles in Hebrew characters are listed by their transliterated equivalents. There is also a list of relevant Library of Congress Subject Headings. General reference sources listed are: encyclopedias,…
Economics: A Guide to Reference Sources.
ERIC Educational Resources Information Center
Mason, Mary, Comp.
Approximately 84 reference materials on economics located in the McLennan Library, McGill University (Montreal), are cited in this annotated bibliography. The bibliography serves to provide an overview of the printed bibliographic and reference sources useful for the study of economics. Financial and business sources and statistical compendia and…
2011-03-31
evidence based medicine into clinical practice. It will decrease costs and enable multiple stakeholders to work in an open content/source environment to exchange clinical content, develop and test technology and explore processes in applied CDS. Design: Comparative study between the KMR infrastructure and capabilities developed as an open source, vendor agnostic solution for aCPG execution within AHLTA and the current DoD/MHS standard evaluating: H1: An open source, open standard KMR and Clinical Decision Support Engine can enable organizations to share domain
U.S. Department of Energy Reference Model Program RM2: Experimental Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, Craig; Neary, Vincent Sinclair; Gunawan, Budi
2014-08-01
The Reference Model Project (RMP), sponsored by the U.S. Department of Energy's (DOE) Wind and Water Power Technologies Program within the Office of Energy Efficiency & Renewable Energy (EERE), aims at expediting industry growth and efficiency by providing non-proprietary Reference Models (RM) of MHK technology designs as study objects for open-source research and development (Neary et al. 2014a,b). As part of this program, MHK turbine models were tested in a large open channel facility at the University of Minnesota's St. Anthony Falls Laboratory (UMN-SAFL). Reference Model 2 (RM2) is a 1:15 geometric scale dual-rotor cross-flow vertical axis device with counter-rotating rotors, each with a rotor diameter d_T = 0.43 m and rotor height h_T = 0.323 m. RM2 is a river turbine designed for a site modeled after a reach in the lower Mississippi River near Baton Rouge, Louisiana (Barone et al. 2014). Precise blade angular position and torque measurements were synchronized with three acoustic Doppler velocimeters (ADV) aligned with each rotor and the midpoint for RM2. Flow conditions for each case were controlled such that depth h = 1 m and volumetric flow rate Q_w = 2.35 m³s⁻¹, resulting in a hub height velocity of approximately U_hub = 1.2 m s⁻¹ and blade chord length Reynolds numbers of Re_c = 6.1×10⁴. Vertical velocity profiles collected in the wake of each device from 1 to 10 rotor diameters are used to estimate the velocity recovery and turbulent characteristics in the wake, as well as the interaction of the counter-rotating rotor wakes. This high resolution laboratory investigation provides a robust dataset that enables assessing computational fluid dynamics (CFD) models and their ability to accurately simulate turbulent inflow environments and device performance metrics, and to reproduce wake velocity deficit, recovery and higher order turbulent statistics.
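The chord-based Reynolds number quoted above follows from Re_c = U·c/ν. A short sanity-check sketch: the hub-height velocity is taken from the abstract, but the blade chord length and the kinematic viscosity of water are assumed values, chosen here only to be consistent with the reported Re_c ≈ 6.1×10⁴:

```python
# Chord-based Reynolds number Re_c = U * c / nu for the RM2 rotor blades.
U_hub = 1.2      # m/s, hub-height velocity (from the abstract)
chord = 0.051    # m, ASSUMED blade chord length (not given in the abstract)
nu    = 1.0e-6   # m^2/s, kinematic viscosity of water at ~20 °C (assumed)

Re_c = U_hub * chord / nu
print(f"Re_c = {Re_c:.2e}")
```

With these assumptions the value lands near the reported 6.1×10⁴; at this Reynolds number the blade boundary layers are transitional, which matters when comparing scale-model performance to full-scale devices.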
Semiotic foundation for multisensor-multilook fusion
NASA Astrophysics Data System (ADS)
Myler, Harley R.
1998-07-01
This paper explores the application of semiotic principles to the design of a multisensor-multilook fusion system. Semiotics is an approach to analysis that attempts to process media in a unified way using qualitative rather than quantitative methods. The term semiotic refers to signs, or signatory data, that encapsulate information. Semiotic analysis involves the extraction of signs from information sources and the subsequent processing of the signs into meaningful interpretations of the information content of the source. The multisensor fusion problem, predicated on a semiotic system structure and incorporating semiotic analysis techniques, is explored, along with the design of a multisensor system as an information fusion system. Semiotic analysis opens the possibility of using non-traditional sensor sources and modalities in the fusion process, such as verbal and textual intelligence derived from human observers. Examples of how multisensor/multimodality data might be analyzed semiotically are shown, and how a semiotic system for multisensor fusion could be realized is outlined. The architecture of a semiotic multisensor fusion processor that can accept situational awareness data is described, although an implementation has not yet been constructed.
OpenFIRE - A Web GIS Service for Distributing the Finnish Reflection Experiment Datasets
NASA Astrophysics Data System (ADS)
Väkevä, Sakari; Aalto, Aleksi; Heinonen, Aku; Heikkinen, Pekka; Korja, Annakaisa
2017-04-01
The Finnish Reflection Experiment (FIRE) is a land-based deep seismic reflection survey conducted between 2001 and 2003 by a research consortium of the Universities of Helsinki and Oulu, the Geological Survey of Finland, and the Russian state-owned enterprise SpetsGeofysika. The dataset consists of 2100 kilometers of high-resolution profiles across the Archaean and Proterozoic nuclei of the Fennoscandian Shield. Although FIRE data have been available on request since 2009, the data have remained underused outside the original research consortium. The original FIRE data have now been quality-controlled: the shot gathers have been cross-checked and comprehensive errata have been created. The brute stacks provided by the Russian seismic contractor have been reprocessed into seismic sections and replotted. Complete documentation of the intermediate processing steps is provided, together with guidelines for setting up a computing environment and plotting the data. An open access web service, "OpenFIRE", for the visualization and downloading of FIRE data has been created. The service includes a mobile-responsive map application capable of enriching seismic sections with data from other sources, such as open data from the National Land Survey and the Geological Survey of Finland. The AVAA team of the Finnish Open Science and Research Initiative has provided a tailored Liferay portal with the necessary web components, such as an API (Application Programming Interface) for download requests. INSPIRE (Infrastructure for Spatial Information in Europe) -compliant discovery metadata have been produced, and geospatial data will be exposed as Open Geospatial Consortium standard services. The technical guidelines of the European Plate Observing System have been followed, and the service could be considered a reference application for sharing reflection seismic data. The OpenFIRE web service is available at www.seismo.helsinki.fi/openfire
Preparing a scientific manuscript in Linux: Today's possibilities and limitations
2011-01-01
Background Increasing number of scientists are enthusiastic about using free, open source software for their research purposes. Authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow to prepare a submission-ready scientific manuscript without the need to use the proprietary software. Findings Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, as well as discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux. PMID:22018246
Open source bioimage informatics for cell biology.
Swedlow, Jason R; Eliceiri, Kevin W
2009-11-01
Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what make an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery.
Implementation, reliability, and feasibility test of an Open-Source PACS.
Valeri, Gianluca; Zuccaccia, Matteo; Badaloni, Andrea; Ciriaci, Damiano; La Riccia, Luigi; Mazzoni, Giovanni; Maggi, Stefania; Giovagnoni, Andrea
2015-12-01
To implement a hardware and software system able to perform the major functions of an Open-Source PACS, and to analyze it in a simulated real-world environment. A small home network was implemented, and the open-source operating system Ubuntu 11.10 was installed on a laptop together with the Dcm4chee suite and the software components needed. The Open-Source PACS implemented is compatible with Linux, Microsoft Windows, and Mac OS X; furthermore, it was used with the mobile operating systems Android and iOS, allowing operation on portable devices (smartphones, tablets). An OSS PACS is useful for tutorials and workshops on post-processing techniques for educational and training purposes.
Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle
NASA Astrophysics Data System (ADS)
Vinay, S.; Downs, R. R.
2012-12-01
Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data depends on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions on using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that maintains open source software can be a valuable source of help, providing an opportunity to collaborate on common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence that arise at various stages of the data lifecycle have been identified. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.
Open source data assimilation framework for hydrological modeling
NASA Astrophysics Data System (ADS)
Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik
2013-04-01
An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models exists that is capable of all these tasks: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach to interact with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources.
The interface is flexible enough that models can interact even if they are coded in different languages, represent processes from different domains or have different spatial and temporal resolutions. An open source framework that bridges OpenMI and OpenDA is presented. The framework provides a generic and easy means for any OpenMI-compliant model to assimilate observation measurements. An example test case is presented using MikeSHE, an OpenMI-compliant, fully coupled, integrated hydrological model that can accurately simulate the feedback dynamics of overland flow, the unsaturated zone and the saturated zone.
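The model-interaction lifecycle described above (create a model instance, propagate it, get/set its variables, free it when DA is done) can be sketched with a toy bucket model and the simplest possible update step. All class and method names below are illustrative stand-ins, not the actual OpenDA or OpenMI APIs, and the nudging update is a placeholder for the ensemble/variational methods a real framework would apply.

```python
# Sketch of the create/propagate/get-set/finalize lifecycle a DA driver
# needs from a model. Names are illustrative, not OpenDA/OpenMI APIs.

class ToyHydroModel:
    """Stand-in for an OpenMI-style model exposing state to a DA driver."""
    def __init__(self, initial_storage):
        self.storage = initial_storage   # e.g. catchment water storage (mm)
        self.finalized = False

    def propagate(self, rainfall, runoff_coeff=0.1):
        # One forecast step: add rainfall, release a fraction as runoff.
        self.storage += rainfall - runoff_coeff * self.storage

    def get_values(self):
        return self.storage

    def set_values(self, value):
        self.storage = value

    def finalize(self):
        self.finalized = True


def assimilate(model, observation, gain=0.5):
    """Simplest possible update: nudge the state toward the observation.
    Real frameworks use ensemble Kalman filters or variational methods."""
    forecast = model.get_values()
    model.set_values(forecast + gain * (observation - forecast))


model = ToyHydroModel(initial_storage=100.0)
model.propagate(rainfall=20.0)          # forecast step
assimilate(model, observation=105.0)    # measurement update
analysis = model.get_values()
model.finalize()                        # free the model once DA is completed
```

The point of the sketch is the interface contract, not the physics: any model exposing these four operations can be driven by a generic assimilation loop.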
Building high-quality assay libraries for targeted analysis of SWATH MS data.
Schubert, Olga T; Gillet, Ludovic C; Collins, Ben C; Navarro, Pedro; Rosenberger, George; Wolski, Witold E; Lam, Henry; Amodei, Dario; Mallick, Parag; MacLean, Brendan; Aebersold, Ruedi
2015-03-01
Targeted proteomics by selected/multiple reaction monitoring (S/MRM) or, on a larger scale, by SWATH (sequential window acquisition of all theoretical spectra) MS (mass spectrometry) typically relies on spectral reference libraries for peptide identification. Quality and coverage of these libraries are therefore of crucial importance for the performance of the methods. Here we present a detailed protocol that has been successfully used to build high-quality, extensive reference libraries supporting targeted proteomics by SWATH MS. We describe each step of the process, including data acquisition by discovery proteomics, assertion of peptide-spectrum matches (PSMs), generation of consensus spectra and compilation of MS coordinates that uniquely define each targeted peptide. Crucial steps such as false discovery rate (FDR) control, retention time normalization and handling of post-translationally modified peptides are detailed. Finally, we show how to use the library to extract SWATH data with the open-source software Skyline. The protocol takes 2-3 d to complete, depending on the extent of the library and the computational resources available.
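The false discovery rate (FDR) control step named in the protocol is commonly implemented by target-decoy competition: rank peptide-spectrum matches by score and cut the accepted list where the decoy-based FDR estimate (#decoys / #targets above the cut) exceeds the limit. A minimal sketch under that assumption, not the exact procedure of any particular search engine:

```python
# Target-decoy FDR filtering sketch. psms: list of (score, is_decoy).

def filter_psms(psms, fdr_limit=0.01):
    """Return scores of accepted target PSMs at the given FDR limit."""
    ranked = sorted(psms, key=lambda p: p[0], reverse=True)
    accepted, targets, decoys = [], 0, 0
    for score, is_decoy in ranked:
        decoys += is_decoy
        targets += not is_decoy
        # Estimated FDR above this cut: decoy hits per target hit.
        if targets and decoys / targets > fdr_limit:
            break
        if not is_decoy:
            accepted.append(score)
    return accepted

# Three confident targets, then a decoy that trips the 20% limit:
psms = [(9.1, False), (8.7, False), (8.2, False), (5.0, True), (4.9, False)]
kept = filter_psms(psms, fdr_limit=0.2)
```

In practice the cut is made on q-values computed over the full score distribution; the loop above shows only the core competition idea.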
A systematic review of current and emergent manipulator control approaches
NASA Astrophysics Data System (ADS)
Ajwad, Syed Ali; Iqbal, Jamshed; Ullah, Muhammad Imran; Mehmood, Adeel
2015-06-01
Pressing demands of productivity and accuracy in today's robotic applications have highlighted an urge to replace classical control strategies with their modern control counterparts. This recent trend is further justified by the fact that robotic manipulators have a complex nonlinear dynamic structure with uncertain parameters. Highlighting the authors' research achievements in the domain of manipulator design and control, this paper presents a systematic and comprehensive review of the state-of-the-art control techniques that find enormous potential in controlling manipulators to execute cutting-edge applications. In particular, three kinds of strategies, i.e., intelligent proportional-integral-derivative (PID) schemes, robust control and adaptation-based approaches, are reviewed. Future trends in the subject area are also discussed. Open-source simulators that facilitate controller design are tabulated as well. With a comprehensive list of references, it is anticipated that the review will act as a firsthand reference for researchers, engineers and industrial interns seeking to realize control laws for multi-degree-of-freedom (DOF) manipulators.
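For reference, the discrete PID law that the reviewed intelligent-PID schemes build upon fits in a few lines. The single-joint "plant" below is a hypothetical pure integrator (velocity command in, angle out), used only to show the loop converging; a real manipulator joint has the nonlinear dynamics the abstract mentions.

```python
# Textbook discrete PID controller driving a toy single-joint plant.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                    # I term state
        derivative = (error - self.prev_error) / self.dt    # D term estimate
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a single joint (pure integrator of velocity commands) to 1.0 rad.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
angle = 0.0
for _ in range(2000):                  # 20 s of simulated time
    angle += pid.step(1.0, angle) * pid.dt
```

The gains here are arbitrary but stable for this plant; tuning them against uncertain nonlinear dynamics is exactly where the reviewed robust and adaptive schemes come in.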
Kim, Oh Seok; Newell, Joshua P
2015-10-01
This paper proposes a new land-change model, the Geographic Emission Benchmark (GEB), as an approach to quantify land-cover changes associated with deforestation and forest degradation. The GEB is designed to determine 'baseline' activity data for reference levels. Unlike other models that forecast business-as-usual future deforestation, the GEB internally (1) characterizes 'forest' and 'deforestation' with minimal processing and ground-truthing and (2) identifies 'deforestation hotspots' using open-source spatial methods to estimate regional rates of deforestation. The GEB also characterizes forest degradation and identifies leakage belts. This paper compares the accuracy of GEB with GEOMOD, a popular land-change model used in the UN-REDD (Reducing Emissions from Deforestation and Forest Degradation) Program. Using a case study of the Chinese tropics for comparison, GEB's projection is more accurate than GEOMOD's, as measured by Figure of Merit. Thus, the GEB produces baseline activity data that are moderately accurate for the setting of reference levels.
Plant Reactome: a resource for plant pathways and comparative analysis
Naithani, Sushma; Preece, Justin; D'Eustachio, Peter; Gupta, Parul; Amarasinghe, Vindhya; Dharmawardhana, Palitha D.; Wu, Guanming; Fabregat, Antonio; Elser, Justin L.; Weiser, Joel; Keays, Maria; Fuentes, Alfonso Munoz-Pomer; Petryszak, Robert; Stein, Lincoln D.; Ware, Doreen; Jaiswal, Pankaj
2017-01-01
Plant Reactome (http://plantreactome.gramene.org/) is a free, open-source, curated plant pathway database portal, provided as part of the Gramene project. The database provides intuitive bioinformatics tools for the visualization, analysis and interpretation of pathway knowledge to support genome annotation, genome analysis, modeling, systems biology, basic research and education. Plant Reactome employs the structural framework of a plant cell to show metabolic, transport, genetic, developmental and signaling pathways. We manually curate molecular details of pathways in these domains for reference species Oryza sativa (rice) supported by published literature and annotation of well-characterized genes. Two hundred twenty-two rice pathways, 1025 reactions associated with 1173 proteins, 907 small molecules and 256 literature references have been curated to date. These reference annotations were used to project pathways for 62 model, crop and evolutionarily significant plant species based on gene homology. Database users can search and browse various components of the database, visualize curated baseline expression of pathway-associated genes provided by the Expression Atlas and upload and analyze their Omics datasets. The database also offers data access via Application Programming Interfaces (APIs) and in various standardized pathway formats, such as SBML and BioPAX. PMID:27799469
An Integrated SNP Mining and Utilization (ISMU) Pipeline for Next Generation Sequencing Data
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M.; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A. V. S. K.; Varshney, Rajeev K.
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is a daunting prospect for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. The pipeline is very useful for the plant genetics and breeding community with no computational expertise, enabling SNP discovery and utilization in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets.
It has been developed in the Java language and is available at http://hpc.icrisat.cgiar.org/ISMU as free standalone software. PMID:25003610
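The polymorphism information content (PIC) value the pipeline reports per SNP is conventionally computed from allele frequencies with the Botstein et al. formula; the abstract does not say which variant ISMU uses, so the sketch below simply assumes that standard definition.

```python
# PIC (Botstein et al. formula): 1 - sum(p_i^2) - sum_{i<j} 2 p_i^2 p_j^2.
from itertools import combinations

def pic(freqs):
    """Polymorphism information content for a marker with the given
    allele frequencies; higher means a more informative marker."""
    assert abs(sum(freqs) - 1.0) < 1e-9, "frequencies must sum to 1"
    hom = sum(p * p for p in freqs)
    correction = sum(2 * p * p * q * q for p, q in combinations(freqs, 2))
    return 1.0 - hom - correction

# A balanced biallelic SNP reaches the biallelic maximum of 0.375:
balanced = pic([0.5, 0.5])   # 1 - 0.5 - 2*(0.25*0.25) = 0.375
skewed = pic([0.9, 0.1])     # rarer minor allele -> less informative
```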
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.
When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
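All of the surveyed solvers consume problems in the standard form minimize c·x subject to Ax ≤ b, x ≥ 0. As a toy illustration of that form (deliberately using none of the surveyed solvers), a two-variable LP can be solved by enumerating constraint-boundary intersections, which are the vertices a simplex-based solver such as CLP pivots between:

```python
# Brute-force a 2-variable LP by intersecting constraint boundaries.
# For illustration only; real solvers scale to millions of variables.
from itertools import combinations

def solve_2d_lp(c, A, b):
    """Minimize c.x subject to A x <= b, x >= 0 (2 variables)."""
    lines = A + [[1.0, 0.0], [0.0, 1.0]]   # include axes x>=0, y>=0
    rhs = b + [0.0, 0.0]
    best = None
    for i, j in combinations(range(len(lines)), 2):
        (a1, b1), (a2, b2) = lines[i], lines[j]
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                        # parallel boundaries
        x = (rhs[i] * b2 - rhs[j] * b1) / det
        y = (a1 * rhs[j] - a2 * rhs[i]) / det
        if x < -1e-9 or y < -1e-9:
            continue                        # violates non-negativity
        if any(ai * x + bi * y > r + 1e-9 for (ai, bi), r in zip(A, b)):
            continue                        # infeasible vertex
        value = c[0] * x + c[1] * y
        if best is None or value < best[0]:
            best = (value, x, y)
    return best

# minimize -x - 2y (i.e. maximize x + 2y)  s.t.  x + y <= 4,  y <= 3
value, x, y = solve_2d_lp([-1.0, -2.0], [[1.0, 1.0], [0.0, 1.0]], [4.0, 3.0])
```

The optimum lands on the vertex (1, 3), which is the kind of basic feasible solution every surveyed solver would return for this problem.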
Open source OCR framework using mobile devices
NASA Astrophysics Data System (ADS)
Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan
2008-02-01
Mobile phones have evolved from passive one-to-one communication devices into powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet and much more. Exciting new social applications are emerging on the mobile landscape, like business card readers, sign detectors and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. However, with all these advancements we find very little open source software available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none are available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community. This includes a popular open source OCR engine named Tesseract for text detection and recognition, and the Flite speech synthesis module for adding text-to-speech ability.
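The framework's flow is capture → Tesseract OCR → Flite text-to-speech. A skeletal sketch of that chain with both engines stubbed out (the real system calls their native C APIs, which are not reproduced here; every function below is a hypothetical placeholder):

```python
# Capture -> OCR -> TTS pipeline skeleton. Tesseract and Flite are
# stubbed so the data flow is runnable; names are illustrative only.

def capture_image(camera):
    return camera()                          # stand-in for the camera API

def recognize_text(image):
    # Real implementation: hand the image buffer to the Tesseract engine.
    return image.get("embedded_text", "")

def synthesize_speech(text):
    # Real implementation: pass the recognized string to the Flite module.
    return f"<audio:{text}>"

def ocr_pipeline(camera):
    image = capture_image(camera)
    text = recognize_text(image)
    return text, synthesize_speech(text)

# Simulated business-card capture:
text, audio = ocr_pipeline(lambda: {"embedded_text": "Jane Doe, ACME Corp"})
```

The value of structuring it this way is that each stage can be swapped (a different OCR engine, a different synthesizer) without touching the rest of the chain.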
Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M
2013-04-19
The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.
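A colorimeter's measurement reduces to the Beer-Lambert relation A = -log10(I/I0) plus a linear calibration against standards of known concentration. A sketch with hypothetical detector readings and COD standard values (the paper's actual calibration data are not reproduced here):

```python
# Beer-Lambert absorbance plus a least-squares calibration line mapping
# absorbance to COD concentration. All numbers are hypothetical.
from math import log10

def absorbance(sample_intensity, blank_intensity):
    """Absorbance from transmitted-light readings against a blank."""
    return -log10(sample_intensity / blank_intensity)

def fit_calibration(standards):
    """Least-squares slope/intercept from (absorbance, COD mg/L) pairs."""
    n = len(standards)
    sx = sum(a for a, _ in standards)
    sy = sum(c for _, c in standards)
    sxx = sum(a * a for a, _ in standards)
    sxy = sum(a * c for a, c in standards)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

# Hypothetical three-point calibration with commercial COD standards:
slope, intercept = fit_calibration([(0.0, 0.0), (0.2, 250.0), (0.4, 500.0)])
# Unknown sample transmitting half the blank intensity:
cod = slope * absorbance(50.0, 100.0) + intercept
```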
Absolute calibration of Doppler coherence imaging velocity images
NASA Astrophysics Data System (ADS)
Samuell, C. M.; Allen, S. L.; Meyer, W. H.; Howard, J.
2017-08-01
A new technique has been developed for absolutely calibrating a Doppler Coherence Imaging Spectroscopy interferometer for measuring plasma ion and neutral velocities. An optical model of the interferometer is used to generate zero-velocity reference images for the plasma spectral line of interest from a calibration source some spectral distance away. Validation of this technique using a tunable diode laser demonstrated an accuracy better than 0.2 km/s over an extrapolation range of 3.5 nm, a two-order-of-magnitude improvement over linear approaches. While a well-characterized and very stable interferometer is required, this technique opens up the possibility of calibrated velocity measurements in difficult viewing geometries and for complex spectral line-shapes.
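The quoted 0.2 km/s accuracy can be put in context with the non-relativistic Doppler relation v = c·Δλ/λ0 that such a calibration ultimately applies. The line and shift below are illustrative numbers only, not the instrument's actual parameters:

```python
# Line-of-sight velocity implied by a small wavelength shift.
C = 299_792.458  # speed of light, km/s

def doppler_velocity(measured_wavelength_nm, rest_wavelength_nm):
    """Non-relativistic Doppler velocity from a wavelength shift."""
    return C * (measured_wavelength_nm - rest_wavelength_nm) / rest_wavelength_nm

# An illustrative 0.001 nm shift on a 465.0 nm visible line:
v = doppler_velocity(465.001, 465.0)   # roughly 0.64 km/s
```

Inverting the relation, a 0.2 km/s velocity accuracy at 465 nm corresponds to resolving wavelength shifts of order 3 × 10⁻⁴ nm, which is why interferometer stability matters so much here.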
Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi
2014-01-01
The in vivo measurement of metabolic flux by (13)C-based metabolic flux analysis ((13)C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a (13)C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas (13)C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary (13)C metabolic flux analysis (INST-(13)C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms under stationary phase). Here, the development of novel open source software for INST-(13)C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) provides the function of autogenerating metabolic models for simulating isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-(13)C-MFA. Confidence intervals determined by INST-(13)C-MFA were less than those determined by conventional methods, indicating the potential of INST-(13)C-MFA for precise metabolic flux analysis. OpenMebius is open source software for the general application of INST-(13)C-MFA.
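The nonlinear-fitting idea, minimizing the mismatch between simulated and measured labeling enrichment over candidate fluxes, can be shown on a drastically reduced toy model: a single pool fed by one (13)C-labelled and one unlabelled influx, whose steady-state enrichment is v_lab / (v_lab + v_unlab). Real (13)C-MFA software fits full isotopomer balance models; this is only the shape of the estimation problem.

```python
# Toy flux fitting: recover the labelled influx from enrichment data
# by a 1-D least-squares scan. Purely illustrative.

def simulate_enrichment(v_lab, v_unlab):
    """Steady-state fractional 13C enrichment of the mixed pool."""
    return v_lab / (v_lab + v_unlab)

def fit_flux(measurements, v_unlab, grid=None):
    """Return the candidate labelled flux minimizing the sum of squared
    residuals against the measured enrichments."""
    grid = grid or [i / 100 for i in range(1, 1000)]   # 0.01 .. 9.99
    def ssr(v):
        return sum((m - simulate_enrichment(v, v_unlab)) ** 2
                   for m in measurements)
    return min(grid, key=ssr)

# "Measurements" generated at a true labelled flux of 3.0 (v_unlab = 7.0):
true_enrichment = simulate_enrichment(3.0, 7.0)
best = fit_flux([true_enrichment] * 3, v_unlab=7.0)
```

With noisy data the same objective is minimized by proper nonlinear solvers, and confidence intervals come from how sharply the residual surface curves around the optimum.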
A structured approach to recording AIDS-defining illnesses in Kenya: A SNOMED CT based solution
Oluoch, Tom; de Keizer, Nicolette; Langat, Patrick; Alaska, Irene; Ochieng, Kenneth; Okeyo, Nicky; Kwaro, Daniel; Cornet, Ronald
2016-01-01
Introduction: Several studies conducted in sub-Saharan Africa (SSA) have shown that routine clinical data in HIV clinics often have errors. Lack of structured and coded documentation of diagnoses of AIDS-defining illnesses (ADIs) can compromise data quality and decisions made on clinical care. Methods: We used a structured framework to derive a reference set of concepts and terms used to describe ADIs. The four sources used were: (i) the CDC/Accenture list of opportunistic infections, (ii) SNOMED Clinical Terms (SNOMED CT), (iii) a Focus Group Discussion (FGD) among clinicians and nurses attending to patients at a referral provincial hospital in western Kenya, and (iv) chart abstraction from the Maternal Child Health (MCH) and HIV clinics at the same hospital. Using the January 2014 release of SNOMED CT, concepts were retrieved that matched terms abstracted from sources (iii) and (iv), and the content coverage was assessed. Post-coordination matching was applied when needed. Results: The final reference set had 1054 unique ADI concepts, described by 1860 unique terms. Content coverage of SNOMED CT was high (99.9% with pre-coordinated concepts; 100% with post-coordination). The resulting reference set for ADIs was implemented as the interface terminology on OpenMRS data entry forms. Conclusion: Different sources demonstrate complementarity in the collection of concepts and terms for an interface terminology. SNOMED CT provides high coverage in the domain of ADIs. Further work is needed to evaluate the effect of the interface terminology on data quality and quality of care. PMID:26184057
Simulation for Dynamic Situation Awareness and Prediction III
2010-03-01
source Java™ library for capturing and sending network packets; 4) Groovy, an open source, Java-based scripting language (version 1.6 or newer). Open...DMOTH Analyzer application. Groovy is an open source dynamic scripting language for the Java Virtual Machine. It is consistent with Java syntax...between temperature, pressure, wind and relative humidity, and 3) a precipitation editing algorithm. The Editor can be used to prepare scripted changes
ERIC Educational Resources Information Center
Pfaffman, Jay
2008-01-01
Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…
Managing Digital Archives Using Open Source Software Tools
NASA Astrophysics Data System (ADS)
Barve, S.; Dongare, S.
2007-10-01
This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative to implement an institutional repository using the DSpace open source software.
Open source tools for fluorescent imaging.
Hamilton, Nicholas A
2012-01-01
As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical both to remove bottlenecks in throughput and to fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods, with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.
Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel
2015-01-01
There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore), and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. The Open Drug Discovery Toolkit is released under a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).
The use of open source electronic health records within the federal safety net
Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri
2014-01-01
Objective: To conduct a federally funded study that examines the acquisition, implementation and operation of open source electronic health records (EHR) within safety net medical settings, such as federally qualified health centers (FQHCs). Methods and materials: The study was conducted by the National Opinion Research Center (NORC) at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to West Virginia, California and Arizona FQHCs that were currently using an open source EHR. Results: Five of the six sites that were chosen as part of the study found a number of advantages in the use of their open source EHR system, such as utilizing a large community of users and developers to modify their EHR to fit the needs of their provider and patient communities, and lower acquisition and implementation costs as compared to a commercial system. Discussion: Despite these advantages, many of the informants and site visit participants felt that widespread dissemination and use of open source software was restrained by a negative connotation regarding this type of software. In addition, a number of participants stated that a certain level of technical acumen is needed within the FQHC to make an open source EHR effective. Conclusions: An open source EHR provides advantages for FQHCs that have limited resources to acquire and implement an EHR, but additional study is needed to evaluate its overall effectiveness. PMID:23744787
APPLYING OPEN-PATH OPTICAL SPECTROSCOPY TO HEAVY-DUTY DIESEL EMISSIONS
Non-dispersive infrared absorption has been used to measure gaseous emissions for both stationary and mobile sources. Fourier transform infrared spectroscopy has been used for stationary sources as both extractive and open-path methods. We have applied the open-path method for bo...
Stakeholder co-development of farm level nutrient management software
NASA Astrophysics Data System (ADS)
Buckley, Cathal; Mechan, Sarah; Macken-Walsh, Aine; Heanue, Kevin
2013-04-01
Over the last number of decades, intensification in the use of nitrogen (N) and phosphorus (P) in agricultural production has led to excessive accumulations of these nutrients in soils, groundwaters and surface water bodies (Sutton et al., 2011). According to the European Environment Agency (2012), despite some progress, diffuse pollution from agriculture is still significant in more than 40% of Europe's water bodies in rivers and coastal waters, and in one third of the water bodies in lakes and transitional waters. Recently it was estimated that approximately 29% of monitored river channel length is polluted to some degree across the Republic of Ireland, with agricultural sources suspected in 47 per cent of cases (EPA, 2012). Farm level management practices to reduce nutrient transfers from agricultural land to watercourses can be divided into source reduction and source interception approaches (Ribaudo et al., 2001). Source interception approaches involve capturing nutrients post-mobilisation through policy instruments such as riparian buffer zones or wetlands. Conversely, the source reduction approach is preventative in nature and promotes strict management of nutrients at farm and field level to reduce the risk of mobilisation in the first instance. This has the potential to deliver a double dividend of reduced nutrient loss to the wider ecosystem while maximising economic return to agricultural production at the field and farm levels. Adoption and use of nutrient management plans among farmers is far from the norm. This research engages key farmer and extension stakeholders to explore how current nutrient management planning software and outputs should be developed to make them more user friendly and usable in a practical way. An open innovation technology co-development approach was adopted to investigate what is demanded by the end users: farm advisors and farmers.
Open innovation is a knowledge management strategy that uses the input of stakeholders to improve internal innovation processes. Open innovation incorporates processes such as 'user-led' (farmer and advisor) innovation and the 'co-development' (by technologists and users) of a technology. This strategy is increasingly used by a variety of organisations across sectors to try to ensure that the use of their outputs (products/services/technologies) is optimised by their target customers/clients, by incorporating user insights into the development of outputs. This research uses the open innovation co-development framework through farmer and farm advisor focus group sessions to inform the development of a desirable software package for nutrient management planners (farm advisors) and desirable output formats for the end user (farmers). References: Sutton, M., Oenema, O., Erisman, J. W., Leip, A., Grinsven, H. & Winiwarter, W. 2011. Too much of a good thing. Nature, 472, 159-161. European Environment Agency, 2012. European waters — assessment of status and pressures. Environmental Protection Agency, 2012. Ireland's Environment: An Assessment 2012. Ribaudo, M.O., Heimlich, R., Claassen, R., Peters, M., 2001. Least-cost management of nonpoint source pollution: source reduction versus interception strategies for controlling nitrogen loss in the Mississippi Basin. Ecological Economics, 37, 183-197.
Russian History; A Guide to Reference Sources.
ERIC Educational Resources Information Center
McGill Univ., Montreal (Quebec). McLennan Library.
This guide identifies reference sources for the study of Russian and Soviet history available in the McGill University (Montreal) McLennan Library. Russian, English, French, and German language works covering Russian history from its origins to World War II are included. The guide is arranged in two parts: general reference sources and…
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Specific licenses for the manufacture or initial transfer... manufacture or initial transfer of calibration or reference sources. (a) An application for a specific license to manufacture or initially transfer calibration or reference sources containing plutonium, for...
Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.
Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D
2016-12-01
Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.
OpenCFU, a New Free and Open-Source Software to Count Cell Colonies and Other Circular Objects
Geissmann, Quentin
2013-01-01
Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net. PMID:23457446
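At its core, counting colonies means thresholding the image and counting connected components. Below is a minimal flood-fill component counter over a binary grid; OpenCFU layers shape filtering, splitting of touching colonies, and a GUI on top of this basic idea, so this is only the kernel of the task, not the program's algorithm.

```python
# Count connected foreground components (candidate colonies) in a
# thresholded image, discarding specks below a minimum size.
from collections import deque

def count_colonies(grid, min_size=1):
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                # BFS over the 4-connected component starting here.
                size, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if size >= min_size:        # size filter drops noise specks
                    count += 1
    return count

plate = [
    [0, 1, 1, 0, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [0, 0, 0, 0, 1, 0],
    [1, 0, 0, 0, 0, 0],
]
n = count_colonies(plate, min_size=2)   # the lone pixel is filtered out
```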
Utilization of open source electronic health record around the world: A systematic review.
Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam
2014-01-01
Many projects on developing Electronic Health Record (EHR) systems have been carried out in many countries. The current study was conducted to review the published data on the utilization of open source EHR systems in different countries all over the world. Using free text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed through a series of stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions in all continents, especially in Sub-Saharan Africa and South America. This creates opportunities to improve national healthcare, especially in developing countries with minimal financial resources. Open source technology is a solution to overcome the problems of high costs and inflexibility associated with proprietary health information systems.
A New Architecture for Visualization: Open Mission Control Technologies
NASA Technical Reports Server (NTRS)
Trimble, Jay
2017-01-01
Open Mission Control Technologies (MCT) is a new architecture for visualisation of mission data. Driven by requirements for new mission capabilities, including distributed mission operations, access to data anywhere, customization by users, synthesis of multiple data sources, and flexibility for multi-mission adaptation, Open MCT provides users with an integrated, customizable environment. Developed at NASA's Ames Research Center (ARC), in collaboration with NASA's Advanced Multimission Operations System (AMMOS) and NASA's Jet Propulsion Laboratory (JPL), Open MCT is getting its first mission use on the Jason-3 mission, and is also available in the testbed for the Mars 2020 rover and for development use for NASA's Resource Prospector lunar rover. The open source nature of the project provides for use outside of space missions, including open source contributions from a community of users. The defining features of Open MCT for mission users are data integration, end-user composition, and multiple views. Data integration provides access to mission data across domains in one place, making data such as activities, timelines, telemetry, imagery, event timers, and procedures available without application switching. End-user composition provides users with layouts, which act as a canvas to assemble visualisations. Multiple views provide the capability to view the same data in different ways, with live switching of data views in place. Open MCT is browser based and works on the desktop as well as tablets and phones, providing access to data anywhere. An early use case for mobile data access took place on the Resource Prospector (RP) Mission Distributed Operations Test, in which rover engineers in the field were able to view telemetry on their phones. We envision this capability providing decision support to on-console operators from off-duty personnel. The plug-in architecture also allows for adaptation for different mission capabilities.
Different data types and capabilities may be added or removed using plugins. An API provides a means to write new capabilities and to create data adaptors. Data plugins exist for mission data sources for NASA missions. Adaptors have been written by international and commercial users. Open MCT is open source. Open source enables collaborative development across organizations and also makes the product available outside of the space community, providing a potential source of usage and ideas to drive product design and development. The combination of open source with an Apache 2 license, and distribution on GitHub, has enabled an active community of users and contributors. The spectrum of users for Open MCT is, to our knowledge, unprecedented for mission software. In addition to our NASA users, we have, through open source, had users and inquiries on projects ranging from the Internet of Things, to radio hobbyists, to farming projects. We have an active community of contributors, enabling a flow of ideas inside and outside of the space community.
Apparatus and method for detecting gamma radiation
Sigg, Raymond A.
1994-01-01
A high efficiency radiation detector for measuring X-ray and gamma radiation from small-volume, low-activity liquid samples with an overall uncertainty better than 0.7% (one sigma SD). The radiation detector includes a hyperpure germanium well detector, a collimator, and a reference source. The well detector monitors gamma radiation emitted by the reference source and a radioactive isotope or isotopes in a sample source. The radiation from the reference source is collimated to avoid attenuation of reference source gamma radiation by the sample. Signals from the well detector are processed and stored, and the stored data is analyzed to determine the radioactive isotope(s) content of the sample. Minor self-attenuation corrections are calculated from chemical composition data.
Using stable isotope systematics and trace metals to constrain the dispersion of fish farm pollution
NASA Astrophysics Data System (ADS)
Torchinsky, A.; Shiel, A. E.; Price, M.; Weis, D. A.
2010-12-01
Fish farming is a growing industry of great economic importance to coastal communities. Unfortunately, open-net fish farming is associated with the release of organic and metal pollution, which has the potential to adversely affect the coastal marine environment. The dispersion of fish farm pollution and its environmental impact are not well understood or quantified. Pollutants released by fish farms include organic products such as uneaten feed pellets and fish feces, as well as chemicals and pharmaceuticals, all of which may enter marine ecosystems. In this study, we took advantage of bioaccumulation in passive suspension-feeding Manila Clams collected at varying distances from an open-net salmon farm located in the Discovery Islands of British Columbia. Measurements of stable C and N isotopes, as well as trace metal concentrations, in the clams were used to investigate the spread of pollutants by detecting the presence of fish farm waste in the clams' diet. Lead isotopic measurements were used to identify other significant anthropogenic pollution sources, which may impact the study area. Clams located within the areal extent of waste discharged by a fish farm are expected to exhibit anomalous light stable isotope ratios and metal concentrations, reflecting the presence of pollutants accumulated directly from seawater and from their diet. Clams were collected in the Discovery Islands from three sites in the Octopus Islands, located 850 m, 2100 m and 3000 m north of the Cyrus Rocks salmon farm (near Quadra Island) and from a reference site on Penn Island. Light stable isotope ratios (δ15N ≈ 10‰, with little variation between sites, and δ13C from -14.5 to -17.3‰) of the clams suggest that the most distal site (i.e., 3000 m away) is most impacted by organic fish farm waste (i.e., food pellets and feces) and that contributions of organic waste actually decrease closer to the farm.
Not surprisingly, the smallest contribution of organic waste was detected in clams from the reference site. It is thought that resuspension of particulate waste could be responsible for concentrating waste far from the fish farm. No pattern was observed in the trace metal concentration measurements (Cu = 4.11-9.64 ppm, Zn = 40.0-107 ppm and Pb = 0.008-0.086 ppm) of the clams, suggesting differences in the dispersion of metal contaminants and organic waste. Lead isotope ratios (1.14874 to 1.74100 for 206Pb/207Pb and 2.07579 to 2.10615 for 208Pb/206Pb) indicate the importance of anthropogenic Pb sources in the study area (i.e., unleaded gasoline and diesel fuel consumption and metal smelting); however, the anthropogenic Pb sources are unlikely to be associated with the open-net salmon farm. Waste dispersion from open-net fish farms is complicated by the physical oceanographic conditions that characterize individual study areas; this must be taken into account when interpreting results and designing future studies.
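The diet-contribution inference above rests on linear mixing between isotopic endmembers. A minimal two-endmember sketch is shown below; the endmember δ13C values are hypothetical placeholders for illustration, not measurements from this study.

```python
# Two-endmember linear mixing: estimate the fraction of fish-farm feed in a
# clam's diet from its δ13C value. Endmember values are ASSUMED placeholders.

def farm_fraction(delta_sample, delta_marine=-14.0, delta_feed=-22.0):
    """Fraction of the 'feed' endmember implied by a sample's δ13C (linear mixing)."""
    return (delta_sample - delta_marine) / (delta_feed - delta_marine)

# A sample at -17.3 per mil sits between the two assumed endmembers:
print(farm_fraction(-17.3))
```

Real studies use dual-tracer (C and N) mixing models with uncertainty propagation; this sketch only shows the arithmetic behind the inference.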
The contribution of different information sources for adverse effects data.
Golder, Su; Loke, Yoon K
2012-04-01
The aim of this study is to determine the relative value and contribution of searching different sources to identify adverse effects data. The process of updating a systematic review and meta-analysis of thiazolidinedione-related fractures in patients with type 2 diabetes mellitus was used as a case study. For each source searched, a record was made for each relevant reference included in the review noting whether it was retrieved with the search strategy used and whether it was available but not retrieved. The sensitivity, precision, and number needed to read from searching each source and from different combinations of sources were also calculated. There were 58 relevant references which presented sufficient numerical data to be included in a meta-analysis of fractures and bone mineral density. The highest number of relevant references was retrieved from Science Citation Index (SCI) (35), followed by BIOSIS Previews (27) and EMBASE (24). The precision of the searches varied from 0.88% (Scirus) to 41.67% (CENTRAL). With the search strategies used, the minimum combination of sources required to retrieve all the relevant references was: the GlaxoSmithKline (GSK) website, Science Citation Index (SCI), EMBASE, BIOSIS Previews, British Library Direct, Medscape DrugInfo, handsearching and reference checking, AHFS First, and Thomson Reuters Integrity or Conference Papers Index (CPI). To identify all the relevant references for this case study, a number of different sources needed to be searched. The minimum combination of sources required to identify all the relevant references did not include MEDLINE.
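The sensitivity, precision, and number-needed-to-read figures above follow directly from counts of relevant and retrieved references. A short sketch of the arithmetic: the SCI count (35 of 58 relevant references) is from the abstract, while the total-retrieved figure is an assumed illustration, since the abstract does not report it.

```python
# Retrieval metrics used in search-strategy evaluations.

def retrieval_metrics(relevant_retrieved, total_relevant, total_retrieved):
    """Return (sensitivity, precision, number needed to read)."""
    sensitivity = relevant_retrieved / total_relevant      # a.k.a. recall
    precision = relevant_retrieved / total_retrieved
    nnr = 1 / precision if precision else float("inf")     # records read per relevant hit
    return sensitivity, precision, nnr

# SCI retrieved 35 of the 58 relevant references; 1000 total hits is ASSUMED.
sens, prec, nnr = retrieval_metrics(35, 58, 1000)
print(f"sensitivity={sens:.1%}, precision={prec:.1%}, NNR={nnr:.1f}")
```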
Survey of Non-Rigid Registration Tools in Medicine.
Keszei, András P; Berkels, Benjamin; Deserno, Thomas M
2017-02-01
We catalogue available software solutions for non-rigid image registration to support scientists in selecting suitable tools for specific medical registration purposes. Registration tools were identified using non-systematic search in Pubmed, Web of Science, IEEE Xplore® Digital Library, Google Scholar, and through references in identified sources (n = 22). Exclusions were due to unavailability or inappropriateness. The remaining (n = 18) tools were classified by (i) access and technology, (ii) interfaces and application, (iii) living community, (iv) supported file formats, and (v) types of registration methodologies emphasizing the similarity measures implemented. Out of the 18 tools, (i) 12 are open source, 8 are released under a permissive free license, which imposes the least restrictions on the use and further development of the tool, 8 provide graphical processing unit (GPU) support; (ii) 7 are built on software platforms, 5 were developed for brain image registration; (iii) 6 are under active development but only 3 have had their last update in 2015 or 2016; (iv) 16 support the Analyze format, while 7 file formats can be read with only one of the tools; and (v) 6 provide multiple registration methods and 6 provide landmark-based registration methods. Based on open source, licensing, GPU support, active community, several file formats, algorithms, and similarity measures, the tools Elastix and Plastimatch are chosen for the ITK platform and for use without platform requirements, respectively. Researchers in medical image analysis already have a large choice of registration tools freely available. However, the most recently published algorithms may not yet be included in the tools.
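The similarity measures mentioned in criterion (v) drive every registration optimizer. As a minimal illustration, the sketch below implements one of the simplest, the sum of squared differences (SSD), suitable for mono-modal registration; the surveyed tools implement many more (mutual information, cross-correlation, and so on), and this is not any one tool's code.

```python
# Sum of squared differences (SSD): a basic mono-modal similarity measure.
import numpy as np

def ssd(fixed, moving):
    """Sum of squared intensity differences between two equally sized images."""
    fixed = np.asarray(fixed, dtype=float)
    moving = np.asarray(moving, dtype=float)
    return float(np.sum((fixed - moving) ** 2))

a = np.zeros((4, 4))
b = np.ones((4, 4))
print(ssd(a, b))  # 16.0: each of the 16 pixels differs by 1
```

During registration, an optimizer warps the moving image to minimize such a measure; SSD fails across modalities, which is why tools also offer mutual information.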
NASA Astrophysics Data System (ADS)
Topping, David; Barley, Mark; Bane, Michael K.; Higham, Nicholas; Aumont, Bernard; Dingle, Nicholas; McFiggans, Gordon
2016-03-01
In this paper we describe the development and application of a new web-based facility, UManSysProp (http://umansysprop.seaes.manchester.ac.uk), for automating predictions of molecular and atmospheric aerosol properties. Current facilities include pure component vapour pressures, critical properties, and sub-cooled densities of organic molecules; activity coefficient predictions for mixed inorganic-organic liquid systems; hygroscopic growth factors and CCN (cloud condensation nuclei) activation potential of mixed inorganic-organic aerosol particles; and absorptive partitioning calculations with/without a treatment of non-ideality. The aim of this new facility is to provide a single point of reference for all properties relevant to atmospheric aerosol that have been checked for applicability to atmospheric compounds where possible. The group contribution approach allows users to upload molecular information in the form of SMILES (Simplified Molecular Input Line Entry System) strings and UManSysProp will automatically extract the relevant information for calculations. Built using open-source chemical informatics, and hosted at the University of Manchester, the facilities are provided via a browser and device-friendly web interface, or can be accessed using the user's own code via a JSON API (application program interface). We also provide the source code for all predictive techniques provided on the site, covered by the GNU General Public License (GPL) to encourage development of a user community. We have released this via a Github repository (doi:10.5281/zenodo.45143). In this paper we demonstrate its use with specific examples that can be simulated using the web-browser interface.
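Accessing such a facility from user code means posting SMILES strings to a JSON endpoint and parsing the reply. The sketch below shows that pattern; the endpoint path and payload field names here are hypothetical illustrations, not UManSysProp's documented schema, so consult the site's API documentation for the real interface.

```python
# Pattern for calling a JSON API like UManSysProp's from user code.
# Endpoint path and payload fields are ASSUMED for illustration only.
import json

BASE = "http://umansysprop.seaes.manchester.ac.uk/api"

def build_request(smiles, temperature_k):
    """Return a (url, body) pair for a hypothetical vapour-pressure query."""
    body = json.dumps({"compounds": [smiles], "temperature": temperature_k})
    return f"{BASE}/vapour_pressure", body

def parse_response(raw):
    """Extract predicted values from an assumed {'results': [...]} payload."""
    return [r["value"] for r in json.loads(raw)["results"]]

url, body = build_request("CCO", 298.15)  # ethanol (SMILES "CCO") at 25 °C
sample = '{"results": [{"smiles": "CCO", "value": 7.87e3}]}'
print(url)
print(parse_response(sample))
```

The actual HTTP call (e.g. via `urllib.request` or `requests`) is left to the caller; only request construction and response parsing are shown.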
SAIP2014, the 59th Annual Conference of the South African Institute of Physics
NASA Astrophysics Data System (ADS)
Engelbrecht, Chris; Karataglidis, Steven
2015-04-01
The International Celestial Reference Frame (ICRF) was adopted by the International Astronomical Union (IAU) in 1997. The current standard, the ICRF-2, is based on Very Long Baseline Interferometry (VLBI) radio observations of the positions of 3414 extragalactic radio reference sources. The angular resolution achieved by the VLBI technique is on a scale of milliarcseconds to sub-milliarcseconds and defines the ICRF with the highest accuracy available at present. An ideal reference source used for celestial reference frame work should be unresolved or point-like on these scales. However, extragalactic radio sources, such as those that define and maintain the ICRF, can exhibit spatially extended structures on sub-milliarcsecond scales that may vary both in time and frequency. This variability can introduce a significant error in the VLBI measurements, thereby degrading the accuracy of the estimated source position. Reference source density in the Southern celestial hemisphere is also poor compared to the Northern hemisphere, mainly due to the limited number of radio telescopes in the south. In order to define the ICRF with the highest accuracy, observational efforts are required to find more compact sources and to monitor their structural evolution. In this paper we show that astrometric VLBI sessions can be used to obtain source structure information, and we present preliminary imaging results for the source J1427-4206 at 2.3 and 8.4 GHz, which show that the source is compact and suitable as a reference source.
Open source tools for ATR development and performance evaluation
NASA Astrophysics Data System (ADS)
Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.
2002-07-01
Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools, or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and license, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.
Open Source Hbim for Cultural Heritage: a Project Proposal
NASA Astrophysics Data System (ADS)
Diara, F.; Rinaudo, F.
2018-05-01
Current technologies are changing how Cultural Heritage is researched, analysed, conserved, and developed, allowing innovative new approaches. Integrating Cultural Heritage data, such as archaeological information, into a three-dimensional environment system (such as Building Information Modelling) brings huge benefits for its management, monitoring, and valorisation. Many commercial BIM solutions exist today, but these tools are designed and developed mostly for architectural design or technical installations. A better solution could be a dynamic and open platform that treats Cultural Heritage needs as a priority. Open source protocols could guarantee better and more complete data usability and accessibility. This choice would allow adapting the software to Cultural Heritage needs, and not the opposite, thus avoiding methodological stretches. This work focuses on analysing and experimenting with the specific characteristics of such open source software (DBMS, CAD, servers) applied to a Cultural Heritage example, in order to verify their flexibility and reliability and then to create a dynamic HBIM open source prototype. This prototype could be the starting point for a future complete HBIM open source solution, adaptable to other Cultural Heritage research and analysis.
Open Source Clinical NLP - More than Any Single System.
Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.
Christopher W. Helm
2006-01-01
GLIMS is a NASA funded project that utilizes Open-Source Software to achieve its goal of creating a globally complete inventory of glaciers. The participation of many international institutions and the development of on-line mapping applications to provide access to glacial data have both been enhanced by Open-Source GIS capabilities and play a crucial role in the...
Meteorological Error Budget Using Open Source Data
2016-09-01
ARL-TR-7831 ● SEP 2016 ● US Army Research Laboratory. Meteorological Error Budget Using Open-Source Data, by J Cogan, J Smith, and P Haines.
Open source bioimage informatics for cell biology
Swedlow, Jason R.; Eliceiri, Kevin W.
2009-01-01
Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what make an open source imaging application successful, and point to opportunities for further interoperability that should greatly accelerate future cell biology discovery. PMID:19833518
Zheng, Gaoxing; Qi, Xiaoying; Li, Yuzhu; Zhang, Wei; Yu, Yuguo
2018-01-01
The choice of different reference electrodes plays an important role in deciphering the functional meaning of electroencephalography (EEG) signals. In recent years, the infinity zero reference using the reference electrode standard technique (REST) has been increasingly applied, while the average reference (AR) was generally advocated as the best available reference option in previous classical EEG studies. Here, we designed EEG experiments and performed a direct comparison between the influences of REST and AR on EEG-revealed brain activity features for three typical brain behavior states (eyes-closed, eyes-open and music-listening). The analysis results revealed the following observations: (1) there is no significant difference in the alpha-wave-blocking effect during the eyes-open state compared with the eyes-closed state for both REST and AR references; (2) there was clear frontal EEG asymmetry during the resting state, and the degree of lateralization under REST was higher than that under AR; (3) the global brain functional connectivity density (FCD) and local FCD have higher values for REST than for AR under different behavior states; and (4) the value of the small-world network characteristic in the eyes-closed state is significantly (in full, alpha, beta and gamma frequency bands) higher than that in the eyes-open state, and the small-world effect under the REST reference is higher than that under AR. In addition, the music-listening state has a higher small-world network effect than the eyes-closed state. The above results suggest that typical EEG features might be more clearly presented by applying the REST reference than by applying AR when using a 64-channel recording. PMID:29593490
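Of the two references compared above, the average reference (AR) is simple to express in code: each channel is re-expressed relative to the instantaneous mean across all channels. The sketch below shows that transform only; REST requires a head model and lead-field matrix and is not reproduced here.

```python
# Average-reference (AR) re-referencing of a multi-channel EEG array.
import numpy as np

def average_reference(eeg):
    """eeg: (channels, samples) array; returns an AR re-referenced copy."""
    eeg = np.asarray(eeg, dtype=float)
    return eeg - eeg.mean(axis=0, keepdims=True)

x = np.array([[1.0, 2.0],
              [3.0, 6.0],
              [5.0, 10.0]])          # 3 channels, 2 time samples
ar = average_reference(x)
print(ar.sum(axis=0))                # columns of an AR signal always sum to zero
```

The zero-sum property is exactly why AR behaves differently from REST: AR forces the channel mean to zero at every sample, whereas REST approximates a reference at infinity from a source model.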
Numerical Simulation of Dispersion from Urban Greenhouse Gas Sources
NASA Astrophysics Data System (ADS)
Nottrott, Anders; Tan, Sze; He, Yonggang; Winkler, Renato
2017-04-01
Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source level dynamics, local measurements, and urban scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is an open source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low cost computer aided drawing and GIS, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model scalar emissions from various components of the natural gas distribution system, to study the impact of urban meteorology on mobile greenhouse gas measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of plumes, due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments. The Boussinesq approximation was applied to investigate the effects of canopy layer temperature gradients and convection on sensor footprints.
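For contrast with the CFD approach above, the textbook alternative is a Gaussian plume: a point-source model that ignores exactly the building-wake effects the OpenFOAM runs resolve. The sketch below is a generic plume with ground reflection; the dispersion-coefficient power laws are simple illustrative assumptions, not values from this work.

```python
# Ground-reflected Gaussian plume: the building-free baseline that urban CFD improves on.
import math

def plume_concentration(q, u, x, y, z, h):
    """q: source rate (g/s), u: wind speed (m/s), x downwind, y crosswind,
    z receptor height, h release height (all m). Returns concentration (g/m^3)."""
    sy = 0.08 * x ** 0.9                     # ASSUMED sigma_y growth curve
    sz = 0.06 * x ** 0.9                     # ASSUMED sigma_z growth curve
    lateral = math.exp(-y**2 / (2 * sy**2))
    # Image-source term reflects the plume off the ground:
    vertical = (math.exp(-(z - h)**2 / (2 * sz**2)) +
                math.exp(-(z + h)**2 / (2 * sz**2)))
    return q / (2 * math.pi * u * sy * sz) * lateral * vertical

# Concentration from a 1 g/s leak at 2 m height in a 3 m/s wind falls off downwind:
near = plume_concentration(1.0, 3.0, 50.0, 0.0, 1.5, 2.0)
far = plume_concentration(1.0, 3.0, 200.0, 0.0, 1.5, 2.0)
print(near > far)  # True
```

The study's point that sources 5-10 m apart disperse very differently is precisely what this model cannot capture: the plume has no buildings, so CFD or a stochastic dispersion model is needed in the urban canopy.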
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Schedule C-prototype tests for calibration or reference... Licensed Items § 32.102 Schedule C—prototype tests for calibration or reference sources containing..., conduct prototype tests, in the order listed, on each of five prototypes of the source, which contains...
Building CHAOS: An Operating System for Livermore Linux Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garlick, J E; Dunlap, C M
2003-02-21
The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some are commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.
10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Inspection, maintenance, and opening of a source or source holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY..., for defects before each use to ensure that the equipment is in good working condition and that...
NASA Astrophysics Data System (ADS)
Hasenkopf, C. A.
2017-12-01
Increasingly, open data, open-source projects are unearthing rich datasets and tools, previously impossible for more traditional avenues to generate. These projects are possible, in part, because of the emergence of online collaborative and code-sharing tools, decreasing costs of cloud-based services to fetch, store, and serve data, and increasing interest of individuals to contribute their time and skills to 'open projects.' While such projects have generated palpable enthusiasm from many sectors, many of these projects face uncharted paths for sustainability, visibility, and acceptance. Our project, OpenAQ, is an example of an open-source, open data community that is currently forging its own uncharted path. OpenAQ is an open air quality data platform that aggregates and universally formats government and research-grade air quality data from 50 countries across the world. To date, we make available more than 76 million air quality (PM2.5, PM10, SO2, NO2, O3, CO and black carbon) data points through an open Application Programming Interface (API) and a user-customizable download interface at https://openaq.org. The goal of the platform is to enable an ecosystem of users to advance air pollution efforts from science to policy to the private sector. The platform is also an open-source project (https://github.com/openaq) and has only been made possible through the coding and data contributions of individuals around the world. In our first two years of existence, we have seen requests for data to our API skyrocket to more than 6 million datapoints per month, and use-cases as varied as ingesting data aggregated from our system into real-time models of wildfires to building open-source statistical packages (e.g. ropenaq and py-openaq) on top of the platform to creating public-friendly apps and chatbots. 
We will share a whirl-wind trip through our evolution and the many lessons learned so far related to platform structure, community engagement, organizational model type and sustainability.
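Programmatic access of the kind described above amounts to building a query URL against the OpenAQ API and parsing the JSON reply. The sketch below follows the v1 measurements endpoint as I understand it; check OpenAQ's API documentation for the current schema, and note the sample payload here is a fabricated illustration, not real data. The network call itself is left to the caller.

```python
# Building a query against the OpenAQ measurements API and parsing a v1-style reply.
import json
from urllib.parse import urlencode

def measurements_url(city, parameter="pm25", limit=100):
    """Assemble a v1-style measurements query URL (schema assumed, not verified)."""
    query = urlencode({"city": city, "parameter": parameter, "limit": limit})
    return f"https://api.openaq.org/v1/measurements?{query}"

def extract_values(payload):
    """Pull (location, value) pairs out of an assumed {'results': [...]} body."""
    return [(r["location"], r["value"]) for r in json.loads(payload)["results"]]

print(measurements_url("Delhi"))

# ILLUSTRATIVE payload only, not real OpenAQ data:
sample = '{"results": [{"location": "Example Station", "value": 42.0}]}'
print(extract_values(sample))
```

Community packages like ropenaq and py-openaq, mentioned above, wrap exactly this request-and-parse pattern.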
Comprehensive Routing Security Development and Deployment for the Internet
2015-02-01
RPSTIR depends on several other open source packages: MySQL, a widely used and popular open source database package, chosen for the local RPKI database cache; OpenSSL, used for cryptographic libraries for X.509 certificates; and the ODBC MySQL Connector (ODBC, Open Database Connectivity, is a standard programming interface for databases).
NASA Astrophysics Data System (ADS)
Yetman, G.; Downs, R. R.
2011-12-01
Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and addresses internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. 
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.
A comparison of age, size, and fecundity of harvested and reference White Sucker populations
Begley, Meg; Coghlan, Stephen M.; Zydlewski, Joseph D.
2017-01-01
White Suckers Catostomus commersonii are an important source of fresh bait for the Maine lobster fishery. The Maine Department of Inland Fisheries and Wildlife began issuing commercial harvest permits in 1991, without reporting requirements or limits on the number of permits. There is recent concern that overfishing may be occurring. To infer impact, we investigated demographic differences between White Sucker populations in lakes open to harvest and those in lakes closed to harvest. Each of three harvested lakes was paired to a nearby closed lake as a reference based on general size, morphometry, and information on harvest pressure. In total, 976 spawning White Suckers were collected from the six lakes in 2014 (120–282 individuals/lake). Fish size, estimated age, fecundity, and mortality rates were compared between lakes. We hypothesized that we would find smaller, younger, and less-fecund individuals in harvested lakes compared to reference lakes. Size and age distributions for both sexes differed between nearly all lake pairs (except between males from one pair). White Suckers from reference lakes were larger and older and had greater gonadosomatic indices and fecundity than fish from harvested lakes. Estimated annual mortality rates were at least twofold higher in harvested lakes than in reference lakes. We detected some differences in von Bertalanffy growth parameters between lake pairs, as might occur under selective harvest pressure. The growth coefficient was smaller for reference lakes than for harvested lakes, while asymptotic length was greater for reference lakes than for harvested lakes. The data suggest that current levels of exploitation are resulting in greater age truncation in existing White Sucker populations.
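The growth comparison in this abstract rests on the von Bertalanffy model, which predicts length at age as L(t) = L_inf * (1 - exp(-K * (t - t0))). The sketch below illustrates the reported pattern (reference lakes: larger asymptotic length, smaller growth coefficient); the parameter values are assumed for illustration only and are not taken from the study.

```python
import math

def von_bertalanffy_length(t, l_inf, k, t0=0.0):
    """Predicted length at age t under the von Bertalanffy growth model:
    L(t) = L_inf * (1 - exp(-K * (t - t0)))."""
    return l_inf * (1.0 - math.exp(-k * (t - t0)))

# Hypothetical parameters echoing the study's qualitative finding:
# reference lakes show a larger asymptotic length (l_inf) and a smaller
# growth coefficient (k) than harvested lakes.
reference = {"l_inf": 480.0, "k": 0.20}   # mm, per year (assumed values)
harvested = {"l_inf": 420.0, "k": 0.30}   # mm, per year (assumed values)

for age in (2, 5, 10):
    l_ref = von_bertalanffy_length(age, **reference)
    l_har = von_bertalanffy_length(age, **harvested)
    print(f"age {age:2d}: reference {l_ref:6.1f} mm, harvested {l_har:6.1f} mm")
```

Under such parameters, harvested-lake fish grow faster initially but plateau at a smaller size, which is the signature of selective harvest pressure the authors describe.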
GIS-Based Noise Simulation Open Source Software: N-GNOIS
NASA Astrophysics Data System (ADS)
Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh
2015-12-01
Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate the noise scenario due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These have been addressed in the software through attenuation modules for atmosphere, vegetation and barriers. N-GNOIS is a user-friendly, platform-independent and Open Geospatial Consortium (OGC)-compliant software. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features such as the cumulative impact of point and mobile sources, building structures and honking due to traffic. Honking is a common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise, and acts as a decision-making tool for planning and management of the noise component in environmental impact assessment (EIA) studies.
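The attenuation modules mentioned above combine several loss terms. A minimal sketch of the standard textbook building blocks, geometric divergence from a point source (the ISO 9613-2 style term, Lp = Lw - 20*log10(r) - 11) plus a simple linear atmospheric-absorption term, is shown below. This is a generic illustration, not necessarily the formula N-GNOIS implements.

```python
import math

def point_source_level(lw_db, r_m, a_atm_db_per_km=0.0):
    """Free-field sound pressure level at distance r (metres) from a point
    source of sound power level lw_db: geometric divergence of
    20*log10(r) + 11 dB, plus an optional linear atmospheric-absorption
    term in dB per kilometre."""
    divergence = 20.0 * math.log10(r_m) + 11.0
    absorption = a_atm_db_per_km * r_m / 1000.0
    return lw_db - divergence - absorption

# A hypothetical 100 dB source: each doubling of distance loses about 6 dB.
for r in (10, 20, 40):
    print(f"{r:3d} m -> {point_source_level(100.0, r):.1f} dB")
```

Barrier and vegetation modules would subtract further distance- and geometry-dependent terms in the same fashion, which is what makes a GIS (holding the geometry) a natural host for such a model.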
Utilization of open source electronic health record around the world: A systematic review
Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam
2014-01-01
Many projects developing Electronic Health Record (EHR) systems have been carried out in countries around the world. The current study was conducted to review the published data on the utilization of open source EHR systems in different countries. Using free-text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in a series of stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. This creates opportunities to improve national healthcare, especially in developing countries with minimal financial resources. Open source technology is a solution to overcome the problems of high costs and inflexibility associated with proprietary health information systems. PMID:24672566
Martínez Barrio, Álvaro; Lagercrantz, Erik; Sperber, Göran O; Blomberg, Jonas; Bongcam-Rudloff, Erik
2009-01-01
Background The Distributed Annotation System (DAS) is a widely used network protocol for sharing biological information. The distributed aspects of the protocol enable the use of various reference and annotation servers for connecting biological sequence data to pertinent annotations in order to depict an integrated view of the data for the final user. Results An annotation server has been devised to provide information about the endogenous retroviruses detected and annotated by a specialized in silico tool called RetroTector. We describe the procedure to implement the DAS 1.5 protocol commands necessary for constructing the DAS annotation server, and we use our server to exemplify those steps. Data distribution is kept separate from visualization, which is carried out by eBioX, an easy-to-use open source program incorporating multiple bioinformatics utilities. Some well characterized endogenous retroviruses are shown in two different DAS clients. A rapid analysis of areas free from retroviral insertions could be facilitated by our annotations. Conclusion The DAS protocol has been shown to be advantageous in the distribution of endogenous retrovirus data. The distributed nature of the protocol also aids in combining annotation and visualization along a genome, enhancing the understanding of the ERV contribution to its evolution. Reference and annotation servers are conjointly used by eBioX to provide visualization of ERV annotations as well as other data sources. Our DAS data source can be found in the central public DAS service repository, , or at . PMID:19534743
A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.
2012-12-01
The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information-management architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community-supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, the Department of the Interior's Climate Science Centers, and the WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline-specific data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management is central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system.
As the separate pieces of this system progress, they will be combined and generalized to form a sort of social network for nationally consistent hydrologic modeling.
Bioclipse: an open source workbench for chemo- and bioinformatics.
Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S
2007-02-22
There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under the Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is open to both open source and commercial plugins. Bioclipse is freely available at http://www.bioclipse.net.
Web accessibility and open source software.
Obrenović, Zeljko
2009-07-01
A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration are complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse project called the Accessibility Tools Framework (ACTF), the aim of which is the development of an extensible infrastructure upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.
Gichoya, Judy W; Kohli, Marc; Ivange, Larry; Schmidt, Teri S; Purkayastha, Saptarshi
2018-05-10
Open-source development can provide a platform for innovation by seeking feedback from community members as well as providing tools and infrastructure to test new standards. Vendors of proprietary systems may delay adoption of new standards until there are sufficient incentives, such as legal mandates or financial rewards, to drive adoption. Moreover, open-source systems in healthcare have been widely adopted in low- and middle-income countries and can be used to bridge gaps that exist in global health radiology. Since 2011, the authors, along with a community of open-source contributors, have worked on developing an open-source radiology information system (RIS) across two communities, OpenMRS and LibreHealth. The main purpose of the RIS is to implement core radiology workflows, on which others can build and test new radiology standards. This work has resulted in three major releases of the system, with current architectural changes driven by changing technology, development of new standards in health and imaging informatics, and changing user needs. At their core, both these communities are focused on building general-purpose EHR systems, but based on user contributions from the fringes, we have been able to create an innovative system that has been used by hospitals and clinics in four different countries. We provide an overview of the history of the LibreHealth RIS, the architecture of the system, an overview of standards integration, the challenges of developing an open-source product, and future directions. Our goal is to attract more participation and involvement to further develop the LibreHealth RIS into an Enterprise Imaging System that can be used in other clinical imaging domains, including pathology and dermatology.
Defending the Amazon: Conservation, Development and Security in Brazil
2009-03-01
Jobim, Nelson. Interview by Empresa Brasil de Comunicação Radio. Translated by Open Source Center, February 6, 2009. Available from http://www.ebc.com.br (accessed February 23, 2009).
Open-Source web-based geographical information system for health exposure assessment
2012-01-01
This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that by using solely open source software it was possible to develop a customisable web-based GIS application that provides the functions necessary to convey health and environmental data to experts and non-experts alike without the requirement of proprietary software. PMID:22233606
Apparatus and method for detecting gamma radiation
Sigg, R.A.
1994-12-13
A high efficiency radiation detector is disclosed for measuring X-ray and gamma radiation from small-volume, low-activity liquid samples with an overall uncertainty better than 0.7% (one sigma SD). The radiation detector includes a hyperpure germanium well detector, a collimator, and a reference source. The well detector monitors gamma radiation emitted by the reference source and a radioactive isotope or isotopes in a sample source. The radiation from the reference source is collimated to avoid attenuation of reference source gamma radiation by the sample. Signals from the well detector are processed and stored, and the stored data is analyzed to determine the radioactive isotope(s) content of the sample. Minor self-attenuation corrections are calculated from chemical composition data. 4 figures.
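The "minor self-attenuation corrections" in this patent abstract follow from Beer-Lambert absorption. For a uniform sample of thickness t and linear attenuation coefficient mu, averaging the transmitted fraction over emission depth gives the classic correction factor C = (1 - exp(-mu*t)) / (mu*t). The sketch below is a generic textbook calculation, not the patent's specific procedure, and the numbers are illustrative only.

```python
import math

def self_attenuation_factor(mu_cm_inv, thickness_cm):
    """Beer-Lambert self-attenuation factor for a uniform slab sample:
    C = (1 - exp(-mu*t)) / (mu*t), the depth-averaged transmitted
    fraction.  A thin or weakly absorbing sample gives C -> 1,
    i.e. no correction needed."""
    x = mu_cm_inv * thickness_cm
    if x == 0.0:
        return 1.0
    return (1.0 - math.exp(-x)) / x

# Illustrative values only: a 1 cm aqueous sample with mu = 0.1 per cm
print(f"correction factor: {self_attenuation_factor(0.1, 1.0):.4f}")
```

Dividing a measured count rate by this factor recovers the activity the detector would have seen from a non-absorbing sample, which is how composition data feeds into the correction.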
NASA Astrophysics Data System (ADS)
Gwamuri, J.; Pearce, Joshua M.
2017-08-01
The recent introduction of RepRap (self-replicating rapid prototyper) 3-D printers and the resultant open source technological improvements have resulted in affordable 3-D printing, enabling low-cost distributed manufacturing for individuals. This development and others such as the rise of open-source appropriate technology (OSAT) and solar powered 3-D printing are moving 3-D printing from an industry-based technology to one that could be used in the developing world for sustainable development. In this paper, we explore some specific technological improvements and how distributed manufacturing with open-source 3-D printing can be used to provide open-source 3-D printable optics components for developing world communities through the ability to print less expensive and customized products. This paper presents an open-source, low-cost optical equipment library which enables relatively easily adapted customizable designs with the potential of changing the way optics is taught in resource-constrained communities. The study shows that this method of scientific hardware development has the potential to enable a much broader audience to participate in optical experimentation, both as research and teaching platforms. Conclusions on the technical viability of 3-D printing to assist in development and recommendations on how developing communities can fully exploit this technology to improve the learning of optics through hands-on methods have been outlined.
RADIOISOTOPES IN MEDICINE AND HUMAN PHYSIOLOGY. A Selected List of References
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCormick, J.A. comp.
1958-08-01
This bibliography contains 2862 references on uses of radioisotopes in diagnostic medicine, therapeutic medicine, clinical research, human physiology, general medical research, and immunology. The references were taken from the 1948 to 1956 open literature. A list of the journals from which the references were selected and an author index are included. (auth)
Groeber, F; Schober, L; Schmid, F F; Traube, A; Kolbus-Hernandez, S; Daton, K; Hoffmann, S; Petersohn, D; Schäfer-Korting, M; Walles, H; Mewes, K R
2016-10-01
To replace the Draize skin irritation assay (OECD guideline 404) several test methods based on reconstructed human epidermis (RHE) have been developed and were adopted in the OECD test guideline 439. However, all validated test methods in the guideline are linked to RHE provided by only three companies. Thus, the availability of these test models is dependent on the commercial interest of the producer. To overcome this limitation and thus to increase the accessibility of in vitro skin irritation testing, an open source reconstructed epidermis (OS-REp) was introduced. To demonstrate the capacity of the OS-REp in regulatory risk assessment, a catch-up validation study was performed. The participating laboratories used in-house generated OS-REp to assess the set of 20 reference substances according to the performance standards amending the OECD test guideline 439. Testing was performed under blinded conditions. The within-laboratory reproducibility of 87% and the inter-laboratory reproducibility of 85% demonstrate the high reliability of irritancy testing using the OS-REp protocol. In addition, the prediction capacity, with an accuracy of 80%, was comparable to previously published RHE-based test protocols. Taken together, the results indicate that the OS-REp test method can be used as a standalone alternative skin irritation test replacing the OECD test guideline 404. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Wright, Adam; Sittig, Dean F.
2008-01-01
In this paper we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. PMID:18434256
NASA Astrophysics Data System (ADS)
Zhao, L.; Landi, E.; Lepri, S. T.; Kocher, M.; Zurbuchen, T. H.; Fisk, L. A.; Raines, J. M.
2017-01-01
In this paper, we study a subset of slow solar winds characterized by an anomalous charge state composition and ion temperatures compared to average solar wind distributions, and thus referred to as an “Outlier” wind. We find that although this wind is slower and denser than normal slow wind, it is accelerated from the same source regions (active regions and quiet-Sun regions) as the latter and its occurrence rate depends on the solar cycle. The defining property of the Outlier wind is that its charge state composition is the same as that of normal slow wind, with the only exception being a very large decrease in the abundance of fully charged species (He2+, C6+, N7+, O8+, Mg12+), resulting in a significant depletion of the He and C element abundances. Based on these observations, we suggest three possible scenarios for the origin of this wind: (1) local magnetic waves preferentially accelerating non-fully stripped ions over fully stripped ions from a loop opened by reconnection; (2) fully stripped ions already depleted in the coronal magnetic loops before these are opened up by reconnection; or (3) fully stripped ions depleted by Coulomb collisions after magnetic reconnection in the solar corona. If any one of these three scenarios is confirmed, the Outlier wind represents a direct signature of slow wind release through magnetic reconnection.
Experimental assessment of theory for refraction of sound by a shear layer
NASA Technical Reports Server (NTRS)
Schlinker, R. H.; Amiet, R. K.
1978-01-01
The refraction angle and amplitude changes associated with sound transmission through a circular, open-jet shear layer were studied in a 0.91 m diameter open jet acoustic research tunnel. Free stream Mach number was varied from 0.1 to 0.4. Good agreement between refraction angle correction theory and experiment was obtained over the test Mach number, frequency and angle measurement range for all on-axis acoustic source locations. For off-axis source positions, good agreement was obtained at a source-to-shear layer separation distance greater than the jet radius. Measurable differences between theory and experiment occurred at a source-to-shear layer separation distance less than one jet radius. A shear layer turbulence scattering experiment was conducted at 90 deg to the open jet axis for the same free stream Mach numbers and axial source locations used in the refraction study. Significant discrete tone spectrum broadening and tone amplitude changes were observed at open jet Mach numbers above 0.2 and at acoustic source frequencies greater than 5 kHz. More severe turbulence scattering was observed for downstream source locations.
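The angle correction tested above can be illustrated with the simplest textbook form of the problem: for a thin plane shear layer, matching the trace velocity of the wavefront along the layer gives cos(theta_out) = cos(theta_in) / (1 + M*cos(theta_in)), with angles measured from the downstream axis. This is a simplified relation assumed for illustration; it is not Amiet's full circular-shear-layer angle and amplitude correction.

```python
import math

def refracted_angle_deg(theta_in_deg, mach):
    """Angle (from the downstream axis) at which a ray launched inside a
    uniform stream of Mach number M emerges after crossing a thin plane
    shear layer, from trace-velocity matching along the layer:
    cos(theta_out) = cos(theta_in) / (1 + M*cos(theta_in))."""
    c_in = math.cos(math.radians(theta_in_deg))
    return math.degrees(math.acos(c_in / (1.0 + mach * c_in)))

# Convection bends rays downstream inside the jet, so the emerging ray
# appears at a larger angle from the jet axis as Mach number grows.
for m in (0.0, 0.2, 0.4):
    print(f"M = {m:.1f}: 60 deg inside -> {refracted_angle_deg(60.0, m):.1f} deg outside")
```

Microphones outside the jet therefore sample a different emission angle than a free-field observer would, which is exactly what the correction theory compensates for.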
An Open Source Model for Open Access Journal Publication
Blesius, Carl R.; Williams, Michael A.; Holzbach, Ana; Huntley, Arthur C.; Chueh, Henry
2005-01-01
We describe an electronic journal publication infrastructure that allows a flexible publication workflow, academic exchange around different forms of user submissions, and the exchange of articles between publishers and archives using a common XML based standard. This web-based application is implemented on a freely available open source software stack. This publication demonstrates the Dermatology Online Journal's use of the platform for non-biased independent open access publication. PMID:16779183
[GNU Pattern: open source pattern hunter for biological sequences based on SPLASH algorithm].
Xu, Ying; Li, Yi-xue; Kong, Xiang-yin
2005-06-01
The aim was to construct a high-performance open source software engine based on the IBM SPLASH algorithm for later research on pattern discovery. GNU Pattern (Gpat), which efficiently implements the core part of the SPLASH algorithm, was developed using open source software. The full source code of Gpat is available for other researchers to modify under the GNU license. Gpat is a successful implementation of the SPLASH algorithm and can be used as a basic framework for later research on pattern recognition in biological sequences.
Passive rejection of heat from an isotope heat source through an open door
NASA Technical Reports Server (NTRS)
Burns, R. K.
1971-01-01
The isotope heat-source design for a Brayton power system includes a door in the thermal insulation through which the heat can be passively rejected to space when the power system is not operating. The results of an analysis to predict the heat-source surface temperature and the heat-source heat-exchanger temperature during passive heat rejection as a function of insulation door opening angle are presented. They show that for a door opening angle greater than 20 deg, the temperatures are less than the steady-state temperatures during power system operation.
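The passive heat rejection described above is governed by radiation to space, Q = eps * sigma * A * F * (T_s^4 - T_sink^4), where the view factor F stands in for the door-opening geometry. The report's finding that temperatures change little beyond a 20 deg opening corresponds to F saturating with door angle. The sketch below uses hypothetical numbers, not values from the report.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_power_w(emissivity, area_m2, t_surface_k, view_factor, t_sink_k=4.0):
    """Net power radiated from a hot surface to a cold sink (deep space)
    through an opening: Q = eps * sigma * A * F * (T_s^4 - T_sink^4).
    view_factor lumps the door-opening geometry into one number in [0, 1]."""
    return emissivity * SIGMA * area_m2 * view_factor * (t_surface_k**4 - t_sink_k**4)

# Hypothetical case: a 0.5 m^2 source surface at 1100 K seeing space
# through a partially open insulation door (assumed F = 0.6).
print(f"rejected power: {radiated_power_w(0.85, 0.5, 1100.0, 0.6):.0f} W")
```

Because Q scales with T^4, the equilibrium surface temperature is only weakly sensitive to further increases in F once the door is substantially open, consistent with the reported 20 deg threshold.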
DUAL HEATED ION SOURCE STRUCTURE HAVING ARC SHIFTING MEANS
Lawrence, E.O.
1959-04-14
An ion source is presented for calutrons, particularly an electrode arrangement for the ion generator of a calutron ion source. The ion source arc chamber is heated, and thermally conductive plates define the margins of its exit opening. These plates are electrically insulated from the body of the ion source and are connected to a suitable source of voltage to serve as electrodes for shaping the ion beam exiting the arc chamber.